# Resources

A resource is a single Numbas exam, which students access via the VLE.

Resources are automatically created when you launch a new Numbas activity from the VLE as an instructor. You must select a Numbas exam to use, and then any students who launch the same activity will be shown the exam.

## Creating a new resource

Either upload an exam package that you have downloaded from the Numbas editor or, if any editor links have been created, select an exam from the list.

When you download an exam package from the Numbas editor, you must use the SCORM package option.

Once you’ve selected an exam, you will be shown the dashboard for the resource.

## Dashboard

When you open a resource as an instructor, you are first shown the dashboard. This view lists scores for all students who have attempted the resource.

### Report scores back to VLE

The LTI provider can automatically report scores back to the VLE.

Click the Report scores back to VLE button to begin this process. This may take some time; you’ll be shown either a success message or any errors encountered while reporting scores.

This isn’t supported by every VLE.

### Discount question parts

Discounting a question part removes it from the score calculations: any marks students have earned for that part no longer count towards their totals.

You might want to do this if an error is found in a question.

To discount a part, click on the Discount a question part button on the dashboard.

You are shown a list of all the question parts in the exam. Click a Discount this part button next to a part to discount it. You can choose whether to remove the part from the total available for the exam, or to award everyone full marks for the part. These have different effects on the weighting of other parts in the exam - removing a part from the total will increase the weighting of other parts, while awarding full marks will increase everyone’s total score. Think carefully about what you want to do.

If you discount a gapfill part, all of its gaps are discounted. If you discount an individual gap, the other gaps are unaffected.
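To see the difference between the two modes concretely, here is a small arithmetic sketch with made-up numbers (an illustration, not the LTI provider’s actual calculation):

```python
# Hypothetical example: an exam worth 10 marks, where a faulty part
# was worth 2 marks and the student scored 1 mark on it.
exam_total = 10
part_max = 2
part_score = 1
student_total = 7  # includes the 1 mark earned on the faulty part

# Mode 1: remove the part from the total available.
# The part's marks are discarded and the exam is graded out of a
# smaller total, so the other parts gain weight.
removed_pct = (student_total - part_score) / (exam_total - part_max) * 100
print(f"Removed from total: {removed_pct:.1f}%")    # 6/8  -> 75.0%

# Mode 2: award everyone full marks for the part.
# The total available stays the same, so the other parts keep their
# weight, but every student's total score rises.
full_marks_pct = (student_total - part_score + part_max) / exam_total * 100
print(f"Full marks awarded: {full_marks_pct:.1f}%")  # 8/10 -> 80.0%
```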

### Download scores as CSV

Click the Download scores as CSV button to download a .csv file containing the scores for each student who has attempted the resource.

The columns of the file are:

• First name
• Last name
• Email address
• Username
• Percentage score

The values in the name, email address and username fields come from the VLE. The username field might not correspond exactly to the student’s username on the VLE; in particular, Blackboard prepends usernames with cuid:.
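As a rough illustration, the downloaded file can be processed with standard CSV tooling. The column names and sample row below are assumptions for the sketch; check the header row of your own download, and note the cuid: prefix that Blackboard adds to usernames:

```python
import csv
import io

# Hypothetical sample matching the documented columns; a real file
# would be opened with open("scores.csv") instead of io.StringIO.
sample = """First name,Last name,Email address,Username,Percentage score
Ada,Lovelace,ada@example.com,cuid:alovelace,85
"""

for row in csv.DictReader(io.StringIO(sample)):
    # Strip Blackboard's "cuid:" prefix, if present, to recover the
    # username as it appears on the VLE.
    username = row["Username"].removeprefix("cuid:")
    print(username, row["Percentage score"])  # alovelace 85
```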

### Student progress

The Student progress table lists the names of students who have attempted the activity, along with their scores, calculated according to the chosen grading method, and the number of attempts they have made.

You can narrow down the displayed list by entering a name in the Search for a student box.

Note that only students who have launched the activity are listed - the LTI provider has no way of knowing about students who have access to the activity through the VLE but have never launched it.

### Access tokens

When the number of attempts students are allowed to make is limited, circumstances can arise in which you want to allow particular students another attempt.

To do this, click the plus symbol in the Access tokens column next to the student’s name in the Student progress table.

When the student launches the activity, they will be offered the opportunity to start a new attempt.

## Attempts

Click on the Attempts button at the top of the page to view the attempt management screen.

Click the Download attempts summary as CSV button to obtain a .csv file with information on every attempt at this activity.

The columns of the file are:

• First name
• Last name
• Start time, in YYYY-MM-DD HH:MM:SS.ffffff+HH:MM format.
• Completed? (Either completed or incomplete)
• Total score
• Percentage (total score as a percentage of marks available)
• One column giving the total score for each question
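The Start time column is an ISO 8601 timestamp with a space separator and a UTC offset, so it can be parsed directly with Python’s standard library (the timestamp below is made up):

```python
from datetime import datetime

# Parse a Start time value in YYYY-MM-DD HH:MM:SS.ffffff+HH:MM format.
# fromisoformat accepts the space separator and the numeric offset.
start = datetime.fromisoformat("2024-03-15 09:30:12.345678+00:00")
print(start.year, start.tzinfo)
```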

### Review an attempt

Click the Review button to view a student’s attempt as they saw it.

This is useful when a student queries the mark they were awarded for a part of the exam.

Note that review mode always opens attempts as if they were completed, even if the student has not yet ended the exam.

### Remark an attempt

Click the Remark button to manually change the score awarded for a question part.

You are shown a list of every question part in the exam. Click the pencil icon on the row corresponding to the part you want to change, and enter the new score. The new score is saved as you type, and the totals for the question and the whole exam are recalculated automatically.

### SCORM data

Numbas uses the SCORM standard to store data about attempts. By clicking on the SCORM data button, you can see all of the SCORM data model elements stored for a particular attempt.

This is most useful for debugging connection errors, to confirm that data has been saved.

If Most recent value only is ticked, only the most recent value for each element is shown. Untick it to see every value that the element has taken since the start of the attempt.

You can type a regular expression in the Search for an element box to narrow down the displayed list of elements.
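The effect is ordinary regular-expression matching against element names. As a sketch (the filtering code is an illustration, not the LTI provider’s own implementation), using standard SCORM 2004 data model element names:

```python
import re

# Standard SCORM 2004 data model elements, as might appear in the list.
elements = [
    "cmi.score.raw",
    "cmi.score.scaled",
    "cmi.completion_status",
    "cmi.suspend_data",
]

# A search for "cmi\.score\." narrows the list to the score elements.
pattern = re.compile(r"cmi\.score\.")
matches = [name for name in elements if pattern.search(name)]
print(matches)  # ['cmi.score.raw', 'cmi.score.scaled']
```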

### Delete an attempt

Click the Delete button to delete an attempt. This is permanent; the student will be able to start a new attempt next time they launch the activity.

### Reopen an attempt

Sometimes students accidentally close their attempts before they mean to. Click the Reopen button to allow a student to complete their attempt. The next time that they launch the activity, they will be able to resume the attempt as if they had only paused it.

Beware that the standard Numbas settings allow a student to see the correct answers to every question once they have finished their attempt. If you’re concerned about this, it’s often better to make the student start a new attempt, rather than reopen the previous one.

## Settings

### Replace exam package

If you discover an error in your exam, you can update it by downloading it again from the editor and clicking the Replace exam package button.

Any new attempts will use the latest version of the exam package. Because the new version might have changed in a way that is incompatible with existing attempts, for example by removing or rearranging question parts, any attempts started with the old package will continue to use the old package.

### Grading method

Specify how a student’s score for the activity is calculated.

• “Highest score” will use the highest total score from any of the student’s attempts.
• “Last attempt” will use the total score from the attempt which the student began last.
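A toy sketch of how the two methods differ, using hypothetical attempt scores listed in the order the attempts were started:

```python
# Hypothetical total scores for three attempts, in start order.
attempt_scores = [55, 80, 72]

highest_score = max(attempt_scores)  # "Highest score" method -> 80
last_attempt = attempt_scores[-1]    # "Last attempt" method  -> 72
print(highest_score, last_attempt)   # 80 72
```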

### Include incomplete attempts in grading?

If ticked, incomplete attempts will be included when calculating the student’s score for the activity.

It’s normally best to leave this ticked, so that students who forget to click the End Exam button aren’t penalised.

### Maximum attempts per user

How many attempts at the resource can each user take?

If set to 0, then there is no limit.

### When to show scores to students

When a student reopens an activity, they are shown a summary of their attempts. You might not want to immediately show students their scores on this screen.

• “Always” means the student will see scores for all attempts, including incomplete attempts.
• “When attempt is complete” means the student will only see their score for an attempt once it is complete.
• “Never” means that no scores are shown to the student, even after they’ve completed their attempt.

Warning

This only controls the display of scores by the LTI provider. If you want to hide scores from the students, you must also turn off the score feedback options in the exam editor.

### When to report scores back

Specify when students’ scores are reported back to the consumer. Some VLEs make reported scores available to students immediately, which you may not want.

• “Immediately” - scores are reported as soon as they change, i.e. whenever a student submits an answer.
• “On completion” - a student’s score is reported when they complete an attempt.
• “Manually, by instructor” - Scores are only reported when an instructor clicks the Report scores back to VLE button on the dashboard.

## Test run

Click the Test run button to launch the Numbas exam. Data will not be saved - this feature is solely a convenience for instructors to check the contents of the exam.