Digital Exams

Use the "Digital Exams" application to conduct digital written exams on MyLEARN.

Download an overview of this guide's content with links: Digital Exams (EN) links.pdf

Depending on where the exam takes place, you proceed differently:

  • If you use an online examination environment (OPU), the application is already activated there.
  • If you are using the online course environment of your course, you must first activate the "Digital Exams" application there yourself. To do this, click on "Applications" under "Administrate" and move the slider next to "Digital Exams".

Administrators and students/members see different content:

  • Administrators see the section "Exams" with the addition "Exam administration" in the left side menu (see Screenshot). Clicking on it takes you to the exam administration menu. There you can create (import, create new) questions and manage (create, unlock, assess) exams.
  • Students do not see the "Exams" section in the left side menu. It only appears when an exam has been activated for them. From then on, students see the "Exams" area and below it the title of the unlocked exam (see Screenshot).
  1. Start with creating exam questions / Test each question and edit if necessary / Publish questions you want to use in the exam.
  2. At the end, create an exam / Test the exam and edit settings if necessary / Publish the exam manually or set an opening date.
  3. Conduct the exam / Close the exam manually or have it close automatically
  4. Grade the exam / Create the gradebook structure / Grade by question or enter the exam protocol
  5. Open student submission review / Publish grades in the gradebook / Close student submission review

* by default

At the level of the individual questions

  • Randomization of the order of answer alternatives
  • Random selection of a defined number of answer alternatives
  • Shuffling of partial questions within a short text interaction
  • Random selection of a defined number of partial questions within a short text interaction
  • Individualized questions by adding "substitution values" (currently SWA only)

At the level of the exam

  • Question names (titles) are hidden and replaced by neutral placeholders during the exam*
  • Randomized question sequence
  • Limited number of questions, provided all questions are of equal value
  • Use of random questions from question pools (under development)
  • Deactivation of spell check in text fields
  • Displaying the IP address of the students
  • Do not allow cut & paste (copy & paste)
  • Generate a digital signature with exam submission

For online exams in an OPU

  • Automated online supervision
  • Examination Statement*

For Digital Campus Exams

  • Safe Exam Browser*
  • Privacy screens in front of each workstation
  • Privacy screens on the PC monitors
  • Display of the student's photo directly during the exam*
  • Room invigilators*

Note:

In the medium term, the "Digital Exams" application will replace learning activities as the official exam tool of WU Vienna for designing exams. From WS 22/23 onwards, exam design in an OPU will probably only be possible with the "Digital Exams" application. Please note the current qualification offer with regard to exams.

Administration of questions

You can either copy questions into the exam administration (see: How do I copy questions?) or create new questions there for the first time. To create new questions, proceed as follows:

  1. Click on "Exam administration" in the side menu.
  2. In the top menu, click on "New" and select the desired question type from the dropdown list.
  3. Fill in all mandatory fields and the optional fields as desired and click on "OK".
  4. Click on "Preview" to test the question from a student perspective.

The following basic settings must be made for each question:

Page name
The page name is a unique identifier for the question and is not displayed in the exam administration, e.g. "mc13a".

Title
Optional. The title is the name with which the question is displayed in the exam administration. The title of the question is not displayed to students during the exam itself. However, students may see the title of the question in the grade book and in the exam review (when the student submission review is opened). If you do not add a title, the page name will be used as the question's title.

Created by
The person who created the question is entered automatically. You can always adjust this entry or add further names within the scope of editing processes.

Minutes
Optional. Specify the expected duration of the question here. The sum of the minutes of all questions defines the time displayed in the countdown and is used as a starting point for calculating an extended time budget (e.g. for BeAble students). The number of minutes does not represent a time limit for answering a question. Note: if neither minutes nor points are entered, an error message appears when creating the grade book structure after the exam. The minutes display can be hidden during the exam.
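The relationship between question minutes, the countdown, and an extended time budget can be sketched as follows (an illustrative calculation only; the function name and values are hypothetical, not part of the platform):

```python
# Sketch of the countdown arithmetic described above (illustrative only;
# the question minutes are hypothetical values, not a real exam).

def countdown_minutes(question_minutes, time_budget_percent=100):
    """Total countdown = sum of all question minutes, scaled by the
    time budget (e.g. 150% for BeAble students)."""
    return sum(question_minutes) * time_budget_percent / 100

# 30 questions of 2 minutes each -> 60-minute countdown
assert countdown_minutes([2] * 30) == 60
# the same exam with a 150% time budget -> 90 minutes
assert countdown_minutes([2] * 30, time_budget_percent=150) == 90
```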

Points
Optional. Set the maximum achievable points for this question here. Depending on the question type, the points are calculated automatically or have to be entered manually in the digital examination form by the assessors. Note: if you do not enter any points, the minutes value will be used as the point value. The points display can be hidden during the exam.

2 columns
With this setting you determine whether the information is displayed above the answer options (default setting) or next to the answer options. Check "2-column" to position the information to the left of the answer options (see Example 2 columns activated, see Example 2 columns deactivated).

Feedback
Feedback fields can be viewed by students as part of the review process. Use these fields to include sample solutions or rubrics with the questions.

Language
Default setting is en_US (English). The language setting is only relevant for the question if you want to offer two versions of the same question in an exam (once in English and once in German). The language setting (en or de) is part of the unique name of the page and should not be changed once the question has already been assigned to an exam. 

The following options are available:

  • Select a question, e.g. to add it to the clipboard for a copying process.
  • Edit and preview a question.
  • Duplicate a question to adapt its content.
  • Check the change log and restore a previous version of this question, if you wish to do so.
  • Check or change the publishing status of the question. Only published questions can be selected for an exam.
  • Identify a question by its title. It is recommended to choose a title that is as meaningful as possible, e.g. by adding prefixes.
  • Check the last change of the question to keep track of the editing process.
  • Delete a question with a click on the trash icon.

You can copy questions into other communities (OPU, courses).

  1. In the left side menu, click on "Exam administration".
  2. Select one or several questions using the checkbox.
  3. In the menu bar, click on "Clipboard".
  4. In the drop-down menu, select "Add to clipboard".
  5. Change to the exam administration of the community in which you want to insert the content.
  6. In the clipboard, select "Insert content here...".

Question Types (A-Z)

Description

Question type: open
Assignment: students write code in a predefined programming language (Python, HTML or Java, as of August 2021) using a text field. The text field is optimized for entering code and consists of several lines (see Example).
Input/instruction: text, video, image, table
Assessment: manually, by the assessor


How to create this question

see Example

  1. Make all the basic settings (see above). 
  2. The time (minutes) entered automatically corresponds to the maximum points to be achieved.
  3. Enter the instructions under "Exercise Text".
  4. Enter html, python or java as "Question.interaction.language".
  5. Enter an optional "Feedback".
  6. Confirm all entries with "OK".

Description

Question type: diverse (open and/or closed)
Assignment: Any number of questions can be assigned by drag & drop to a shared input (e.g. a case description). The questions are presented together on one page  (see Example), but are independently graded. Students can enlarge or reduce the field with the instructions.
Input/instruction: for the case-based interaction: text, video, image, table. The format for specifying the assigned questions is diverse.
Assessment: diverse (automated and/or manually)


How to create this question

see Example

  1. Make all the basic settings (see above). 2 columns is already selected here as the recommended default display setting.
  2. Select which information should be presented together with the subquestion (see subquestions with all information, see subquestions without information):
    • Minutes display (the suggested minutes per sub-question).
    • Points display (recommended, the individual points per sub-question)
    • Title (recommended, the individual title of the subquestion)
  3. Enter the case description under "Exercise Text". This can be longer and include video and image elements.
  4. Drag and drop the questions that you want to present together with the input into the left selection field. Make sure that you do not drag any questions here that you would also like to use elsewhere in the exam; students would otherwise receive this question twice.
  5. Enter an optional "Feedback".
  6. Confirm all entries with "OK".

Description

Question type: closed
Assignment: with the question type "MC Interaction", you can create multiple choice questions with up to 15 answer alternatives (see Example).
Input/instruction: text, video, image, table.
Assessment: automatically, according to the selected grading scheme.


How to create this question

see Example

  1. Make all the basic settings (see above). 
  2. Enter the instructions under "Exercise Text". Also indicate here how many answers can be correct (from "none" to "all", depending on your choice and settings).
  3. Choose a "Grading Scheme" (see below).
  4. Enter an answer alternative in the "Answer Option" field and check the "correct" checkbox if the answer is correct. 
  5. Enter an optional "Feedback" for this answer alternative.
  6. Click "Add another" to create more answer alternatives. 
  7. Repeat steps 4-5 until you have created all answer alternatives. You are also welcome to enter more answer alternatives than should be displayed per question (see "Shuffle", below).
  8. Enter an optional "Feedback" for the whole question.
  9. Confirm all entries with "OK".

Shuffle [Level: Answer Alternatives]
Determine if a selection of all entered answer alternatives should be drawn at random, and if answer alternatives should be displayed scrambled.

  • None
    All students will see all answer alternatives entered for the question in a fixed order. The answer alternatives will not be scrambled.  
  • Per User [Recommended].
    The answer alternatives will be scrambled. If you enter a number in the "Show Max." box, then students are randomly shown the selected number of answer alternatives. If you leave the "Show Max" selection field blank, then all answer alternatives will be displayed. Note: the answer alternatives are drawn completely randomly, i.e. students can also get an MC exercise where none of the presented answer alternatives is correct.
  • Always [only for testing the exam].
    When pre-testing the exam, the selected number of answer alternatives is randomly displayed. The selection changes after each new call of the exam. Thus, the correct selection/presentation of all answer alternatives can be tested once in advance. 

Grading Scheme
All grading schemes are the same with regard to the following:

  • Each question is assigned a maximum number of points (max).
  • There is a number of correct (r) and incorrect (f) answer alternatives.
  • No negative points are assigned.

Ggw (= Equally weighted, default)

  • Each answer alternative has the same weighting and is interpreted as a single yes/no question.
  • The number of correctly checked answer alternatives minus half the number of incorrectly checked answer alternatives, divided by the sum of correctly and incorrectly checked answer alternatives: (r-f/2)/(r+f).
  • See Example

wi1 (=Standard with partial points, most commonly used)

  • For each correctly checked answer alternative, max/r points are awarded; for each incorrectly checked answer alternative, max/f points are deducted.
  • If a question has only one wrong answer alternative (and this was checked), half of the points are deducted, i.e. max/2 and not max/f.
  • See Example

wi2 (=Mixed assessment)

  • For each correctly checked answer alternative, max/r points are awarded; for each incorrectly checked answer alternative, max/f points are deducted.
  • If a question has only one wrong answer alternative (and this was checked), half of the points are deducted, i.e. max/2 and not max/f.
  • If a question has only one correct answer alternative, no partial points are awarded for this question. 
  • See Example

exact (=No partial points)

  • The maximum number of points can only be achieved if all answer alternatives of a question have been checked correctly.
  • As soon as one answer alternative has been checked incorrectly, 0 points are awarded for this question.
  • See Example
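The four grading schemes can be sketched in Python as follows. This is an illustrative reading of the verbal rules above, not the platform's actual implementation; in particular, the treatment of unchecked alternatives under "Ggw" (counting an unchecked wrong alternative as a correct yes/no decision) is our assumption:

```python
# Hedged sketch of the MC grading schemes described above.
# `correct` / `incorrect` / `checked` are sets of answer-alternative ids.

def grade_mc(scheme, max_points, correct, incorrect, checked):
    r, f = len(correct), len(incorrect)
    right_checked = len(checked & correct)      # correctly checked
    wrong_checked = len(checked & incorrect)    # incorrectly checked

    if scheme == "exact":
        # full points only for exactly the correct selection, else 0
        return max_points if checked == correct else 0

    if scheme == "ggw":
        # each alternative is a yes/no decision; a checkbox counts as
        # "right" if its state (checked/unchecked) matches the solution
        right = right_checked + (f - wrong_checked)
        wrong = (r + f) - right
        return max(0, max_points * (right - wrong / 2) / (r + f))

    if scheme in ("wi1", "wi2"):
        if scheme == "wi2" and r == 1:
            # only one correct alternative: no partial points
            return max_points if checked == correct else 0
        points = right_checked * max_points / r
        # special case: a single wrong alternative deducts max/2
        deduction = max_points / 2 if f == 1 else max_points / f
        points -= wrong_checked * deduction
        return max(0, points)                   # no negative points

    raise ValueError(scheme)
```

For example, with 4 points, correct alternatives {a, b} and wrong alternatives {c, d}, a student checking {a, c} would score 0 under "wi1" (one correct check earns 2, one wrong check deducts 2) and 1 under "Ggw" (two of four yes/no decisions right).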

Description

Question type: closed
Assignment: Students arrange given elements (text or numbers) in a vertical order (see Example). It is important to adhere to the correct order. Attention: all elements must be different. 
Input/instruction: text, video, image, table.
Assessment: automatically, according to the selected grading scheme.


How to create this question

see Example

  1. Make all the basic settings (see above). 
  2. Adapt the grading scheme if necessary (see below).
  3. Enter the instructions under "Exercise Text". Note that you also specify here how to rank (from top to bottom or from bottom to top).
  4. Enter the first element from the sequence under "Ordering Elements".
  5. Click on "add another" to make additional entries. Please note the following:
    • The order you enter here corresponds to the "correct" order.
    • No two elements should be identical (no duplicates).
  6. Enter an optional "Feedback". For example, you can show which order is correct because students cannot yet see this in the exam protocol. 
  7. Confirm all entries with "OK".

Grading scheme

  • exact
    No partial points. The question has been answered correctly if all elements have been positioned exactly in the correct order. Example 1234: Only the order 1234 is correct and scores points.
  • position
    Partial points are awarded for those elements that have been positioned exactly correctly. Example 1234: If you enter 1432, 2 of 4 elements are correctly positioned, i.e. half of the points are awarded.
  • relative
    Partial points are also awarded for elements that were not positioned in the exact correct position, but that were positioned at the correct distance from one another (e.g. one behind the other). The more relative neighbors that have been correctly positioned, the more partial points are awarded. Example 1234: If you enter 3412, the relative position of two pairs fits. Here, too, half of the points are awarded.
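The "exact" and "position" schemes can be sketched as follows, based on the examples above. "relative" is left out of the sketch because the text does not fully specify how its partial points are normalized; the function name and data are hypothetical:

```python
# Hedged sketch of two of the ordering grading schemes described above.

def grade_ordering(scheme, max_points, correct_order, answer):
    n = len(correct_order)
    if scheme == "exact":
        # all-or-nothing: only the exact order scores
        return max_points if answer == correct_order else 0
    if scheme == "position":
        # partial points per exactly correctly positioned element
        hits = sum(1 for i in range(n) if answer[i] == correct_order[i])
        return max_points * hits / n
    raise ValueError(scheme)

# Example 1234, answer 1432: 2 of 4 positions correct -> half the points
assert grade_ordering("position", 4, [1, 2, 3, 4], [1, 4, 3, 2]) == 2.0
assert grade_ordering("exact", 4, [1, 2, 3, 4], [1, 4, 3, 2]) == 0
```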

Description

Question type: closed
Assignment: with the question type "SC Interaction", single choice questions with up to 15 answer alternatives can be created (see Example).
Input/instruction: text, video, image, table.
Assessment: automatically: the points can only be achieved if the correct answer alternative is chosen.


How to create this question

see Example

  1. Make all the basic settings (see above). 
  2. Enter the instructions under "Exercise Text".
  3. Enter an answer alternative in the "Answer Option" field and check the "correct" checkbox if the answer is correct. Please note that it is technically possible to mark multiple answer alternatives as "correct" here. Since this is an SC question, please select only one checkbox. Students also only have the option to select a single answer alternative.
  4. Enter an optional "Feedback" for this answer alternative.
  5. Click "Add another" to create more answer alternatives. 
  6. Repeat steps 4-5 until you have created all answer alternatives. You are also welcome to enter more answer alternatives than should be displayed per question (see "Shuffle", below).
  7. Enter an optional "Feedback" for the whole question.
  8. Confirm all entries with "OK".

Shuffle [Level: Answer Alternatives]
Determine if a selection of all entered answer alternatives should be drawn at random, and if answer alternatives should be displayed scrambled.

  • None
    All students will see all answer alternatives entered for the question in a fixed order. The answer alternatives will not be scrambled.  
  • Per User [Recommended].
    The answer alternatives will be scrambled. If you enter a number in the "Show Max." box, then students are randomly shown the selected number of answer alternatives. If you leave the "Show Max" selection field blank, then all answer alternatives will be displayed. Note: the answer alternatives are drawn completely randomly, i.e. students can also get an exercise where none of the presented answer alternatives is correct.
  • Always [only for testing the exam].
    When pre-testing the exam, the selected number of answer alternatives is randomly displayed. The selection changes after each new call of the exam. Thus, the correct selection/presentation of all answer alternatives can be tested once in advance. 

Description

Question type: open or closed
Assignment: A composite question format, where up to 15 questions can be displayed combined with each other  (see Example). For each sub-question it can be determined whether students should answer using a number, a word, several words, several lines or a file. An indication of the expected answer format is then displayed for each sub-question. The sub-questions are not weighted.
Input/instruction: for the short text interaction and each sub-question: text, video, image, table, optionally file attachment. 
Assessment: There is a maximum point value of the whole short text interaction and no display of the points achieved at the level of sub-questions. The assessment varies:

  • If no conditions are entered for all sub-questions, the assessment is done manually, by the assessor/s (open). No more than the maximum score can be entered for all sub-questions together.
  • If conditions are entered for all sub-questions, the assessment is calculated automatically (closed). The sub-questions are weighted equally and the maximum point value can be reached.
  • If conditions are entered for some sub-questions and no conditions are entered for some other sub-questions, then only the point value of the closed question formats is calculated automatically. Assessors can manually overwrite this point value in the exam protocol and add the points from the open question formats (up to the maximum point value assigned to the short text interaction). 
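The three assessment cases above can be summarized in a small sketch. The data model is hypothetical and the equal weighting of sub-questions is taken from the all-closed case; how the platform weights a mix of open and closed sub-questions is not specified, so the returned value is only the automatically calculated starting point:

```python
# Hedged sketch of the short-text-interaction assessment cases above.
# Each sub-question dict has 'condition' (None = open sub-question) and,
# for closed ones, 'passed' (did the answer meet the condition?).

def auto_score(max_points, sub_questions):
    closed = [q for q in sub_questions if q["condition"] is not None]
    if not closed:
        return None                    # fully open: assessed manually
    # closed sub-questions weighted equally within the maximum
    share = max_points / len(sub_questions)
    # with a mix of open and closed sub-questions this value is only a
    # starting point that assessors may overwrite in the exam protocol
    return sum(share for q in closed if q["passed"])
```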

How to create this question

see Example

  1. Make all the basic settings (see above). 
  2. If required, set a random selection at the sub-question level (see "Shuffle", below).
  3. Enter the superordinate instructions under "Exercise Text". If the instruction applies to all sub-questions and is somewhat longer, select the 2-column display.
  4. Create the first sub-question: Enter a question-text in the "Sub-Question" field. 
  5. Define the format of the answer under "Answer" (see screenshot from exam):
    • Number: the field says "Number" and students will get an error message if they try to save an entry that does not contain a numeric value. Possible formats are integers and decimal numbers with a comma. The percent sign is not allowed.
    • Single Word: the field says "single word" but students can enter multiple words or numbers.
    • Multiple Words: the field says "Multiple words" but students can also enter a single word or number.
    • Multiple Lines: the field says "Multiple lines." For this, also specify in the "Lines" field how many lines should be displayed.
    • File Upload: Students will see the "Select file" button. 
  6. Under "Correct when..." you can insert conditions for the automatic calculation of the sub-question's assessment (see Automated Essay Scoring).
  7. In the "Solution" field, enter the feedback for the sub-question.
  8. Repeat steps 4-7 using "Add another" to create more sub-questions.
  9. Add "Substitution Values" to include variables in questions (currently a SWA-only feature).
  10. Enter an optional "Feedback" for the entire short text interaction.
  11. Confirm all entries with "OK".
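The "Number" format rule described in step 5 (integers and comma decimals accepted, percent sign rejected) could be checked like this. The platform's actual validation rule is not public; this regular expression is an assumption based on the description above:

```python
import re

# Assumed pattern: optional sign, digits, optional comma-decimal part.
NUMBER_RE = re.compile(r"^-?\d+(,\d+)?$")

def is_valid_number(entry: str) -> bool:
    """Illustrative check of the 'Number' answer format."""
    return bool(NUMBER_RE.match(entry))

assert is_valid_number("100")
assert is_valid_number("3,14")
assert not is_valid_number("50%")
```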

Shuffle [Level: Sub-Questions]
Determine if a selection of all entered sub-questions should be drawn at random, and if sub-questions should be displayed scrambled.

  • None
    All students will see all sub-questions entered for the short text interaction in a fixed order. The sub-questions will not be scrambled.  
  • Per User [Recommended].
    The sub-questions will be scrambled. If you enter a number in the "Show Max." box, then students are randomly shown the selected number of sub-questions. If you leave the "Show Max" selection field blank, then all sub-questions will be displayed.
  • Always [only for testing the exam].
    When pre-testing the exam, the selected number of sub-questions is randomly displayed. The selection changes after each new call of the exam. Thus, the correct selection/presentation of all sub-questions can be tested once in advance. 

Description

Question type: open
Assignment: Students write a text into a text field. The size of the text field can be specified by the number of answer lines and answer columns (see Example). These rows and columns only determine the size of the answer box, they have no influence on the maximum length of the answer.
Input/instruction: text, video, image, table, optionally file attachment.
Assessment: manually, by the assessor, with optional assessment support (see Automated Essay Scoring).


How to create this question

see Example

  1. Make all the basic settings (see above). 
  2. Enter the instructions under "Exercise Text".
  3. Define how large the text field should be displayed to students at the beginning:
    • The number of answer lines determines the height of the text field: Default 10.
    • The number of answer columns determines the width of the text field: Default 60.
  4. Attach a document (e.g. an article) under "Add file" if required. 
  5. Under "Correct when..." you can insert conditions to assist in the assessment (see Automated Essay Scoring).
  6. Enter an optional "Feedback".
  7. Confirm all entries with "OK".

Description

Question type: open
Assignment: students upload one or more files as a submission. You determine the maximum number of files allowed; there is no upper limit on this number. The format of the files is not restricted (single file or ZIP file).
Input/instruction: text, video, image, table, optionally file attachment.
Assessment: manually, by the assessor


How to create this question

see Example

  1. Make all the basic settings (see above). 
  2. Enter the instructions under "Exercise Text". Note that you also specify here how many documents are to be uploaded and in what form.
  3. If necessary, attach a document for processing the task under "Add file" (e.g. a document template for writing in). 
  4. Enter an optional "Feedback" (e.g. a rubric or a sample solution).
  5. Confirm all entries with "OK".

Assessment Aids

AES functions are available for automated assessment of sub-questions and as an assessment aid.

  • If you enter conditions for short text interactions, then you define content conditions that determine whether an answer is "correct" or "incorrect". If more than one condition is entered, then all conditions must be met for the answer to be graded as "correct". For each condition, you can select whether upper and lower case is ignored ("ignore case"). Example: short text interaction: settings and exam protocol
  • If you enter conditions for text interactions, then these act as an indicator of what content should be highlighted for assessors and how: correct content (condition "contains") is highlighted in yellow in the exam protocol, incorrect content (condition "contains not") in orange. Together with, for example, the information from the feedback fields, this should enable quick orientation in the text of the answers. Example: text interaction: settings and exam protocol.

Conditions for an automated assessment (of short text interactions) see Example

=

The answer has to be identical to a certain number or word:
= 100 (the answer must be the number 100)
= house (the answer must be the word "house")

>

The answer must be greater than a specific number: 
> 100 (the answer must be a number greater than 100)

>=

The answer must be greater than or equal to a specific number: 
>= 100 (the answer must be a number greater than or equal to 100)

<

The answer must be less than a specific number: 
< 100 (the answer must be a number less than 100)

<=

The answer must be less than or equal to a specific number: 
<= 100 (the answer must be a number less than or equal to 100)

one of

The answer must contain exactly one of the terms (not several): 
one of "§ 42 UrhG" §42 (the answer must be "§ 42 UrhG" or "§42")
 

Conditions for an automated assessment (of short text interactions) and for an assessment aid (of text interactions)

contains

The terms specified should appear in the answer and are highlighted in yellow:
contains Rechtsstaat Bundesverfassung (the terms "Rechtsstaat" and "Bundesverfassung" are highlighted in yellow)

contains not

The terms specified should not appear in the answer; if they do, they are highlighted in orange:
contains not DSGVO (the term "DSGVO" is highlighted in orange)
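The condition syntax above could be evaluated roughly as follows. This is an illustration of the documented rules, not the platform's actual code; the "ignore case" option and combining multiple conditions are left out for brevity:

```python
import shlex

def check(condition, answer):
    """Evaluate one 'Correct when...' condition against an answer."""
    if condition.startswith("contains not "):
        terms = condition[len("contains not "):].split()
        return not any(t in answer for t in terms)
    if condition.startswith("contains "):
        terms = condition[len("contains "):].split()
        return all(t in answer for t in terms)
    if condition.startswith("one of "):
        # quoted terms may contain spaces, e.g. "§ 42 UrhG"
        terms = shlex.split(condition[len("one of "):])
        return sum(t in answer for t in terms) == 1  # exactly one
    op, _, arg = condition.partition(" ")
    if op == "=":
        return answer.strip() == arg
    # numeric comparisons; comma decimals per the "Number" format
    value = float(answer.replace(",", "."))
    bound = float(arg.replace(",", "."))
    return {">": value > bound, ">=": value >= bound,
            "<": value < bound, "<=": value <= bound}[op]
```

For example, `check("> 100", "150")` is true, and `check('one of "§ 42 UrhG" §42', "laut § 42 UrhG")` is true because exactly one of the two terms appears in the answer.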

Sample solutions and rubrics in the feedback for the assignment can be viewed by students and assessors alike in the assessment log. In this way, assessors can also use them as a guide when assessing answers in open-ended questions.

You can filter the examination protocols according to various search settings. In addition, you can filter for questions within all examination protocols and use this feature to display the answers of all students (or a defined subset) to a specific question. The assessment based on selected questions facilitates and accelerates the assessment process.

Administration of exams

To create an exam, proceed as follows:

  1. In the left side menu, click on "Exam administration".
  2. In the top menu bar, click on "New" and select "Inclass Exam" from the dropdown list.
  3. Fill in all mandatory fields (see "exam settings" below) and the optional fields as desired and click on "Create Exam".

After clicking on "Create Exam", the exam is created but not yet published (= started); students do not yet see it (see Screenshot). As an admin, you will now see an overview of the selected settings and questions in the exam settings and can test the exam from a student perspective by clicking on the "Testrun" button. The "Restart" button takes you back to the exam administration.

Specify a name for the exam ("Name of Exam"). This name is displayed to you in the exam administration and to the students in the gradebook. Especially if you have several exams in a course, choose a descriptive name that is as meaningful as possible, for example "Exam Labor Law" or "Exam Labor Law Beable +100". Then make the detailed settings:


QUESTIONS (Detailed settings)

Questions
At "Questions", select the questions that should be assigned to the exam. To do so, drag the questions with the mouse from the "Candidates" area to the "Selection" area. Only questions that have already been created and published are displayed.

Randomized Item Order
Default setting "Yes". If selected, students will be provided with the exam questions individually in a random order.

Restrict Items
This setting can only be activated if all questions have exactly the same minutes and points values. Enter the number of questions that should be randomly displayed to students (from all questions assigned to the exam). Example: 100 MC questions with 1 minute and 2 points each are assigned to the exam. You enter the value 50. Students receive a (changing) random selection of 50 of the 100 questions. The exam has a total time value of 50 minutes and 100 points to be scored.
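The example above can be sketched with a random sample over a hypothetical question pool (illustrative only; the dict structure is not the platform's data model):

```python
import random

# 100 hypothetical questions worth 1 minute / 2 points each
pool = [{"id": i, "minutes": 1, "points": 2} for i in range(100)]

# "Restrict Items" = 50: each student gets a changing random subset
selection = random.sample(pool, 50)

total_minutes = sum(q["minutes"] for q in selection)   # 50
total_points = sum(q["points"] for q in selection)     # 100
assert (total_minutes, total_points) == (50, 100)
```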


DISPLAY OPTIONS (Detailed settings)

Spell check
Default setting "Yes". If selected, students can use the spell checker for questions with multi-line text fields.

Show minutes
Default setting "Yes". If selected, the recommended processing time per examination question is displayed to students.

Show points
Default setting "Yes". If selected, the score (points) that can be achieved is displayed to students. It is recommended not to deactivate this setting.

Show IP address
Default setting "Yes". If selected, students see their own IP address during the exam.

Countdown timer with audio alarm
Default setting "Yes". If selected, students can activate an audio alert on the exam countdown timer. This will generate a short audible beep 1 minute before the countdown expires to remind students to submit the exam on time. The option should be disabled for Digital Campus Exams (see example).


TIME MANAGEMENT (Detailed settings)

Exam Time Window
If a date and time have been entered for the time window of the exam, the exam is automatically released at the time entered and automatically closed again later. Even if a time window has been entered, the exam can still be released and closed manually at an earlier point in time. Attention: the total time of the exam questions determines the display of the countdown; the countdown is not based on the entered end time of the exam. Make sure that the total time entered for the questions corresponds to the total time window (e.g. 30 MC questions of 2 minutes each = countdown of 60 minutes, time window 14:00 - 15:00).

Synchronized Exam
Default setting "No". If set to "Yes", the countdown starts uniformly for everyone at the official start of the exam time (or when the exam is "opened"). If students enter the exam late, they will see a correspondingly reduced countdown (e.g. 30 minutes instead of 45). If the setting remains "No", the countdown always starts in full, regardless of a person's start time. Recommendation: change the setting to "Yes".
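The effect of the synchronized setting on a late entry can be sketched as follows (the function is hypothetical; the 45/30 minute figures follow the example above):

```python
# Sketch of the "Synchronized Exam" countdown behaviour described above.

def remaining_minutes(total, minutes_late, synchronized):
    if synchronized:
        # countdown runs from the official start for everyone
        return max(0, total - minutes_late)
    # unsynchronized: every student gets the full countdown
    return total

# 45-minute exam, student enters 15 minutes late
assert remaining_minutes(45, 15, synchronized=True) == 30
assert remaining_minutes(45, 15, synchronized=False) == 45
```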

Time budget
Default setting "100%". By adjusting the slider, the processing time of the exam can be increased to e.g. 125%, 150% or 200%. The processing time is based on the value entered for each individual examination task in minutes. This feature is intended for handling BeAble exams. 


Security (Detailed settings)

(Online Supervision)
Default setting "No". Important: Do not activate the online-supervision in the exam settings (here). Please activate it instead (and until further notice) at the community level of the OPU (cf. instructions).

Allow Cut and Paste
Default setting "Yes". If selected, students can insert content (text, numbers, code) into the exam via copy & paste (Ctrl + C / Ctrl + V).

Signature 
Default setting "No". If activated, a digital fingerprint is generated during submission, which enables assessors to compare the IP address of the person who worked on the exam with that of the person who submitted it.

After the exam has been published, students can access it immediately. Students then see the item "Online Exams" in the side menu and, below it, the name of the corresponding exam (see screenshot). Students can start working on the exam by clicking on it. There are two ways to activate a digital exam:

  • If a start time is entered for the exam, the exam is automatically published (and accessible for students) when this start time is reached. You do not have to do anything else. 
  • You can also activate the exam manually. To do so, select "Publish exam". This will publish the exam at a time of your choice. 

Analogous to the activation of the exam, you also have two options for closing the exam:

  • If an end time has been entered for the exam, it will be closed automatically when this end time is reached. You do not have to do anything else. 
  • You can also close the exam manually. To do so, select "Close exam". This will close the exam at a time of your choosing. 

Please note that even with a set start and end time, manual opening and closing of the exam remains possible. If the exam was closed too early, you can reopen it by clicking on "Open exam again". However, students who have already submitted the exam cannot work on it again.

While the exam is running, an overview page is displayed. Here you can see the selected settings and questions of the exam, as well as the remaining time. By clicking on a participant, you can view their current progress and send a message to one or all candidates.

To access the current exam:

  1. Click on "Exam administration"
  2. Select the exam that is currently in the state "published" and click on "edit"
  3. You will see the remaining time of the exam and have the option to activate an acoustic signal that indicates when time is up. You will also see the "Submitted Exams" info box (see Example).

Look at students' individual progress

  1. Under "Submitted Exams" click on "Participants". You will see all exam participants with their working state
    • "initial" (exam has been accessed but not yet saved as a draft)
    • "working" (exam has not yet been submitted)
    • "done" (exam has been submitted)
  2. You can also see whether (and how long ago) students have saved an answer/draft. This time is displayed under "Last Modified"
  3. "Seconds" shows how long a person has already been working on the exam. This entry updates each time a draft is saved. 
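The three working states above form a simple progression; modelled as an enum, purely for illustration (not platform code):

```python
from enum import Enum

class WorkingState(Enum):
    """Participant states as listed under "Submitted Exams"."""
    INITIAL = "initial"  # exam accessed, but no draft saved yet
    WORKING = "working"  # draft saved, exam not yet submitted
    DONE = "done"        # exam submitted

# A participant progresses INITIAL -> WORKING -> DONE
print([state.value for state in WorkingState])  # ['initial', 'working', 'done']
```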

Reset a student's processing status

If students leave the exam environment without handing in the exam, they can always open it again and continue working on it until the exam is finally handed in or closed. However, it can also happen that students accidentally click on "Submit" too early. You can enable these students to access the exam again using a workaround. To do this, proceed as follows:

  1. Under "Submitted exams", click on "Participants". Select the person with the status "done" and click on the participant abbreviation (hyperlink) on the left. This takes you to that person's exam log. 
  2. Under "History" at the top, click on the penultimate number; the last number is the submission version. Confirm your selection with "Use displayed version as submission". The person's status is now "in progress" again, and the student can continue working until the exam closes. Note that, depending on the state of the caches, resetting the submission version may also discard content changes made after the restored version was saved; take this into account during this step. 
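The rollback step above can be sketched as follows (a hypothetical model of the version history; the real exam log is managed by the platform):

```python
def reset_to_previous_version(history):
    """Restore the penultimate saved version as the submission (sketch).

    `history` is a chronological list of saved versions; the last entry
    is the submitted version. Restoring an earlier version reopens the
    exam, but changes saved after that version are lost.
    """
    if len(history) < 2:
        raise ValueError("No earlier version available to restore")
    return history[-2]

print(reset_to_previous_version(["draft 1", "draft 2", "submission"]))  # draft 2
```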

Send an individual message to one candidate

  1. Click on the name of one person in the state "working". A pop-up opens where you can enter a message (see Example)
  2. Define the urgency (low, medium, high). The urgency determines the colour of the message as displayed in the student's exam.
  3. Click on "Send" to send this message to the individual student.

Send a message to all candidates

  1. Click on "Send Messages to X Participants". A pop-up opens where you can enter a message.
  2. Define the urgency (low, medium, high). The urgency defines the colour of the message which will be displayed in the students' exams.
  3. Click on "Send" to send this message to all students who haven't submitted the exam yet.

To access an exam:

  1. Click on "Exam administration"
  2. Select the exam you are interested in and click on "edit"

Apart from all the grading options, here you can

  • See an overview of how many persons have submitted the exam (e.g. 57/61)
  • Find the "Listing of Filled-out Exams", an overview of all answers entered/selected per student (with CSV export option)
  • Access the general "Exam Protocol", an overview of all exam-related activities of each exam participant. Here you can access e.g. the final submission, all former revisions, and the current state of each question (submitted, not submitted).

After the exam has been closed, you can create the exam in the grade book on MyLEARN by clicking the "Create grade book structure" button.

  • Closed questions (MC interactions, SC interactions, Ordering interactions and Short Text Interactions with conditions) are automatically graded and entered in the grade book, but can be changed at any time in the exam protocol.
  • Open questions must be graded manually. To do this, click into the student's exam protocol after creating the grade book structure. Enter the points for each question in the empty field next to the term "Points". You can also add comments.

Points and comments are synchronized with the grade book. You can edit the entries in the gradebook (see Gradebook) and release them as you see fit, at the level of the whole exam or of individual questions. 

What can I do if questions need to be corrected?

  1. Click on "Exam administration".
  2. Select the exam in question and click on "Edit". The exam must be closed (status "done").
  3. Click on "Create grade book structure".
  4. Correct assessment settings for questions (e.g. markings as "correct" for SC and MC questions, conditions or assessment schemes). Of course, content changes must not be made afterwards. 
  5. Click on "Recalculate Score". 

Once the exam is closed, you can allow students to review the exam by clicking the button "Open Student Submission Review". This generates a link to the review.

Clicking on "Close Student Submission Review" closes the student inspection of the exam protocol.

None yet. A first implementation is still being worked on...

How do students navigate the exam?