Digital Education team blog

Ideas and reflections from UCL's Digital Education team
Archive for the 'e-Assessment' Category

Randomising Questions and Variables with Moodle Quiz

By Eliot Hoving, on 8 December 2020

One of the strengths of Moodle Quizzes is the ability to randomise questions. This feature can help deter student collusion.

There are several ways to randomise questions in a Moodle Quiz, which can be combined or used separately. Simple examples are provided here, but more complex questions and variables can be created.

Randomising the Question response options

It’s possible to shuffle the response options within many question types, including Multiple Choice and Matching questions. When setting up a Quiz, simply look under Question behaviour and change Shuffle within questions to Yes.

Randomising the order of Questions

You can also randomise the order of questions in a Quiz. In your Quiz, click Edit Quiz Questions, then tick the Shuffle box at the top of the page. Questions will now appear in a random order for each student.

Randomising the Questions

It’s possible to add random questions from pre-defined question Categories. Think of Categories as containers of Quiz questions. They can be based on topic area, e.g. ‘Dosage’, ‘Pharmacokinetics’, ‘Pharmacology’, ‘Patient Consultation’, or they can be based on questions for a specific assessment, e.g. ‘Exam questions container 1’, ‘Exam questions container 2’, ‘Exam questions container 3’.

The first step is to create your Categories.

Then, when adding questions to your Quiz, select Add a random question and choose the Category. You can also choose how many random questions to add from the Category.

Under the Add a question option in Moodle Quiz, you can select Add a random question.

For example, if you have a quiz of 10 questions and you want to give students a random question out of 3 options for each slot, you would need 10 Categories, each holding 3 questions, e.g. ‘Exam Q1 container’, ‘Exam Q2 container’ … ‘Exam Q10 container’.

Alternatively, if you want a quiz with 10 questions from ‘Pharmacokinetics’ and 10 from ‘Pharmacology’, you could create the two Categories with their questions, then go to your Quiz, add a random question, select the ‘Pharmacokinetics’ Category, and choose 10 questions. Repeat for ‘Pharmacology’. You now have a 20-question quiz made up of 50% Pharmacokinetics and 50% Pharmacology questions.

After saving your random question(s), you can add further random questions, or add regular questions that will appear for all students: simply add a question from the question bank as normal.

Be aware that randomising questions will reduce the reliability of your Moodle Quiz statistics. For example, the discrimination index is calculated on the Quiz question slot overall (e.g. Q2), not on each variation that may have been randomly selected (i.e. each of the questions in the Exam Q2 container). Each question variation will receive fewer attempts than if it had been given to all students, so any analytics based on those attempts will be less accurate.
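
To put numbers on it: if 90 students sit the quiz and Q2 is drawn from a container of 3 variants, each variant is attempted by roughly 30 students, a third of the data you would otherwise have for judging how that question performs.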

Randomising variables within Questions

In addition to randomising questions, certain question types can have randomised variables within them. 

The STACK question type supports complex mathematical questions, which can include random variables. For example you could set some variables, a and b, as follows:

a = rand(7)  (where rand(7) takes a random value from the list [0,1,2,3,4,5,6], since STACK’s rand(n) returns an integer from 0 to n−1).

b = rand(3)  (where rand(3) takes a random value from the list [0,1,2]).

Variables can then be used within questions, so students could be asked to integrate a×x^b, which thanks to my random variables will generate 21 different questions for my students, e.g. integrate 0×x^0, 0×x, 0×x^2, x^0, x, x^2, 2x^0, 2x, 2x^2 … 5x^0, 5x, 5x^2, 6x^0, 6x, 6x^2.

Random variants can be generated, tested, and excluded if they are inappropriate. In the above case I might exclude a = 0, as the question expression would evaluate to 0, whereas I want students to integrate a non-zero algebraic expression.
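
As a minimal sketch, assuming STACK’s Maxima-based question variables (the variable names here are illustrative), the example above, with a = 0 excluded, might be set up as:

    a : rand(6) + 1;          /* rand(6) gives 0 to 5, so a takes the values 1 to 6, excluding 0 */
    b : rand(3);              /* b takes the values 0, 1 or 2 */
    expr : a*x^b;             /* the expression students are asked to integrate */
    ta : integrate(expr, x);  /* the teacher's model answer, used for grading */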

The Calculated question type also supports randomised variables and suits basic calculation questions for maths and science assessment. Calculated questions can be free entry or multiple choice. For example, you could ask students to calculate the area of a rectangle. The width and height would be set to wild card values, let’s call them

{w} for width, and

{h} for height.

The answer is always width × height, or {w} × {h}, regardless of the values of {w} and {h}. Moodle calls this the answer formula.
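
In the answer formula field, this is entered using the wild cards themselves, for example {w}*{h}; check the Calculated question documentation for the exact syntax your Moodle version expects.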

The tutor then sets the possible values of {w} and {h} by creating a dataset of values for Moodle to randomly select from. To create your dataset, you first define your wild card ranges, e.g. {w} will take some value between 1 and 10, and {h} some value between 10 and 20. You can then ask Moodle to generate sets of your variables, e.g. 10, 50 or 100 possible combinations of {w} and {h}, based on the conditions you define. For example, given the conditions above, I could generate the following 3 sets:

Set 1: {w} = 1, {h} = 14 

Set 2: {w} = 6.2, {h} = 19.3 

Set 3: {w} = 9.1, {h} = 11 
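
Whichever set a student receives, the same answer formula grades it correctly: Set 1 gives 1 × 14 = 14, Set 2 gives 6.2 × 19.3 = 119.66, and Set 3 gives 9.1 × 11 = 100.1.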

Creating a dataset can be somewhat confusing, so make sure you leave enough time to read the Calculated question type documentation and test it out. Once complete, Moodle can provide students with potentially hundreds of random values of {w} and {h} based on your dataset. Using the answer formula you provide, Moodle can evaluate and automatically grade the student’s answer regardless of which random values they are given.

Try a Randomised Quiz

To learn more, take an example randomised quiz on the Marvellous Moodle Examples course.

Speak to a Learning Technologist for further support

Contact Digital Education at digi-ed@ucl.ac.uk for advice and further support.

Improve your mathematics and science quizzes with STACK

By Eliot Hoving, on 11 June 2020

The STACK question type is now available in UCL Moodle Quizzes. STACK allows for rigorous mathematical assessment. Until now, mathematical questions often needed to be multiple choice questions, but with the STACK question type, students can enter mathematical responses directly into Moodle.

Figure 1: Students input equations directly into Moodle, and can see a preview before they submit.

STACK questions can have multiple parts, and each part can be evaluated separately. STACK questions can also include randomly generated components, making it much easier to create a range of practice questions and deterring student collusion during a quiz.

The feedback options for staff are dramatically enhanced. Student responses can be evaluated against a series of tests, with different feedback and grading returned to students based on the test outcomes. For example, a student’s response could automatically be graded as 1 if it is algebraically equivalent to the correct answer, but lose 0.1 if it is not properly factorised, for a total mark of 0.9. There are many more tests as well.
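
As a rough sketch of how that example might be configured (grading in STACK is set up through its potential response tree editor rather than written as code; AlgEquiv and FacForm are two of STACK’s standard answer tests):

    /* Node 1: AlgEquiv(ans1, ta) -- is the student's answer algebraically
       equivalent to the teacher's answer ta? If not: mark = 0, with feedback. */
    /* Node 2: FacForm(ans1, ta, x) -- is the answer properly factorised?
       If yes: mark = 1. If not: deduct 0.1, giving 0.9, with tailored
       feedback explaining the factorisation. */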

Figure 2: Student feedback can include tailored responses, equations and even graphical plots.

To learn more about the STACK question type, see the STACK Moodle user guide.

If you are interested in receiving support to introduce STACK into your Moodle quizzes, please contact digi-ed@ucl.ac.uk.

STACK training workshops (various) – sign up now!

The team behind the STACK question type are currently offering training on STACK from Monday 15th June 2020, 10:00-12:00 BST. Reserve your place.

Demonstration course now available at UCL

A demonstration course, which includes multiple question types created in STACK for you to test, analyse and adapt, is now available on UCL Moodle. The course is based on the excellent STACK demonstration course provided by the creators of STACK.

To get access, please contact digi-ed@ucl.ac.uk.

Images courtesy of the STACK Documentation page.

 

Marking 24 hour exams

By Steve Rowett, on 5 May 2020

This blog post has been re-made for exams in 2021 – visit the Marking centrally managed exams in 2021 blog post.


Please note this post is being regularly updated with additional resources.
+ New on Friday 8 May: Guide to online marking from Mary Richardson
+ New on Friday 8 May: Microsoft Drawboard PDF demo from Dewi Lewis, UCL Chemistry
+ New on Friday 15 May: Updated details on Microsoft Drawboard PDF

The move to online 24 hour assessments that replace traditional exams presents a challenge for those who have to grade and mark the work.

We start from a place of two knowns:

  • Students are submitting work to Turnitin in Moodle during the 24 hour window; and
  • Final grades need to be stored in Portico, our student records system.

But in between those two endpoints, there are many different workflows by which marking can take place. These are set out in UCL’s Academic Manual but encompass a range of choices, particularly in how second marking is completed. One key difference from regular coursework is that this is not about providing feedback to students, but about supporting the marking process, the communication between markers and the required record of the marking process. At the end of the marking process, departments will need to ensure that scripts are stored securely but can be accessed by relevant staff as required, much in line with the requirements for paper versions in previous years.

Neither SRS nor Digital Education mandates any particular way that marking should take place, and there is considerable flexibility for departments to use the processes that work best for them. So we are suggesting a menu of options which provide a basis for departments to build on if they so choose. We are also running daily training sessions, which are listed at the foot of this post.

The menu options are:

  • Markers review the scripts and mark or annotate them using Turnitin Feedback Studio.
  • Digital Education will provide PDF copies of scripts for departments to annotate using PDF annotation software on a computer or tablet device.
  • Markers review the scripts using Turnitin Feedback Studio, but keep a ‘marker file’ of notes and comments on the marking process.
  • Markers print the scripts and mark them, then scan them for storage or keep them for return to the department on paper.

The rest of this post goes into these options in more detail.


Turnitin Feedback Studio

Turnitin Feedback Studio provides a web-based interface where comments can be overlaid on a student’s work. QuickMarks provide a bank of frequently used comments that can simply be dragged and dropped onto the work. In addition, the traditional Turnitin Similarity Report is also available. This method probably works best for text-based documents like essays and reports. Turnitin is integrated into Moodle and set up for you as part of the exam process for students to submit their work, but it’s your choice whether you wish to use the marking tools available after the work has been submitted. The Turnitin submission boxes have been set up for you, and we ask that you don’t change the settings or set up any grading templates to mark student work before submission, as this could prevent students from submitting.

You can also allocate marks using grading forms or rubrics. On the whole we think these could be a bit of a ‘sledgehammer to crack a nut’ solution for a single paper, but they are an option if you are familiar with them and have a more granular set of marking criteria for each question. We recommend hiding the assignment before adding the grading form or rubric so that students cannot see it.

If you want to know whether this method is for you, you can watch a short video demo or try marking up an example paper provided by Turnitin. A video tailored to UCL’s 24 hour exam process is given below. This video has captions.

Things to think about with this approach:

  • Rubrics and grading forms take a little setting up, and are probably best used where you have previous experience with them.
  • In some exams it is common to put a mark (e.g. a tick) on each page to indicate that the page has been read. To replicate this you might define a QuickMark called ‘page read’ and put it on each page, or annotate with the same words.
  • The marked paper often becomes a resource to go back to if there are any errors or omissions in the grading process. You might wish to write the marks on the paper using the annotation tools or in the general feedback area, and also lodge them in a spreadsheet for uploading to Portico.
  • Turnitin does not support double blind marking effectively. It is rarely used for paper-based exams (since the second marker could always see the markings of the first marker on the paper), but if it is needed, one marker could mark online and the second could download the papers for offline marking (e.g. the ‘marker file’ method below).

You can view additional guidance on using Turnitin Feedback Studio.


Annotation using PDF documents

Where your annotation needs are more sophisticated, or you want to ‘write’ on the paper using a graphics tablet or a tablet and pencil/stylus, this option may suit you better.

Upon notification (via the notification form), Digital Education will supply your department with PDF copies of the students’ work, uploaded to a OneDrive account set up by your department.

For this to happen, Exam Liaison Officers / Teaching Administrators will need to set up a OneDrive folder and notify Digital Education that they wish to have PDF copies of the files. We have a video tutorial (with captions) on this process below.

You can then use tools you already have or prefer to use to do your marking. There is more flexibility here, and we will not be able to advise and support every PDF tool available or give precise instructions for every workflow used by departments, but we give some examples here.

Marking on an iPad using OneDrive

Many staff report that an iPad with Apple Pencil or an Android tablet with a stylus makes a very effective marking tool. The Microsoft OneDrive app supports both platforms and provides rapid access to scripts and some annotation tools, as shown in the video below (which also has captions). The OneDrive app is free, and connects to your UCL OneDrive account via Single Sign On.

There’s further guidance from Microsoft on each individual annotation tool.

The Apple Files app can also connect to OneDrive and has a similar (and perhaps more powerful) annotation tool. Thanks to David Bowler for mentioning this in the first comment on this blog post.

Marking on a Mac using Preview

Preview on a Mac is often taken for granted but is actually quite a sophisticated tool and includes some basic annotation functions. Here is some guidance from Apple on using it.

Marking on a PC or Surface Pro using Microsoft Drawboard PDF

Microsoft Drawboard PDF is a very comprehensive annotation tool, but is only available for Windows 10 and is really designed to be used with a Surface Pro or a desktop with a graphics tablet. Dewi Lewis from UCL Chemistry has produced a video illustrating the annotation tools available and how to mark a set of files easily. UCL does not have a site-wide licence for Drawboard PDF, but it is available at a very modest price if departments choose to buy it.

Marking on a PC, Mac or Linux machine using a PDF annotation program

Of course there are plenty of third-party tools that support annotating PDF documents. Some require payment to access the annotation facilities (or to save files that have been annotated), but two that do not are Xodo and Foxit PDF.

Things to think about with this approach:

  • Your marking process: if you use double blind marking you might need to make two copies of the files, one for each marker. If you use check marking then a single copy will suffice.
  • You will need to ensure the files are stored securely and can be accessed by the relevant departmental staff in case of any query. You might share the exam submission files with key contacts such as teaching administrators or directors of teaching.
  • Some of the products listed above have a small charge, as would any stylus or pencil that staff would need. These cannot be supplied centrally, so you may need a process for staff claiming back the costs from departments.

Using a ‘marker file’

Students’ scripts are accessed using Turnitin in Moodle, which allows all the papers to be viewed online individually or downloaded in one go. A separate document is then kept (either one per script, or one overall) containing the marks and marker comments for each script. If double-blind marking is being used, two such documents or sets of documents can be kept, one per marker.


Printing scripts and marking on paper

Although we have moved to online submission this year, colleagues are still welcome to print documents and mark on paper. However, there is no central printing service for completed scripts, so printing would have to be managed individually or locally by departments.


The evidence about marking online

In this video Dr Mary Richardson, Associate Professor in Educational Assessment at the IOE, gives a guide to how online marking can differ from paper-based marking and offers some tips for those new to online marking. The video has captions.


Training sessions and support

Digital Education will be running daily training sessions for teachers covering the ground in this blog post. These will run at 12-1pm every weekday from Tuesday 12 May.

No booking necessary.

We are also providing additional support for students during the exam period. Our support hours will be (UK time):

  • Monday 11.30am-8.30pm
  • Tuesday-Thursday 8am-8pm
  • Friday: 7.30am-3.30pm

Details of support mechanisms are given in the exam section on each Moodle module where an exam is taking place.

Late Summer Assessments in Moodle

By Anisa Patel, on 29 May 2019

This year (18/19) sees a significant increase in, and change in practice to, the late summer assessments process (https://www.ucl.ac.uk/academic-manual/recent-changes/late-summer-assessments-2018-19).

To facilitate this change, the new 18/19 Snapshot will, however, remain read/write until 20th September, for the reasons detailed below.

Please note: if you have any assessment due after 20th September 2019, this will NOT apply to you. For example, if you run a Masters Dissertation Module every year, this does not fall under this category and you should carry on with your existing process by using live Moodle 19-20 for your students’ submissions.

Change in Practice of Moodle Use and Late Summer Assessments

To accommodate the changing combination of required end-of-year tasks alongside the late summer assessments, we now request that all late summer assessments take place within the 18/19 Moodle Snapshot, which will be created on 26th July 2019.

We are asking you to follow this guidance for the following reasons:

  • All associated course content and student/cohort data will remain consistent and associated with the correct Moodle snapshot (in this case 18/19)
  • Completing late summer assessment within the 18/19 Snapshot allows all the “live” Moodle courses to be reset and normal end of year course activities to take place from the 29th July, so course teams can begin immediately on 19/20 courses
  • Additional Moodle course creation is kept to a minimum within the live Moodle instance, and aids in Moodle housekeeping activities (reducing dead/unwanted courses, improving long term database performance).

What are we doing to facilitate this change?

  • The Moodle 18/19 Snapshot will remain read/write until the 20th September 2019 (2 weeks after the final assessment date for late summer assessments)
  • Digital Education will retain the current two-Moodle selection/landing screen; however, it will be re-purposed to direct students to the Snapshot Moodle for late summer assessments
  • Digital Education will create a global banner within “live” Moodle directing students to the Snapshot for the duration of the late summer assessment period
  • Digital Education will place other redirection adverts/links within “live” Moodle to highlight to students that late summer assessment activities can be found within the 18/19 Snapshot

How can you prepare for Late Summer Assessments?

If you wish to prepare courses that will have late summer assessment requirements in advance, we recommend the following:

Signpost in your course that students should be using that Moodle page to complete their late summer assessments.

Within any course where late summer assessments will be taking place create a hidden section and place any material or submission points within that section. This can be done in the current “live” Moodle up until the 26th July 2019 as preparation. Alternatively, it can be done within the 18/19 Snapshot which will be available on the 27th July 2019.

When you are ready to make late summer assessment material/submission points available simply unhide the section within the course on the 18/19 Snapshot.

Details on how to create and hide sections within Moodle can be found in the following miniguide: https://wiki.ucl.ac.uk/x/RgxiAQ

For any questions regarding Moodle and Late Summer Assessments please email digi-ed@ucl.ac.uk.

We also have a page which has some commonly asked questions about Late Summer Assessments which may help:

https://wiki.ucl.ac.uk/display/MoodleResourceCentre/Late+Summer+Assessments+-+2019

 

Turnitin and Moodle Assignment training

By Eliot Hoving, on 7 February 2019

The Digital Education team is running two new training courses: Hands on with Turnitin Assignment and Hands on with Moodle Assignment. Each session is practical: from both a staff and a student perspective, you will experience the process of submitting, marking, returning marks, engaging with feedback and managing records. Both courses are applicable to Tutors and Course Administrators new to online marking or needing to refresh their knowledge.

Register through the HR Single Training Booking System or follow the links below:
Hands on with Turnitin
Hands on with Moodle Assignment

Email digi-ed@ucl.ac.uk for more information or to inquire about specific training for your Department.

Jisc student digital tracker 2017 and BLE consortium

By Moira Wright, on 10 August 2017

UCL participated in the 2017 Jisc Digital Student Tracker Survey as part of a consortium with the Bloomsbury Learning Environment (BLE), made up of SOAS, Birkbeck, LSHTM and RVC. 74 UK institutions ran the tracker, collecting 22,593 student responses, while 10 international universities collected an additional 5,000 student responses.

We were the only consortium to participate in the survey. We had come together because institutional surveys, such as the National Student Survey, meant the time available to run the tracker independently was short (a month), and we therefore felt that our individual sample sizes would be too small. We treated the survey as a pilot and advertised a link to it on each College’s Moodle landing page, with some promotion via social media and the Student Unions. The survey generated 330 responses, which given our constraints was far more than we expected.

The survey covers several broad areas, including digital access, digital support and digital learning. Most questions were quantitative, but four open questions produced qualitative data. We were also able to add two questions of our own: we selected e-assessment, since that was a previous shared enhancement project (see www.bloomsbury.ac.uk/assessment), and Moodle, since all members of the consortium use the platform as their Virtual Learning Environment (VLE).

Once the survey closed and we had access to the benchmarking report, we ran a workshop for representatives from each of the Colleges in July 2017, at which the responses to the survey’s open questions were analysed in institutional groups, facilitating interesting discussions about commonalities and potential implications.

Sarah Sherman, the BLE Manager, and I have been working to produce a report which examines our collective responses in comparison with the national survey population, with a recommendation that individual Colleges analyse their own results in more detail. For confidentiality, each College will be presented with a version of this document containing the relevant data for their institution only, not the complete BLE data set. A disadvantage of the consortium approach was that we were not able to benchmark individual Colleges against the survey population, as resources did not allow for this. In future, the participating Colleges may wish to run the survey individually rather than as part of a collective, as deeper analysis was not possible with this data set.

Although the sample size collected by the Bloomsbury Colleges was small and not statistically robust, there is much we can extract and learn from this exercise. For the most part, our collective responses fell within the margins set by the national survey population, suggesting we are all at a similar stage in our students’ digital capability and development.

You will have to wait for the full report for more information on the UCL data collected, but to whet the appetite you can see the key findings from Jisc in the two-page report Student digital experience tracker at a glance.

Finally, you can see this collection of case studies, which features the Bloomsbury Colleges consortium, here.

Please get in touch with me if you would like to get involved (moira.wright@ucl.ac.uk).

Sarah Sherman and Moira Wright

Jisc/NUS student digital experience benchmarking tool

Jisc guide to enhancing the digital student experience: a strategic approach