
Digital Education team blog


Ideas and reflections from UCL's Digital Education team


Moodle STACK Quiz question type: deploying variants to avoid quiz crashing

By Aurelie, on 4 May 2022

Questions in STACK can contain randomly generated elements. Each student is given a random variant of the question, generated from a pseudo-random seed.
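Real STACK questions generate their random elements with computer-algebra (Maxima) code, but the effect of the seed can be sketched in plain Python; the question, parameters and ranges below are purely illustrative:

    import random

    def generate_variant(seed: int) -> dict:
        """Generate one variant of a simple expansion question.

        The same seed always yields the same parameters, which is how a
        pseudo-random seed ties a student to one particular variant.
        """
        rng = random.Random(seed)   # generator seeded independently of global state
        a = rng.randint(2, 9)       # illustrative question parameters
        b = rng.randint(1, 9)
        return {
            "prompt": f"Expand (x + {a})(x + {b}).",
            "model_answer": f"x^2 + {a + b}x + {a * b}",
        }

    # Deploying variants amounts to fixing a set of seeds in advance,
    # rather than letting each attempt draw a fresh one:
    deployed_seeds = [101, 202, 303]
    variants = [generate_variant(s) for s in deployed_seeds]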

Why deploy variants?

The tutor is strongly advised to pre-generate and “deploy” variants of a question. If variants are not pre-generated, Moodle is forced to generate them on the fly; for quizzes with larger numbers of participants this can cause the quiz to crash or freeze.
When a student attempts the question, they are given a random selection from the deployed variants.

Other reasons for deploying variants of a question:

  • STACK runs all the question tests on each deployed variant to establish that each variant of the question works. This aids quality control: with question tests in place, it is unlikely a student will be given a random variant which does not work correctly.
  • The tutor can decide whether each deployed variant appears to be of equal difficulty, and can easily delete variants they do not like.

Caution

  • If an author does not deploy any variants (not advised!) then the student gets an arbitrary random variant.
  • Questions that don’t use randomisation cannot be deployed explicitly. STACK automatically detects randomisation.

How to deploy question variants

The deployment interface can be found by editing a question and clicking on ‘Question tests & deployed variants’.

  1. The easiest way to reach the interface is to preview the question,
  2. then click the ‘Question tests & deployed variants’ link in the top right corner.
  3. Click ‘Deploy’ if the question is not already deployed.
  4. Next to ‘Attempt to automatically deploy the following number of variants’, enter the number of variants you would like and click ‘Go’.
    (Depending on the question and the content of its question note, the number of variants you can deploy will vary; if possible, deploy more than 30.)
    You can preview the results and either exclude variants, or return to the quiz question settings to revise the randomisation you have used in the question.
  5. Check the variants as required.
  6. The page shows the list of currently deployed variants, with links to undeploy all of them or a specific variant.
  7. Optionally, click ‘Run all tests on all deployed variants (slow)’ and check or undeploy any variants you do not want to use.

Limitations

There is currently no way to loop systematically over all variants and deploy them all.

Find more details and advice on using STACK question types in the M57 – STACK online assessment for mathematics and science guide.

Video assignments in Moodle

By Janice Kiugu, on 4 May 2020

To support alternative assessments, and in particular video assignments, a new Moodle plugin that allows the submission of video/media files is now available. The plugin is accessed within a Moodle assignment; the key additional step to ensure students can upload video files is to select the ‘online text’ submission option when setting up the assignment.

In the short term (May until late summer) the Lecturecast (Echo360) video submission plugin will be installed. Following that, the aim is to deploy the Mediacentral video plugin before the start of the 20/21 academic year; it will replace the Lecturecast plugin and provide a fuller, richer integration with the UCL media platform, Mediacentral.

The Lecturecast/Echo360 plugin is being installed first because the Mediacentral plugin’s integration with Moodle is more complex: it requires significantly more testing than the Echo360 plugin and cannot be deployed in time to support the forthcoming assessment period.

A key feature of the Echo360 plugin is that it facilitates the use of the Echo360 mobile application, which can be used to:

  • record and upload material from portable devices such as tablets and mobile phones.
  • view lecture materials, but only if the user has first accessed the course and recordings via their Moodle course page.

Note: The Echo360 mobile application can only be used with UCL-registered email addresses.

Support documentation and guidance are available for staff and students:

Video assignment guides

Echo360 Mobile app guides

Case study and additional resources

New E-Book on Assessment, Feedback and Technology

By Tim Neumann, on 1 November 2017

UCL Digital Education Advisory members contributed to a new Open Access e-book that provides valuable insight into the way technology can enhance assessment and feedback. The book was launched formally on 26th October by Birkbeck College Secretary Keith Harrison, with talks from the editors Leo Havemann (Birkbeck, University of London) and Sarah Sherman (BLE Consortium), three case study authors, and event sponsor Panopto.

Havemann, Leo; Sherman, Sarah (2017): Assessment, Feedback and Technology: Contexts and Case Studies in Bloomsbury. London: Bloomsbury Learning Environment.
View and download from: https://doi.org/10.6084/m9.figshare.5315224.v1


The Book

[Image: e-book cover]

The book is the result of a two-year project on e-assessment and feedback run by the Bloomsbury Learning Environment (BLE), a collaboration between five colleges, including the UCL Institute of Education, on issues around digital technology in Higher Education. It contains three research papers, which capture snapshots of current practice, and 21 case studies from the BLE partner institutions and beyond, including practice from wider UCL.

The three papers focus on

  • the use of technology across the assessment lifecycle,
  • the roles played by administrative staff in assessment processes,
  • technology-supported assessment in distance learning.

The case studies are categorised under the headings:

  • alternative [assessment] tasks and formats,
  • students feeding back,
  • assessing at scale,
  • multimedia approaches, and
  • technical developments.

Seven of the 21 case studies were provided by UCL Digital Education colleagues Jess Gramp, Jo Stroud, Mira Vogel (2), and Tim Neumann (3), reporting on examples of blogging, group assessment, peer feedback, assessment in MOOCs, student presentations at a distance, and the UCL-developed My Feedback Report plugin for Moodle.


Why you should read the e-book

[Photo: BLE e-book launch event]

As one of the speakers at the entertaining launch event, I suggested three reasons why everybody involved in Higher Education should read this book, in particular the case studies:

  1. Processes in context:
    The case studies succinctly describe assessment and feedback processes in context, so you can quickly decide whether these processes are transferable to your own situation, and you get a basic prompt on how to implement the assessment/feedback process.
  2. Problems are highlighted:
    Some case studies don’t shy away from raising issues and difficulties, so you can judge for yourself whether these difficulties represent risks in your context, and how these risks can be managed.
  3. Practical tips:
    All case studies follow the same structure. If you are in a hurry, make sure to read at least the Take Away sections of each case study, which are full of tips and tricks, many of which apply to situations beyond the case study.

Overall, this collection of papers and case studies on assessment and feedback is easily digestible and contributes to an exchange of good practice.


View and Download the Book

The e-book is an Open Access publication, freely available from the links below.

For further information, see ble.ac.uk/ebook.html, and view author profiles at ble.ac.uk/ebook_contributors.html


About the BLE:
The Bloomsbury Learning Environment is a collaboration between Birkbeck, London School of Hygiene and Tropical Medicine (LSHTM), Royal Veterinary College (RVC), School of Oriental and African Studies (SOAS), UCL Institute of Education (IOE), and the University of London with a focus on technologies for teaching and learning, including libraries and administration.
See www.ble.ac.uk for more information.

HeLF – Electronic Management of Assessment (EMA) – 18th June 2013, Falmer

By Martin Burrow, on 21 June 2013

Some thoughts and notes on Tuesday’s meeting.


The first presentation was an overview of the HeLF EMA survey, a poll of HeLF members about where they thought their institutions are now and where they will be.

(The slides are available at http://www.slideshare.net/barbaranewland/ and the quantitative data at http://w01.helfcms.wf.ulcc.ac.uk/projects.html.)

It was noted that this type of survey only captures respondents’ ‘best guesses’ about what is going on – more a confirmation of expectations than hard data. The main point to note was that very few institutions had an institution-wide policy on e-assessment. The survey split e-assessment into component parts – e-submission, e-marking, e-feedback and e-return – and it was generally agreed that this was a good thing, because each has its own requirements and challenges.

There was not much mention of the drivers for moving to EMA, but the predominant factor was student expectations (National Student Survey results were mentioned). There was no great clamour from the staff side, and I did get the feeling this was one of those things being pushed by the techies.

Those working on implementing EMA were doing process mapping, both to benchmark what was going on and to inform any policies that were written. The four areas mentioned above were split into constituent steps, and these were mapped to the range of ways/technologies that could be used to complete them – done both for ‘as it stands now’ and ‘where we would like to move to’. This process mapping was generally done on a school-by-school basis. The resulting data looked pretty useful and would definitely be a starting point for anyone wanting to pilot or encourage EMA.

Discussion about institutional policy revolved around the level at which it was appropriate to set it (institution, department, school, etc.), where it should sit on the restrictive/encouraging balance, how IT systems integrate with manual/paper-based systems, and – probably easiest of all – how it should deal with IT system failures: fall-back processes, extensions and so on.

There was lots of talk about the difficulties in encouraging e-marking, with plenty of evidence of markers preferring paper-based marking. My personal take is that if you enforce e-submission, e-feedback and e-return, you can leave the marking (notice I didn’t say e-marking) as a ‘black box’ component, up to the personal preference of individual markers – with the caveat that however they choose to mark, their output (grades, feedback, etc.) has to be entered back into the system in electronic format. Ways mentioned to encourage e-marking were the allocation of hardware (iPads, large or second PC monitors) and extended time periods for marking. There was no evidence that any of these had either a large or widespread effect on the uptake of e-marking.

Other points to note were that students were very keen on marking/feedback within a published rubric/schema system, and that using such a system also eased the burden on the markers’ side. Some institutions (University of the Arts) were introducing cross-department, generic marking criteria that could apply to different subjects.

Also, on the wish-list side, there was demand from staff and students for a tool showing all of a student’s feedback for their whole time at the institution, across all courses and submission points.

All in all, it was a nicely informative little session, well worth attending.

[Image: desk with paper, from ralenhill on Flickr]


Santa uses Grademark.

By Domi C Sinclair, on 20 December 2012

Have you ever wondered how Santa manages to grade the naughty and nice list so fast? Well, the answer is technology! Just like many academic staff, he uses Grademark, and very efficiently at that.

The text accompanying the video, posted by Turnitin on the video sharing site Vimeo, reads:

‘Every December, millions of children around the world write letters to Santa, explaining how they’ve been good boys and girls and letting him know what they want to see under their trees come December 25th.

Over the years, the number of kids sending him letters skyrocketed. His mailbox was flooded and he found himself buried in letters, unable to respond to all of them.

One day, a little elf told Santa about Turnitin—how he could use it to accept submissions from the children, check the letters for originality, give immediate feedback, and even use rubrics to help determine if they’ve been naughty or nice. So he gave it a shot.

Share this video with your colleagues, especially the ones that look like they’ve been in an avalanche of essays.’

Watch the video and see how Santa does it.

How Santa grades millions of Christmas letters

Certainty Based Marking Webinar

6 April 2011

Emeritus Professor Tony Gardner-Medwin gave a Webinar presentation on Wednesday 6th April about using Certainty Based Marking (CBM) for both formative self-tests and summative e-exams.

This type of assessment helps students to understand which areas of a topic they really do know and which they need to work on, by asking them to choose, on a three-point scale, how confident they are that their answer is correct.

Questions they answer correctly and with high certainty score the most points, while those they answer correctly with low certainty score fewer points. Questions they get wrong are negatively marked in a similar fashion.

The scoring method is best demonstrated in the following table:

Certainty level                 No reply   C=1   C=2   C=3
Mark if correct                    0         1     2     3
Penalty if incorrect (T/F Q)       0         0    -2    -6
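As a minimal sketch in Python, assuming the true/false scheme shown in the table, the marking rule is:

    def cbm_mark(correct: bool, certainty: int | None) -> int:
        """Certainty Based Marking for a true/false question.

        certainty is None for no reply, or 1-3 (low to high confidence);
        marks and penalties follow the table above.
        """
        if certainty is None:
            return 0                              # no reply scores nothing either way
        if correct:
            return {1: 1, 2: 2, 3: 3}[certainty]  # higher certainty, higher reward
        return {1: 0, 2: -2, 3: -6}[certainty]    # higher certainty, harsher penalty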

When used formatively, this lets students review their marks and focus on the material where they were either unsure of an answer, or confident of their answer but incorrect.

Certainty Based Marking can also be used for exams. Evidence has shown that exam results evaluated using CBM closely match (tending to be slightly higher than) the scores students would have received under traditional correct/incorrect marking. This is easy to compare, because each CBM exam result can be marked in both ways.
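The dual marking can be sketched by scoring the same set of responses both ways, reusing the cbm_mark function above; the responses here are invented for illustration:

    responses = [  # (answered correctly?, certainty chosen) -- invented sample data
        (True, 3), (True, 2), (False, 1), (True, 3), (False, 2),
    ]

    cbm_total = sum(cbm_mark(ok, c) for ok, c in responses)
    conventional_total = sum(1 for ok, _ in responses if ok)  # one mark per correct answer

    print(f"CBM score: {cbm_total}; conventional score: {conventional_total}")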

Find out more about Certainty Based Marking here: http://www.ucl.ac.uk/lapt

The CBM Webinar presentation will shortly be available here: http://transformingassessment.com

UPDATE: The Webinar and related materials are now available from here: http://www.ucl.ac.uk/~ucgbarg/pubteach.htm