Digital Education team blog

Ideas and reflections from UCL's Digital Education team

Managing mark release during the marking and assessment boycott

By Marieke Guy and Zaman Wong, on 13 June 2023

This post gives useful information for admin staff on how to manage mark release and mark upload during the Marking and Assessment boycott.

Using AssessmentUCL/Wiseflow

Step 1: Identifying students who have and have not been marked

1.1 Identify students who have been given a final grade:

Students who have been marked and given a final grade can be identified by Administrators (under the Manager role) by downloading the 'grade export' report. The report shows:

  • Student details (candidate number, student ID, names – these columns can be hidden/shown as desired)
  • Students that have submitted / not submitted
  • Students that have been given a final grade (if blank – no grade has been agreed, but marking may have taken place – please see section 1.2)

Guidance to download report.
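
If you need this split across several assessments, the exported spreadsheet can be processed directly. Below is a minimal Python sketch, assuming the report is saved as CSV and using illustrative column names ('Student ID', 'Final grade') – check these against the headers in your actual export.

```python
import csv

# Illustrative sketch: the column names are assumptions -- match
# them to the headers in your actual grade export file.
graded, ungraded = [], []

with open("grade_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        # A blank final grade means no grade has been agreed yet,
        # although first/second marking may still have taken place.
        if row.get("Final grade", "").strip():
            graded.append(row["Student ID"])
        else:
            ungraded.append(row["Student ID"])

print(f"{len(graded)} students have an agreed final grade")
print(f"{len(ungraded)} students have no final grade yet:")
for student_id in ungraded:
    print(" ", student_id)
```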


1.2 Identify students who have been marked (by a first or second marker) but not given a final grade

Administrators should add themselves as Reviewers on their assessments, which will allow them to download a grade sheet listing candidates and any marks that have been submitted by individual markers (including the name of the marker). If you have issues with adding yourself as a Reviewer, please submit a staff query form to request access.

Once you have opened the assessment in the Reviewing tab, you should select the Offline marking option and follow the steps to export the grade sheet.

The downloaded grade sheet will show you a list of candidates and any marks that have been submitted by first or second markers.

Please note that if the Grade column is empty, no grades have been finalised, and a Reviewer will need to submit a finalised grade for students who have been marked (this will allow administrators to complete the grade export to Portico in Step 3).
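
To pick out exactly the candidates who have marker marks but no finalised grade, the downloaded grade sheet can be filtered with a few lines of code. A minimal sketch, assuming a CSV export and illustrative column names ('Candidate', 'Grade', 'Marker 1', 'Marker 2'):

```python
import csv

# Sketch only: the column names are assumptions -- check them
# against the headers in your downloaded grade sheet.
MARKER_COLUMNS = ["Marker 1", "Marker 2"]

with open("grade_sheet.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        has_marks = any(row.get(col, "").strip() for col in MARKER_COLUMNS)
        finalised = bool(row.get("Grade", "").strip())
        if has_marks and not finalised:
            # Marked by a first/second marker, but a Reviewer still
            # needs to submit a finalised grade before the Portico
            # export in Step 3 can include this candidate.
            print(row["Candidate"], "- marked but not finalised")
```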

Guidance

Step 2: Allow students without grades/feedback to be marked after the original marking deadline has passed

Student grades and feedback are released on the platform under two conditions: the marking end date has arrived, and the 'Show final grades' option has been enabled.
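
Expressed as logic, a grade is visible only when both conditions hold. A tiny sketch of the rule (the field names are hypothetical, not Wiseflow's actual data model):

```python
from datetime import datetime

# Hypothetical field names -- this simply restates the release rule:
# a student sees their grade only when BOTH conditions are true.
def grades_visible(marking_end_date: datetime, show_final_grades: bool) -> bool:
    return datetime.now() >= marking_end_date and show_final_grades
```

Because the rule is evaluated per student, extending only the unmarked students' end dates (option b below) leaves everyone else's grades visible.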

To allow the remaining students to be marked, there are two methods (option b is preferable, but may be time consuming if dealing with a large number of students who have yet to be marked):

  a) An Administrator / Manager can extend the overall marking end date for all students (to allow further marking to take place). Caveat: students who already have a final grade will not be able to view it on the platform until the extended marking end date has arrived.

Guidance to extend the overall marking end date.

  b) An Administrator / Manager can extend the individual marking end dates for only those students who have not yet been marked (students who have already been marked will be able to see their final grades on the platform, while markers continue marking those who have not).

Guidance to extend individual marking end dates.

Step 3: Grade export to Portico

It is recommended to do this once, when there is a full set of grades; however, the grade export can be pushed more than once. Caveat: if an administrator pushes the grade export more than once, a 'Fail' message may appear for students whose grades were previously exported – this error message can be ignored for those students.

Guidance to complete grade export to Portico.
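
If the export is pushed again, only genuinely new failures need chasing. A hedged sketch of that bookkeeping (the result format is purely illustrative – Wiseflow does not expose this as code):

```python
# Purely illustrative result format: (student_id, status) pairs
# noted down after pushing the grade export.
export_results = [("12345678", "Success"), ("87654321", "Fail")]

# Students whose grades already went to Portico on an earlier run.
previously_exported = {"87654321"}

for student_id, status in export_results:
    if status == "Fail" and student_id in previously_exported:
        # Expected 'Fail' for an already-exported grade -- ignore it.
        continue
    if status == "Fail":
        print("Needs investigation:", student_id)
```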

Using Moodle

Student identities for Moodle assignments and Turnitin assignments cannot be revealed and then hidden again. Each activity type has a different process for partial mark entry, which is detailed in our Partial mark entry miniguide.

If you have any queries, please contact the Digital Education team with the assignment title and URL at:
digi-ed@ucl.ac.uk

Moodle STACK Quiz question type: deploying variants to avoid quiz crashing

By Aurelie, on 4 May 2022

Questions in STACK can contain randomly generated elements. A student will be given a random variant of a question generated by a pseudo-random seed.

Why deploy variants?

The tutor is strongly advised to pre-generate and “deploy” variants of a question. Not pre-generating question variants forces Moodle to generate them on the fly; for quizzes with larger numbers of participants this can cause the quiz to crash or freeze.
When a student attempts the question, they will be given a random selection from the deployed variants.
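
To make the seed idea concrete: a variant is fully determined by its seed, so the same seed always reproduces the same question values. A toy Python illustration of this behaviour (STACK itself generates the values in Maxima, so this is an analogy, not its implementation):

```python
import random

def make_variant(seed: int) -> dict:
    """Toy stand-in for a randomised question: the seed fixes the values."""
    rng = random.Random(seed)  # pseudo-random, hence reproducible
    return {"a": rng.randint(2, 9), "b": rng.randint(2, 9)}

# Pre-generated ("deployed") variants, each identified by its seed.
deployed_seeds = [86, 219, 1042]

# A student gets a random pick from the deployed variants only, so
# nothing has to be generated on the fly during the quiz itself.
student_seed = random.choice(deployed_seeds)
assert make_variant(student_seed) == make_variant(student_seed)  # deterministic
```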

Other reasons for deploying variants of a question:

  • STACK runs all the question tests on each deployed variant to establish that each variant of the question is working. This aids quality control. By using question tests, it is unlikely a student will be given a random variant which does not work correctly.
  • The tutor can decide if each deployed variant appears to be of equal difficulty. The tutor can easily delete variants they do not like.

Caution

  • If an author does not deploy any variants (not advised!) then the student gets any random variant.
  • Questions that don’t use randomisation cannot be deployed explicitly. STACK automatically detects randomisation.

How to deploy question variants

The deployment interface can be found by editing a question and clicking on question tests and deployed variants.

  1. The easiest way to find it is to preview the question.
  2. Then click the Question tests & deployed variants link in the top right corner.
  3. Click 'Deploy' if the question is not already deployed.
  4. Next to Attempt to automatically deploy the following number of variants, enter the number of variants you would like and click Go. (Depending on the question and the question note content, you may be able to deploy varying amounts; if possible, deploy over 30 – see the sketch after this list.)
    You can preview the results and either exclude variants, or return to the quiz question settings to revise the randomisation you have used in the question.
  5. Check variants as required. The page shows the list of currently deployed variants, with links to undeploy all variants or a specific one.
  6. Optionally, click 'Run all tests on all deployed variants (slow)' and check/undeploy any variants you don't want to use.
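
As flagged in step 4, automatic deployment can stop short of the number you asked for: STACK distinguishes variants by their question note, and a freshly generated variant whose note duplicates an already-deployed one is skipped. A toy simulation of that behaviour (illustrative only – the real work happens inside STACK):

```python
import random

def question_note(seed: int) -> str:
    """Toy question note: this randomisation yields only 12 distinct notes."""
    rng = random.Random(seed)
    return f"a={rng.randint(1, 12)}"

deployed = {}  # question note -> seed
attempts = 0
while len(deployed) < 30 and attempts < 1000:
    attempts += 1
    seed = random.randrange(10**6)
    note = question_note(seed)
    if note not in deployed:  # variants with duplicate notes are skipped
        deployed[note] = seed

# With only 12 possible notes, asking for 30 variants deploys just 12.
print(f"deployed {len(deployed)} distinct variants after {attempts} attempts")
```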

Limitations

There is currently no way to loop systematically over all variants and deploy them all.

Find more details and advice on using STACK question types in the miniguide M57 – STACK online assessment for mathematics and science.

Video assignments in Moodle

By Janice Kiugu, on 4 May 2020

To support alternative assessments, and in particular the use of video assignments, a new Moodle plugin that allows the submission of video/media files is now available. The plugin is accessed within a Moodle assignment, and the key additional step in ensuring students can upload video files is to select the 'online text' submission option when setting up an assignment.

In the short term (May until late summer) the Lecturecast (Echo360) video submission plugin will be installed. Following on from that, the aim is to deploy the Mediacentral video plugin prior to the start of the 20/21 academic year; this will replace the Lecturecast plugin and provide a fuller, richer integration with the UCL media platform, Mediacentral.

The reason the Lecturecast/Echo360 plugin is being installed first is that the Mediacentral plugin is more complex in its integration with Moodle. It requires significantly more testing than the Echo360 plugin and cannot be deployed in time to support the forthcoming assessment period.

A key feature of the Echo360 plugin is that it facilitates the use of the Echo360 mobile application which can be used to:

  • record and upload material from portable devices such as tablets and mobile phones.
  • view lecture materials, but only if a user has first accessed the course and recordings via their Moodle course page.

Note: The Echo360 mobile application can only be used with UCL-registered email addresses.

Support documentation and guidance are available for staff and students:

Video assignment guides

Echo 360 Mobile app guides

Case study and additional resources

New E-Book on Assessment, Feedback and Technology

By Tim Neumann, on 1 November 2017

UCL Digital Education Advisory members contributed to a new Open Access e-book that provides valuable insight into the way technology can enhance assessment and feedback. The book was launched formally on 26th October by Birkbeck College Secretary Keith Harrison, with talks from the editors Leo Havemann (Birkbeck, University of London) and Sarah Sherman (BLE Consortium), three case study authors, and event sponsor Panopto.

Havemann, Leo and Sherman, Sarah (eds.) (2017). Assessment, Feedback and Technology: Contexts and Case Studies in Bloomsbury. London: Bloomsbury Learning Environment.
View and download from: https://doi.org/10.6084/m9.figshare.5315224.v1


The Book

[Image: E-book cover]

The book is a result of a two-year project on e-assessment and feedback run by the Bloomsbury Learning Environment (BLE), a collaboration between five colleges, including the UCL Institute of Education, on issues around digital technology in Higher Education. It contains three research papers which capture snapshots of current practice, and 21 case studies from the BLE partner institutions and a little beyond, thus including practice from wider UCL.

The three papers focus on

  • the use of technology across the assessment lifecycle,
  • the roles played by administrative staff in assessment processes,
  • technology-supported assessment in distance learning.

The case studies are categorised under the headings:

  • alternative [assessment] tasks and formats,
  • students feeding back,
  • assessing at scale,
  • multimedia approaches, and
  • technical developments.

Seven of the 21 case studies were provided by UCL Digital Education colleagues Jess Gramp, Jo Stroud, Mira Vogel (2), and Tim Neumann (3), reporting on examples of blogging, group assessment, peer feedback, assessment in MOOCs, student presentations at a distance, and the UCL-developed My Feedback Report plugin for Moodle.


Why you should read the e-book

[Image: BLE e-book launch event]

As one of the speakers at the entertaining launch event, I suggested three reasons why everybody involved in Higher Education should read this book, in particular the case studies:

  1. Processes in context:
    The case studies succinctly describe assessment and feedback processes in context, so you can quickly decide whether these processes are transferable to your own situation, and you will get a basic prompt on how to implement the assessment/feedback process.
  2. Problems are highlighted:
    Some case studies don’t shy away from raising issues and difficulties, so you can judge for yourself whether these difficulties represent risks in your context, and how these risks can be managed.
  3. Practical tips:
    All case studies follow the same structure. If you are in a hurry, make sure to read at least the Take Away sections of each case study, which are full of tips and tricks, many of which apply to situations beyond the case study.

Overall, this collection of papers and case studies on assessment and feedback is easily digestible and contributes to an exchange of good practice.


View and Download the Book

The e-book is an Open Access publication, freely available from the links below.

For further information, see ble.ac.uk/ebook.html, and view author profiles at ble.ac.uk/ebook_contributors.html


About the BLE:
The Bloomsbury Learning Environment is a collaboration between Birkbeck, the London School of Hygiene and Tropical Medicine (LSHTM), the Royal Veterinary College (RVC), the School of Oriental and African Studies (SOAS), the UCL Institute of Education (IOE), and the University of London, with a focus on technologies for teaching and learning, including libraries and administration.
See www.ble.ac.uk for more information.

HeLF – Electronic Management of Assessment (EMA) – 18th June 2013, Falmer

By Martin Burrow, on 21 June 2013

Some thoughts/notes on Tuesday's meeting.


The first presentation was an overview of the HeLF EMA survey. This was a poll of HeLF members about where they thought their institutions are/will be.

(Available at http://www.slideshare.net/barbaranewland/ and the quantitative data is at http://w01.helfcms.wf.ulcc.ac.uk/projects.html)

It was noted that this type of survey only captures respondents' best guesses about what is going on – so more a confirmation of expectations than any hard data. The main point to note was that very few institutions had an institution-wide policy on e-assessment. The survey split e-assessment into component parts – e-submission, e-marking, e-feedback, and e-return – and it was generally agreed that this was a good thing, because each had its own requirements/challenges.

There was not a lot of mention of the drivers for moving to EMA, but the predominant factor was student expectations (National Student Survey results were mentioned). There was no great clamour from the staff side, and I did get the feeling this was one of those things being pushed by the techies.

People who were working on implementing EMA were doing some process mapping, both to benchmark what was going on and to inform any policies that were written. The four areas mentioned above were split into constituent steps, and these were mapped to a range of ways/technologies that could be used to complete them. This was done both for 'as it stands now' and 'where we would like to move to', generally on a school-by-school basis. The resulting data looked pretty useful and would definitely be a starting point for anyone wanting to pilot/encourage EMA.

Discussion about institutional policy revolved around the level at which it was appropriate to be set (institution, department, school, etc.), where it should sit on the restrictive/encouraging balance, how IT systems integrate with manual/paper-based systems, and – probably easiest of all – how it should deal with IT system failures: fall-back processes, extensions, etc.

There was lots of talk about the difficulties in encouraging e-marking, with plenty of evidence of markers preferring paper-based marking. My personal take is that if you enforce e-submission, e-feedback, and e-return, you can leave the marking (notice I didn't say e-marking) as a 'black box' component, up to the personal preference of individual markers – with the caveat that however they choose to mark, their output (grades, feedback, etc.) has to be entered back into the system in electronic format. Ways mentioned to encourage e-marking were the allocation of hardware (iPads, large or second PC monitor screens) and extended time periods for marking. There was no evidence that any of these had either a large or widespread effect on the uptake of e-marking.

Other points to note were that students were very keen on marking/feedback within a published rubric/schema system, and that using such a system also eased the burden on the markers' side. Some institutions (University of the Arts) were introducing cross-department, generic marking criteria that could apply to different subjects.

Also, on the wish list side, there was demand from staff and students for a tool where you could see all a student’s feedback for their whole time at the institution, across all courses and submission points.

All in all, it was a nicely informative little session, well worth attending.

Image from ralenhill on Flickr.


Santa uses Grademark.

By Domi C Sinclair, on 20 December 2012

Have you ever wondered how Santa manages to grade the naughty and nice list so fast? Well, the answer is technology! Just like many academic staff, he uses Grademark, and very efficiently at that.

The text accompanying the video, posted by Turnitin on the video sharing site Vimeo, reads:

‘Every December, millions of children around the world write letters to Santa, explaining how they’ve been good boys and girls and letting him know what they want to see under their trees come December 25th.

Over the years, the number of kids sending him letters skyrocketed. His mailbox was flooded and he found himself buried in letters, unable to respond to all of them.

One day, a little elf told Santa about Turnitin—how he could use it to accept submissions from the children, check the letters for originality, give immediate feedback, and even use rubrics to help determine if they’ve been naughty or nice. So he gave it a shot.

Share this video with your colleagues, especially the ones that look like they’ve been in an avalanche of essays.’

Watch the video and see how Santa does it.

How Santa grades millions of Christmas letters