
Digital Education team blog


Ideas and reflections from UCL's Digital Education team


Managing mark release during the marking and assessment boycott

By Marieke Guy and Zaman Wong, on 13 June 2023

This post gives useful information for admin staff on how to manage mark release and mark upload during the Marking and Assessment boycott.

Using AssessmentUCL/Wiseflow

Step 1: Identifying students who have and have not been marked

1.1 Identify students who have been given a final grade:

Students who have been marked and given a final grade can be identified by Administrators (under the Manager role) by downloading the ‘grade export’ report, which shows:

  • Student details (candidate number, student ID, names – these columns can be hidden/shown as desired)
  • Students that have submitted / not submitted
  • Students that have been given a final grade (if blank, no grade has been agreed, but marking may still have taken place – please see section 1.2)

Guidance to download report.
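If you are handling many assessments, the downloaded grade export can be checked with a short script rather than by eye. A minimal sketch using only the Python standard library; the column headings (‘Candidate number’, ‘Final grade’) are assumptions for illustration, so match them to the actual headers in your exported file:

```python
import csv
import io

# Sample rows standing in for a downloaded grade export CSV.
# Column names here are assumptions -- check your actual export.
sample = """Candidate number,Submitted,Final grade
A001,Yes,65
A002,Yes,
A003,No,
"""

def split_by_final_grade(csv_text):
    """Return (graded, ungraded) lists of candidate numbers.

    A blank 'Final grade' means no grade has been agreed yet,
    though marking may still have taken place (see section 1.2).
    """
    graded, ungraded = [], []
    for row in csv.DictReader(io.StringIO(csv_text)):
        if row["Final grade"].strip():
            graded.append(row["Candidate number"])
        else:
            ungraded.append(row["Candidate number"])
    return graded, ungraded

graded, ungraded = split_by_final_grade(sample)
print("Final grade agreed:", graded)
print("No final grade yet:", ungraded)
```

To run it against a real file, replace `io.StringIO(csv_text)` with an open file handle on the downloaded report.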


1.2 Identify students that have been marked (by a first or second marker) but not been given a final grade

Administrators should add themselves as Reviewers on their assessments, which will allow them to download a grade sheet displaying a list of candidates and any marks that have been submitted by individual markers (including the name of the marker). If you have issues with adding yourself as a Reviewer, please submit a staff query form to request access.

Once you have opened the assessment in the Reviewing tab, you should select the Offline marking option and follow the steps to export the grade sheet.

The downloaded grade sheet will show you a list of candidates and any marks that have been submitted by first or second markers:


Please note that if the Grade column is empty, no grades have been finalised; a Reviewer will need to submit a finalised grade for students who have been marked. This will allow administrators to complete the grade export to Portico in Step 3.

Guidance
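The same kind of check can be applied to the exported grade sheet: candidates who have a mark from a first or second marker but an empty Grade column are the ones still awaiting a finalised grade from a Reviewer. A minimal sketch; the column headings are hypothetical, so adjust them to match your exported sheet:

```python
import csv
import io

# Stand-in for the exported grade sheet; column names are assumptions.
sample = """Candidate,First marker mark,Second marker mark,Grade
B001,62,64,63
B002,58,,
B003,,,
"""

def needs_final_grade(csv_text):
    """Candidates marked by at least one marker but with no finalised grade."""
    pending = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        has_mark = (row["First marker mark"].strip()
                    or row["Second marker mark"].strip())
        if has_mark and not row["Grade"].strip():
            pending.append(row["Candidate"])
    return pending

# B001 is finalised, B003 is unmarked; only B002 needs a Reviewer grade.
print(needs_final_grade(sample))
```

Candidates returned by this check are those for whom a Reviewer must submit a finalised grade before the Portico export in Step 3.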

Step 2: Allow students without grades/feedback to be marked after the original marking deadline has passed

Student grades and feedback are released on the platform under two conditions: the marking end date has passed, and the ‘Show final grades’ option has been enabled.

To allow remaining students to be marked, there are two methods (option b is preferable but may be time-consuming if dealing with a large number of students that have yet to be marked):

  a) An Administrator/Manager can extend the overall marking end date for all students (to allow further marking to take place). Caveat: students who already have a final grade will not be able to view it on the platform until the extended marking end date has arrived.

Guidance to Extend overall marking end-date.

  b) An Administrator/Manager can extend the individual marking end dates for only those students who have not yet been marked. This means students who have already been marked will be able to see their final grades on the platform, while markers continue marking those who have not.

Guidance to extend individual marking end dates.

Step 3. Grade export to Portico

It is recommended to do this once, when there is a full set of grades; however, the grade export button can be pushed more than once. Caveat: if an administrator pushes the grade export more than once, you may encounter a ‘Fail’ message for students whose grades were previously exported – this error message can be ignored for those students.

Guidance to complete grade export to Portico.

Using Moodle

Student identities for Moodle assignments and Turnitin assignments cannot be revealed and then hidden again. Each activity type has a different process for partial mark release, which is detailed in our Partial mark entry miniguide.

If you have any queries, please contact the Digital Education team with the assignment title and URL at:
digi-ed@ucl.ac.uk

HeLF – Electronic Management of Assessment (EMA) – 18th June 2013, Falmer

By Martin Burrow, on 21 June 2013

Some thoughts/notes on Tuesday’s meeting.

 

The first presentation was an overview of the HeLF EMA survey. This was a poll of HeLF members about where they thought their institutions are/will be.

(Available at http://www.slideshare.net/barbaranewland/ and the quantitative data is at http://w01.helfcms.wf.ulcc.ac.uk/projects.html)

It was noted that this type of survey only captures respondents’ ‘best guesses’ about what is going on – so more a confirmation of expectations than any hard data. The main point to note was that very few institutions had an institution-wide policy on e-assessment. The survey split e-assessment into component parts – e-submission, e-marking, e-feedback, and e-return – and it was generally agreed that this was a good thing, because they all have their own requirements and challenges.

There was not a lot of mention about the drivers for moving more to EMA, but the predominant factor was student expectations (National Student Survey results mentioned). No great clamour from the staff side and I did get the feeling this was one of those things being pushed by the techies.

People who were doing work on implementing EMA were doing some process mapping, both to benchmark what was going on and to inform any policies that were written. The four areas mentioned above were split into constituent steps, and these were mapped to a range of ways/technologies that could be used to complete them. This was done both for ‘as it stands now’ and for ‘where we would like to move to’. The process mapping was generally done on a school-by-school basis. The resulting data looked pretty useful, and this would definitely be a starting point for anyone wanting to pilot/encourage EMA.

Discussion about institutional policy revolved around the level at which it was appropriate to be set (institution, department, school, etc.); how it should sit on the restrictive/encouraging balance; how IT systems integrate with manual/paper-based systems; and, probably easiest of all, how it should deal with IT system failures – fall-back processes, extensions, etc.

There was lots of talk about the difficulties in encouraging e-marking, with lots of evidence of markers preferring paper-based marking. My personal take on it is that if you enforce e-submission, e-feedback, and e-return, you can leave the marking (notice here I didn’t say e-marking) as a ‘black box’ component, up to the personal preference of individual markers – with the caveat that however they choose to mark, their output (grades, feedback, etc.) has to be entered back into the system in electronic format. Ways mentioned to encourage e-marking were the allocation of hardware (iPads, large or second PC monitor screens) and extended time periods for marking. There was no evidence that any of these had either a large or widespread effect on the uptake of e-marking.

Other points to note were that students were very keen on marking/feedback within a published rubric/schema system, and that using such a system also eased the burden on the markers’ side. Some institutions (University of the Arts) were introducing cross-department, generic marking criteria that could apply to different subjects.

Also, on the wish list side, there was demand from staff and students for a tool where you could see all a student’s feedback for their whole time at the institution, across all courses and submission points.

All in all, it was a nicely informative little session, well worth being present at.

image from ralenhill Flickr