
Digital Education team blog


Ideas and reflections from UCL's Digital Education team


Video assignments in Moodle

Janice Kiugu, 4 May 2020

To support alternative assessments, and in particular video assignments, a new Moodle plugin that allows the submission of video/media files is now available. The plugin is accessed within a Moodle assignment; the key additional step to ensure students can upload video files is to select the ‘online text’ submission option when setting up the assignment.

In the short term (May until late summer) the Lecturecast (Echo360) video submission plugin will be installed. Following that, the aim is to deploy the Mediacentral video plugin before the start of the 20/21 academic year; it will replace the Lecturecast plugin and provide a fuller, richer integration with the UCL media platform, Mediacentral.

The reason the Lecturecast/Echo360 plugin is being installed first is that the Mediacentral plugin is more complex in its integration with Moodle. It requires significantly more testing than the Echo360 plugin and cannot be deployed in time to support the forthcoming assessment period.

A key feature of the Echo360 plugin is that it facilitates the use of the Echo360 mobile application which can be used to:

  • record and upload material from portable devices such as tablets and mobile phones.
  • view lecture materials, but only if the user has first accessed the course and recordings via their Moodle course page.

Note: The Echo360 mobile application can only be used with UCL-registered email addresses.

Support documentation and guidance are available for staff and students:

Video assignment guides

Echo 360 Mobile app guides

Case study and additional resources

New E-Book on Assessment, Feedback and Technology

Tim Neumann, 1 November 2017

UCL Digital Education Advisory members contributed to a new Open Access e-book that provides valuable insight into the way technology can enhance assessment and feedback. The book was launched formally on 26th October by Birkbeck College Secretary Keith Harrison, with talks from the editors Leo Havemann (Birkbeck, University of London) and Sarah Sherman (BLE Consortium), three case study authors, and event sponsor Panopto.

Havemann, Leo and Sherman, Sarah (eds.) (2017): Assessment, Feedback and Technology: Contexts and Case Studies in Bloomsbury. London: Bloomsbury Learning Environment.
View and download from: https://doi.org/10.6084/m9.figshare.5315224.v1

 

The Book

[Image: e-book cover]

The book is a result of a two-year project on e-assessment and feedback run by the Bloomsbury Learning Environment (BLE), a collaboration between five colleges, including the UCL Institute of Education, on issues around digital technology in Higher Education. It contains three research papers which capture snapshots of current practice, and 21 case studies from the BLE partner institutions and a little beyond, thus including practice from wider UCL.

The three papers focus on

  • the use of technology across the assessment lifecycle,
  • the roles played by administrative staff in assessment processes,
  • technology-supported assessment in distance learning.

The case studies are categorised under the headings:

  • alternative [assessment] tasks and formats,
  • students feeding back,
  • assessing at scale,
  • multimedia approaches, and
  • technical developments.

Seven of the 21 case studies were provided by UCL Digital Education colleagues Jess Gramp, Jo Stroud, Mira Vogel (2), and Tim Neumann (3), reporting on examples of blogging, group assessment, peer feedback, assessment in MOOCs, student presentations at a distance, and the UCL-developed My Feedback Report plugin for Moodle.

 

Why you should read the e-book

[Image: BLE e-book launch event]

As one of the speakers at the entertaining launch event, I suggested three reasons why everybody involved in Higher Education should read this book, in particular the case studies:

  1. Processes in context:
    The case studies succinctly describe assessment and feedback processes in context, so you can quickly decide whether these processes are transferable to your own situation, and you will get a basic prompt on how to implement the assessment/feedback process.
  2. Problems are highlighted:
    Some case studies don’t shy away from raising issues and difficulties, so you can judge for yourself whether these difficulties represent risks in your context, and how these risks can be managed.
  3. Practical tips:
    All case studies follow the same structure. If you are in a hurry, make sure to read at least the Take Away sections of each case study, which are full of tips and tricks, many of which apply to situations beyond the case study.

Overall, this collection of papers and case studies on assessment and feedback is easily digestible and contributes to an exchange of good practice.

 

View and Download the Book

The e-book is an Open Access publication freely available below.

For further information, see ble.ac.uk/ebook.html, and view author profiles at ble.ac.uk/ebook_contributors.html

 

About the BLE:
The Bloomsbury Learning Environment is a collaboration between Birkbeck, London School of Hygiene and Tropical Medicine (LSHTM), Royal Veterinary College (RVC), School of Oriental and African Studies (SOAS), UCL Institute of Education (IOE), and the University of London with a focus on technologies for teaching and learning, including libraries and administration.
See www.ble.ac.uk for more information.

HeLF – Electronic Management of Assessment (EMA) – 18th June 2013, Falmer

Martin Burrow, 21 June 2013

Some thoughts and notes on Tuesday's meeting.

 

The first presentation was an overview of the HeLF EMA survey, a poll of HeLF members about where they thought their institutions are, or will be.

(Available at http://www.slideshare.net/barbaranewland/ and the quantitative data is at http://w01.helfcms.wf.ulcc.ac.uk/projects.html)

It was noted that this type of survey only captures respondents' ‘best guesses’ about what is going on – more a confirmation of expectations than any hard data. The main point to note was that very few institutions had an institution-wide policy on e-assessment. The survey split e-assessment into component parts – e-submission, e-marking, e-feedback, and e-return – and it was generally agreed that this was a good thing because each has its own requirements and challenges.

There was not much mention of the drivers for moving to EMA, but the predominant factor was student expectations (National Student Survey results were mentioned). There was no great clamour from the staff side, and I did get the feeling this was one of those things being pushed by the techies.

Those working on implementing EMA were doing process mapping, to allow them to benchmark what was going on and to inform any policies being written. The four areas mentioned above were split into constituent steps, and these were mapped to the range of ways/technologies that could be used to complete them, both for ‘as it stands now’ and for ‘where we would like to move to’. This process mapping was generally done on a school-by-school basis. The resulting data looked pretty useful, and would definitely be a starting point for anyone wanting to pilot or encourage EMA.

Discussion about institutional policy revolved around the level at which it is appropriate to set it (institution, department, school, etc.), where it should sit on the restrictive/encouraging balance, how IT systems integrate with manual/paper-based systems, and, probably easiest of all, how it should deal with IT system failures (fall-back processes, extensions, etc.).

There was lots of talk about the difficulties in encouraging e-marking, with plenty of evidence of markers preferring paper-based marking. My personal take is that if you enforce e-submission, e-feedback, and e-return, you can leave the marking (notice I didn't say e-marking) as a ‘black box’ component, up to the personal preference of individual markers – with the caveat that however they choose to mark, their output (grades, feedback, etc.) has to be entered back into the system in electronic format. Ways mentioned to encourage e-marking were the allocation of hardware (iPads, large or second PC monitors) and extended time periods for marking. There was no evidence that any of these had a large or widespread effect on the uptake of e-marking.

Other points to note were that students were very keen on marking/feedback within a published rubric/schema system, and that using such a system also eased the burden on the markers' side. Some institutions (e.g. University of the Arts) were introducing cross-department, generic marking criteria that could apply to different subjects.

Also, on the wish list side, there was demand from staff and students for a tool where you could see all a student’s feedback for their whole time at the institution, across all courses and submission points.

All in all, it was a nicely informative little session, well worth attending.

Image: ‘Desk with paper’ from ralenhill on Flickr

Santa uses Grademark.

Domi C Sinclair, 20 December 2012

Have you ever wondered how Santa manages to grade the naughty-and-nice list so fast? Well, the answer is technology! Just like many academic staff, he uses Grademark, and very efficiently at that.

The text accompanying the video, posted by Turnitin on the video sharing site Vimeo, reads:

‘Every December, millions of children around the world write letters to Santa, explaining how they’ve been good boys and girls and letting him know what they want to see under their trees come December 25th.

Over the years, the number of kids sending him letters skyrocket. His mailbox was flooded and he found himself buried in letters, unable to respond to all of them.

One day, a little elf told Santa about Turnitin—how he could use it to accept submissions from the children, check the letters for originality, give immediate feedback, and even use rubrics to help determine if they’ve been naughty or nice. So he gave it a shot.

Share this video with your colleagues, especially the ones that look like they’ve been in an avalanche of essays.’

Watch the video and see how Santa does it.

How Santa grades millions of Christmas letters

Certainty Based Marking Webinar

6 April 2011

Emeritus Professor Tony Gardner-Medwin gave a Webinar presentation on Wednesday 6th April about using Certainty Based Marking (CBM) for both formative self-tests and summative e-exams.

This type of assessment helps students understand which areas of a topic they really do know and which areas they need to work on, by asking them to choose, on a 3-point scale, how confident they are that their answer is correct.

Questions they answer correctly and with high certainty score the most points, while those they answer correctly with low certainty score fewer points. Questions they get wrong are negatively marked in a similar fashion.

The scoring method is best demonstrated in the following table:

Certainty level                 No reply   C=1   C=2   C=3
Mark if correct                    0        1     2     3
Penalty if incorrect (T/F Q)       0        0    -2    -6
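As an illustration only (not from the original post), the true/false scoring rule in the table can be sketched as a small Python function; the dictionaries simply encode the table's values.

```python
# Sketch of the CBM scoring rule for true/false questions, per the table:
# marks 1/2/3 if correct, penalties 0/-2/-6 if incorrect, 0 for no reply.

MARK_IF_CORRECT = {1: 1, 2: 2, 3: 3}
PENALTY_IF_INCORRECT = {1: 0, 2: -2, 3: -6}

def cbm_score(correct, certainty):
    """Return the CBM mark for one answer.

    certainty is 1..3, or None for no reply; correct is a bool.
    """
    if certainty is None:
        return 0  # no reply scores nothing, risks nothing
    if correct:
        return MARK_IF_CORRECT[certainty]
    return PENALTY_IF_INCORRECT[certainty]

# A confident correct answer earns the most; a confident wrong one costs most.
print(cbm_score(True, 3))   # 3
print(cbm_score(False, 3))  # -6
```

Note how the asymmetric penalties reward honest low-certainty answers: guessing with C=1 can never lose marks, while bluffing with C=3 is expensive.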

When used formatively, students can review their marks and focus on reviewing the material where they were either unsure of an answer or confident of their answer, but incorrect.

Certainty Based Marking can also be used for exams. Evidence has shown that exam results evaluated using CBM closely match (tending to be slightly higher than) the scores the students would have received under traditional correct/incorrect marking. This is easy to compare, because each CBM exam result can be marked in both ways.
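To illustrate marking the same exam both ways, here is a minimal sketch (the sample responses are invented for illustration) that totals one set of answers conventionally and with CBM:

```python
# Score one set of responses two ways: conventionally (1 mark per correct
# answer) and with CBM (mark = certainty if correct, table penalty if not).
# Penalty values follow the T/F table earlier in the post.

PENALTY = {1: 0, 2: -2, 3: -6}

# Each response: (was the answer correct, certainty 1..3 or None for no reply)
responses = [(True, 3), (True, 2), (False, 1), (True, 1), (False, 3)]

conventional = sum(1 for correct, _ in responses if correct)
cbm = sum(
    (c if correct else PENALTY[c])
    for correct, c in responses
    if c is not None
)

print(conventional)  # 3
print(cbm)           # 3 + 2 + 0 + 1 - 6 = 0
```

Here the student gets 3/5 conventionally but only 0 under CBM, because one confident wrong answer wipes out the gains from cautious correct ones.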

Find out more about Certainty Based Marking here: http://www.ucl.ac.uk/lapt

The CBM Webinar presentation will shortly be available here: http://transformingassessment.com

UPDATE: The Webinar and related materials are now available from here: http://www.ucl.ac.uk/~ucgbarg/pubteach.htm

Effective Assessment in a Digital Age

Jessica Gramp, 9 February 2011

On February 3rd, practitioners from universities in and around the region met in Birmingham to discuss how technology can be used to promote effective learning, looking at good practice in assessment and feedback.

The workshops were based around the principles from the publication Effective Assessment in a Digital Age: A guide to technology-enhanced assessment and feedback.

Some of the ideas that emerged from the workshop activities are summarised here:

  • Set an assessment where group members contribute to a forum as they collect research towards a final outcome.
  • Set an assessment where individuals produce a poster illustrating the information they have sourced in their research.
  • Set formative assessments for complex questions that the majority of students are likely to fail towards the beginning of a course, so they become familiar with learning from their mistakes in a safe and productive way.
  • Review students' answers to assessments to see which questions many students got wrong, and support them in understanding why and how to reach the correct answer.
  • Develop formative assessments that reveal hints to the correct answer and allow students another attempt if they get it wrong initially; when they do get it right (or wrong a number of times), explain the correct answer in detail.
  • Use text-matching technology to offer free-text, short-answer questions, rather than the commonly used multiple-choice question type. Note: doing this effectively can take time and requires large quantities of real student answers to mark accurately, so it may only be viable for large cohorts of students.
  • Use various assessment methods to cater for different learning styles, engage students, and allow those who have strengths in some areas to take advantage of them.
  • Assess frequently throughout the term so tutors can evaluate students' progress and steer them back on track before the final submission. This also allows tutors to distribute the time they spend marking and providing feedback across the term, rather than concentrating it all at the end.

The output from the workshops and other useful materials are available here: http://bit.ly/jiscassess