
Digital Education team blog


Ideas and reflections from UCL's Digital Education team


Archive for the 'Our Views' Category

ABC learning design and the challenges of online

Clive Young, 26 April 2021

ABC learning design is UCL’s widely used ‘sprint’ method to help busy university and college teachers review and redesign their courses for blended modes.

Originally run as face-to-face workshops, in just 90 minutes teaching teams work together to create a visual ‘storyboard’ of activities representing the student journey. Assessment methods, programme-level themes and institutional policies can all be integrated easily. International ABC user groups soon emerged to share ideas, translations and localisations.

The Covid crisis affected ABC in two ways: one positive, one negative.

As even traditionally minded universities were forced to ‘pivot’ rapidly to online learning provision, the advantages of digital modes and the need for learning design suddenly became clearer. However, traditional on-campus ABC workshops were impossible. The community responded swiftly by experimenting with online approaches.

This academic year Clive Young and Nataša Perović, the UCL originators of ABC, created and trialled their own online version based on Google Jamboard and UCL’s Learning Designer tool.

In a popular webinar last week Clive and Nataša showed how this works and compared on-campus and online versions.

A recording of this webinar is now available via Zoom.

The presentation: ‘ABC LD and the challenges of online’ webinar (PDF, 4.5 MB)

You can also find earlier webinars on ABC.

 

Marking centrally managed exams in 2021

Steve Rowett, 22 March 2021


Please note that this page will be updated regularly.


Background

As part of UCL’s continued COVID-19 response, centrally managed examinations for 2021 will be held online. Approximately 19,000 students will undertake over 1,000 exam papers, resulting in about 48,000 submitted pieces of work. These exams are timetabled, and (for the most part) students will submit a PDF document as their response. Students have been provided with their exam timetable and guidance on creating and submitting their documents. The exception is some ‘pilot’ examinations taking place using other methods on the AssessmentUCL platform, but unless you are within that pilot group, the methods described here will apply.

The move to online 24 hour assessments that replace traditional exams creates a challenge for those who have to grade and mark the work. This blog post updates a similar post from last year with revised guidance, although the process is broadly the same.

Physical exams are being replaced with 24 hour online papers, scheduled through the exam timetabling system. Some papers will be available for students to complete for the full 24 hours; in other cases, students ‘start the clock’ themselves to take a shorter timed exam within that 24 hour window.

We start from a place of two knowns:

  • Students are submitting work as a PDF document to the AssessmentUCL platform during the 24 hour window; and
  • Final grades need to be stored in Portico, our student records system.

But in between those two endpoints, there are many different workflows by which marking can take place. These are set out in UCL’s Academic Manual but encompass a range of choices, particularly in how second marking is completed. One key difference from regular coursework is that this is not about providing feedback to students, but about supporting the marking process, the communication between markers and the required record of the marking process. At the end of the marking process, departments will need to ensure that scripts are stored securely but can be accessed by relevant staff as required, much in line with the requirements for paper versions in previous years.

There is no requirement to use a particular platform or method for marking, so there is considerable flexibility for departments to use processes that work best for them. We are suggesting a menu of options which provide a basis for departments to build on if they so choose. We are also running regular training sessions, which are listed at the foot of this document.

The menu options are:

  • Markers review the scripts and mark or annotate them using AssessmentUCL’s annotation and markup tools;
  • Departments can download PDF copies of scripts which can be annotated using PDF annotation software on a computer or tablet device;
  • Markers review the scripts on-screen using AssessmentUCL, but keep a ‘marker file’ of notes and comments on the marking process;
  • Markers print the scripts and mark them, then scan them for storage or keep them for return to the department on paper.

The rest of this post goes into these options in more detail. There is also a growing AssessmentUCL resource centre with detailed guidance on exams; it will be launched shortly and will evolve as the AssessmentUCL platform becomes more widely used across UCL.


Overview of central exam marking

This video provides a short (4 minute) introduction to the methods of marking exam papers in 2021. This video has captions available.


Marking online using AssessmentUCL’s annotation tools

AssessmentUCL provides a web-based interface where comments can be overlaid on a student’s work. A range of second marking options are available to allow comments to be shared with other markers or kept hidden from them. The central examinations team will set up all centrally managed exams based on the papers and information submitted by departments.

The video (24 minutes) below provides a walkthrough of the marking process using the annotation and grading tools in AssessmentUCL. It also shows how module leaders can download PDFs of student papers if they wish to mark using other methods or download marks if they are using AssessmentUCL. This video has captions available.

This video (7 minutes) gives more detailed guidance on ‘section-based marking’ where different markers are marking different questions across the submitted papers. This video has captions available.

Annotation using PDF documents

Where your annotation needs are more sophisticated, or you want to ‘write’ on the paper using a graphics tablet or a tablet and pencil/stylus, then this option may suit you better.

Module leads and exams liaison officers can download a ZIP file containing all the submitted work for a given exam. Unlike last year, a student’s candidate number is prefixed onto the filename, and can be included within the document itself, to make identifying the correct student much easier.

You can then use tools you already have or prefer to use to do your marking. There is more flexibility here, and we will not be able to advise and support every PDF tool available or give precise instructions for every workflow used by departments, but we give some examples here.

Marking on an iPad using OneDrive

Many staff have reported that an iPad with Apple Pencil, or an Android tablet with a stylus, is a very effective marking tool. You can use the free Microsoft OneDrive app, or Apple’s built-in Files app if you are using an iPad. Both can connect to your OneDrive account, which can be a very useful way to store your files. An example using OneDrive is shown below; the Apple Files version is very similar.

There’s further guidance from Microsoft on each individual annotation tool.

Marking on a PC or Surface Pro using Microsoft Drawboard PDF

Microsoft Drawboard PDF is a very comprehensive annotation tool, but is only available for Windows 10 and is really designed to be used with a Surface Pro or a desktop with a graphics tablet. Dewi Lewis from UCL Chemistry has produced a video illustrating the annotation tools available and how to mark a set of files easily. Drawboard PDF is available free of charge from Microsoft.

Marking on a PC, Mac or Linux machine using a PDF annotation program

Of course there are plenty of third-party tools that support annotating PDF documents. Some require payment to access the annotation facilities (or to save files that have been annotated), but two that do not are Xodo and Foxit PDF.

Things to think about with this approach:

  • Your marking process: if you use double-blind marking you might need to make two copies of the files, one for each marker. If you use check marking then a single copy will suffice.
  • You will need to ensure the files are stored securely and can be accessed by the relevant departmental staff in case of any query. You might share the exam submission files with key contacts such as teaching administrators or directors of teaching.
  • Some of the products listed above have a small charge, as would any stylus or pencil that staff would need. These cannot be supplied centrally, so you may need a process for staff claiming back the costs from departments.

Using a ‘marker file’

Students’ scripts are accessed using AssessmentUCL, which allows all the papers to be viewed online individually or downloaded in one go. A separate document is then kept (either one per script, or one overall) containing the marks and marker comments for each script. If double-blind marking is used, two such documents or sets of documents can be kept, one for each marker.


Printing scripts and marking on paper

Although we have moved to online submission this year, colleagues are still welcome to print documents and mark on paper. However there is no central printing service available for completed scripts to be printed, and this would have to be managed individually or locally by departments.


The evidence about marking online

In this video Dr Mary Richardson, Associate Professor in Educational Assessment at the IOE, gives a guide to how online marking can differ from paper-based marking and offers some tips for those new to online marking. The video has captions.


Training sessions and support

Digital Education will be running regular training sessions from the week commencing 12 April 2021. These sessions will cover marking using the AssessmentUCL platform and alternative marking methods, including using PDF documents. The sessions are relevant to markers, moderators, the Module Lead and the Exams Liaison Officer. The session will run multiple times at the following dates and times:

2pm-3pm Monday 12 April (this session will be captioned)
2pm-3pm Tuesday 13 April
11am-12pm Thursday 15 April
2pm-3pm Monday 19 April
2pm-3pm Wednesday 21 April (this session will be captioned)
11am-12pm Friday 23 April
2pm-3pm Monday 26 April
2pm-3pm Tuesday 27 April (this session will be captioned)
11am-12pm Thursday 29 April
11am-12pm Tuesday 4 May
3pm-4pm Wednesday 5 May
11am-12pm Friday 7 May (this session will be captioned)
2pm-3pm Monday 10 May
11am-12pm Wednesday 12 May
2pm-3pm Monday 17 May
2pm-3pm Thursday 20 May
2pm-3pm Tuesday 26 May
2pm-3pm Thursday 3 June

There is no need to book for these sessions; you can just join on the day (UCL login required).

There is a recording of one of these training sessions that you can watch (UCL staff login required) and the slides used in the training session which can be downloaded.

There are also daily drop-ins that run from 3pm-4pm every weekday (except bank holidays). You can find the link for these and join immediately.

You can of course contact UCL Digital Education for further help and support.

Once more: Accessible documents from LaTeX

Jim R Tyson, 7 March 2021

This blog post outlines some changes to the advice I gave previously on how to produce accessible documents using LaTeX. The changes concern the production of PDFs for use digitally, and conversion from LaTeX to HTML.

ISD general guidance on producing accessible materials on its Accessibility Fundamentals pages still holds.

In that previous blog entry, I included as an aim to ‘get as close as possible to producing ‘tagged PDF’ or PDF/UA documents using LaTeX’. This is not currently doable. I replace it with the aim to ‘get as close as possible to producing reasonably accessible documents using LaTeX’. Given the long-standing difficulty of meeting accessibility requirements in PDF from LaTeX source, the advice must be to produce HTML documents when accessibility is required.

In particular, I do not now recommend using the LaTeX package accessibility.sty to create tagged documents. Development of the package has halted and the author no longer supports its use. If you are interested in the effort to produce tagged PDF from LaTeX source, you should read this article from the TeX Users Group newsletter, TUGboat. The author of the package mentioned in the article himself believes it is not yet ready for production use. But, he writes, “with the tagpdf package it is already possible for adventurous users with a bit of knowledge in TeX programming to tag quite large documents”. I am not adventurous or knowledgeable enough to rise to that challenge.

With respect to mathematical content, I had previously recommended Pandoc which can convert to HTML with machine readable mathematical content. I have since looked more closely at this issue and I now prefer to use tex4ht which has some useful features, including the ability to include the LaTeX code for mathematical content in a page. It is also the package recommended by TUG. There is good documentation on the TUG website. However, tex4ht does not produce Microsoft Word documents from LaTeX, and so Pandoc is still the best tool if that is required. And Pandoc does still do the job if you don’t need extra features.

In the light of these and other issues, I have switched completely to using RMarkdown. This allows me to mix lightweight markup, LaTeX mathematical code and HTML in one document. Using HTML to insert graphics allows me to include alt text, which is not otherwise possible.
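
As a sketch of that mixed-markup workflow (the file name and figure here are invented for illustration), an RMarkdown source might look like:

```markdown
---
title: "Accessible handout"
output: html_document
---

Inline LaTeX maths like $\int_0^1 x^2\,dx = \tfrac{1}{3}$ converts to
machine-readable maths in the HTML output.

<!-- Raw HTML lets us attach alt text to an image, which a plain
     \includegraphics in LaTeX cannot do -->
<img src="figure1.png" alt="Scatter plot showing a positive linear trend">
```

Rendering this with knitr/Pandoc produces HTML in which both the mathematics and the image description are available to assistive technology.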

There is still, to my knowledge, no solution for presentations made with Beamer or similar packages. Whereas I previously suggested using the package pdfcomment to annotate images on slides made with LaTeX, I no longer do so, since I have discovered that the comments are not well understood by screen reader software.

The current situation means that we can do very little to support colleagues with accessibility issues in LaTeX workflows and especially with respect to presentations and providing alternative text for images, beyond the advice we have already provided.

Call for participants: research study investigating student advisor use of learning analytics dashboards

Samantha Ahern, 6 January 2021

Participants required for the following study: Uses of learning analytics dashboards / visualisations in student advising

This study has been approved by the UCL Research Ethics Committee (study 8673/006) and is registered as social research under reference No Z6364106/2020/11/19, in line with UCL’s Data Protection Policy.

Learning analytics implementations are predominantly designed for use by those supporting students, so there is a need to connect the literature on advising and tutoring with research into the impacts of learning analytics on student behaviour, in terms of both learning behaviours and welfare (wellbeing and mental health).

This research aims to provide an overview of what is currently perceived as best practice in advising and tutoring. In this context, we will critically review the current literature on dashboard/visualisation design and investigate dashboard use by student advisors, with the aim of identifying any synergies and conflicts that exist, and of providing recommendations on how to improve dashboard/visualisation design.

The study is looking to recruit HEI student advisors (including personal tutors) to share their experiences of using learning analytics dashboards/visualisations. This will initially be via an online survey.

For details of the study please view the study’s Information Sheet.

If you would like to participate in the study please visit the online survey.

The project lead is: Samantha Ahern, Digital Education – Information Services Division

 

 

Kindness, community and pedagogies of care

Samantha Ahern, 8 December 2020

Research has shown that kindness has a positive effect on the giver to varying degrees. But, how do we embed this into our communities and develop pedagogies of care? How can we use kindness as a means of combating growing social isolation and loneliness?

 

In this context I am not referring to random acts of kindness, but to relational and radical kindness. Relational kindness enables deep, meaningful connections between individuals by recognising the vulnerabilities and complexities of relationships. Radical kindness perceives kindness as a collective and state-enabled response to inequality. It requires connection across differences and a recognition that some people’s needs are greater because of structural disadvantage. In an educational context, the state could be akin to an individual institution, department or programme.

The key theme is relationships and communities. What is needed to facilitate kindness? How can we create kind spaces, and how do we create informal opportunities? Can we create informal spaces for students to just “be”? Conversations can be very powerful in bringing people together, but they rely on people feeling comfortable and on agenda-free, neutral spaces.

In addition to encouraging individual kindness, we need to embed kindness into our own behaviours: in our pedagogy, in teaching departments and across our institutions. In the past I have written about digital wellbeing and compassionate pedagogy. Both of these have a role to play in relational and radical kindness, in addition to resources such as Equity Unbound’s Community Building Activities.

However, this is only part of the picture. We also need to consider our culture: how to make people more important than processes, how to build a culture of trust, and how to listen and make meaningful connections.

There have been some fantastic projects from the Kindness Innovation Network on facilitating kindness in communities and in their interactions with local authorities. How can we translate these lessons to our learning communities?

Compassionate pedagogy is a good start, but we also need to provide students with spaces to be, trusting them, giving them voice and truly enabling them to co-construct their learning. Teach to transgress.


Randomising Questions and Variables with Moodle Quiz

Eliot Hoving, 8 December 2020

One of the strengths of Moodle Quizzes is the ability to randomise questions. This feature can help deter student collusion.

There are several ways to randomise questions in a Moodle Quiz, which can be combined or used separately. Simple examples are provided here but more complex questions and variables can be created. 

Randomising the Question response options

It’s possible to shuffle the response options within many question types, including Multiple Choice and Matching questions. When setting up a Quiz, simply look under Question behaviour and change Shuffle within questions to Yes.

Randomising the order of Questions

You can also randomise the order of questions in a Quiz. Click Edit Quiz Questions in your Quiz, then tick the Shuffle box at the top of the page. Questions will now appear in a random order for each student.

Randomising the Questions

It’s possible to add random questions from pre-defined question Categories. Think of Categories as containers of Quiz questions. They can be based on topic area, e.g. ‘Dosage’, ‘Pharmacokinetics’, ‘Pharmacology’, ‘Patient Consultation’, or on questions for a specific assessment, e.g. ‘Exam questions container 1’, ‘Exam questions container 2’, ‘Exam questions container 3’.

The first step is to create your Categories.

Then when adding questions to your Quiz, select add a random question and choose the Category. You can also choose how many random questions to add from the Category. 

Under the Add a question option in Moodle Quiz, you can select Add a random question.

For example, if you had a quiz of 10 questions, and you want to give students a random question out of 3 options for each question, you would need 10 Categories, each holding 3 questions e.g. ‘Exam Q1 container’, ‘Exam Q2 container’ … ‘Exam Q10 container’. 

Alternatively, if you want a quiz with 10 questions from ‘Pharmacokinetics’ and 10 from ‘Pharmacology’, you could create the two Categories with their questions, then go to your Quiz, add a random question, select the ‘Pharmacokinetics’ Category and choose 10 questions. Repeat for ‘Pharmacology’. You now have a 20-question quiz made up of 50% Pharmacokinetics and 50% Pharmacology questions.
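
As an illustration only (the question names are invented and this is not Moodle code), the two-Category recipe above amounts to sampling without replacement from each question bank:

```python
import random

# Stand-in question banks for the two Categories described above
pharmacokinetics = [f"PK-Q{i:02d}" for i in range(1, 21)]
pharmacology = [f"PH-Q{i:02d}" for i in range(1, 21)]

# "Add a random question" ten times from each Category for one attempt:
# sampling without replacement, so no student sees a duplicate question
paper = random.sample(pharmacokinetics, 10) + random.sample(pharmacology, 10)
print(len(paper))  # 20 questions, half from each bank
```

Each student attempt gets a different draw, which is what makes the approach useful for deterring collusion.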

After saving your random question(s), you can add further random questions, or add regular questions that will appear for all students. Simply add a question from the question bank as normal.

Be aware that randomising questions will reduce the reliability of your Moodle Quiz statistics. For example, the discrimination index will be calculated on the Quiz question overall (e.g. Q2), not on each variation of the question that may have been randomly selected (i.e. all the questions in the Exam Q2 container). Each question variation will have fewer attempts than if the question had been given to all students, so any analytics based on these fewer attempts will be less accurate.

Randomising variables within Questions

In addition to randomising questions, certain question types can have randomised variables within them. 

The STACK question type supports complex mathematical questions, which can include random variables. For example you could set some variables, a and b, as follows:

a = rand(6)  (where rand(6) takes a random value from the list [0,1,2,3,4,5,6]).

b = rand(2)  (where rand(2) takes some random value from the list [0,1,2]).

Variables can then be used within questions, so students could be asked to integrate a×x^b, which thanks to my random variables will generate 21 different questions for my students, e.g. integrate 0×x^0, 0×x, 0×x^2, x^0, x, x^2, 2x^0, 2x, 2x^2, … 5x^0, 5x, 5x^2, 6x^0, 6x, 6x^2.

Random variants can be generated, tested, and excluded if they are inappropriate; in the above case I might exclude a = 0, as the question equation would evaluate to 0, whereas I want students to integrate a non-zero algebraic expression.
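
The arithmetic behind those 21 variants can be checked with a short sketch (this mimics the description of rand(n) given above, not STACK’s actual implementation):

```python
import itertools
import random

# The description above treats rand(n) as drawing uniformly from [0, 1, ..., n]
def rand(n):
    return random.choice(range(n + 1))

# Enumerate every (a, b) pair for a = rand(6), b = rand(2):
# each pair is one variant of "integrate a*x^b"
variants = list(itertools.product(range(7), range(3)))

# Excluding a = 0 drops the degenerate questions that evaluate to 0
usable = [(a, b) for a, b in variants if a != 0]
print(len(variants), len(usable))  # 21 variants, 18 after excluding a = 0
```

Seven values of a times three values of b gives the 21 questions mentioned above; dropping a = 0 leaves 18 usable variants.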

The Calculated question type also supports randomised variables, as well as basic calculation questions for maths and scientific assessment. Calculated questions can be free entry or multiple choice. For example, you could ask students to calculate the area of a rectangle. The width and height would be set to wild card values; let’s call them

{w} for width, and

{h} for height.

The answer is always width × height or {w} × {h} regardless of the values of {w} and {h}. Moodle calls this the answer formula.

The tutor then sets the possible values of {w} and {h} by creating a dataset of possible values for Moodle to randomly select from. To create your dataset, you first define your wild card values, e.g. {w} will take some value between 1 and 10, and {h} some value from 10 to 20. You can then ask Moodle to generate sets of your variables, e.g. 10, 50 or 100 possible combinations of {w} and {h} based on the conditions you define. For example, given the conditions above, I could generate the following 3 sets:

Set 1: {w} = 1, {h} = 14 

Set 2: {w} = 6.2, {h} = 19.3 

Set 3: {w} = 9.1, {h} = 11 

Creating a dataset can be somewhat confusing, so make sure you leave enough time to read the Calculated question type documentation and test it out. Once complete, Moodle can provide students with potentially hundreds of random values of {w} and {h} based on your dataset. Using the answer formula you provide, Moodle can evaluate the student’s response and automatically grade it regardless of which random values they were given.
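
As a rough sketch of how such a dataset behaves (illustrative Python, not Moodle’s implementation): each generated set fixes the wild cards, and the answer formula is evaluated against that set.

```python
import random

# Each set fixes {w} and {h}; the answer formula w * h is evaluated
# per set, so any student's variant can be graded automatically
def make_dataset(n_sets):
    sets = []
    for _ in range(n_sets):
        w = round(random.uniform(1, 10), 1)   # {w}: some value between 1 and 10
        h = round(random.uniform(10, 20), 1)  # {h}: some value from 10 to 20
        sets.append({"w": w, "h": h, "answer": round(w * h, 2)})
    return sets

dataset = make_dataset(100)  # e.g. 100 possible combinations of {w} and {h}
```

The three example sets above are exactly this kind of draw, with the answer column filled in by the formula rather than by hand.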

Try a Randomised Quiz

To learn more, take an example randomised quiz on the Marvellous Moodle Examples course.

Speak to a Learning Technologist for further support

Contact Digital Education at digi-ed@ucl.ac.uk for advice and further support.