Digital Education team blog

Ideas and reflections from UCL's Digital Education team

Moodle-SITS Marks Transfer Pilot Update

By Kerry, on 9 February 2024

As some of you may be aware, a new Moodle integration designed and developed by the DLE Team is due to be released in the spring to improve the process for transferring marks from Moodle to Portico. It is called the Moodle-SITS Marks Transfer Integration, and we are currently trialling it with around 40 course administrators across the institution.

The pilot kicked off on 8 January and will run until 29 February 2024. Its purpose is to test the Moodle-SITS Marks Transfer Integration using the newly designed Marks Transfer Wizard and the marks transfer functionality developed following the Phase 1 pilot, which took place with a very small group of course administrators at the end of last year. The wizard provides a more streamlined experience for end users by putting the core assessment component information at the centre of the tool, from where it can be mapped to a selection of Moodle assessments.

Pilot Phase 2 is the last pilot phase before an initial MVP (Minimum Viable Product) release into UCL Moodle Production in late March 2024. Currently, users can take advantage of the integration if the following criteria are met:

  1. They have used the Portico enrolment block to create a mapping with a Module Delivery on their Moodle course.
  2. Either of the following assessment scenarios is true:
    1. Only one Moodle assessment activity is being linked to one assessment component in SITS.
    2. Only one Moodle assessment activity is being linked to multiple assessment components in SITS.
  3. An assessment component exists in SITS to map against.
  4. The Moodle assessment marks are numerical, on a 0-100 scale.
  5. The assessment component in SITS is compatible with SITS Marking Schemes and SITS Assessment Types.
  6. For exam assessments, the SITS assessment component is the exam room code EXAMMDLE.
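
To make these criteria concrete, here is a minimal sketch of how they combine into a single eligibility check. It is written in Python purely for illustration: the real integration is a Moodle plugin, and every name below is hypothetical rather than part of its actual code. Criteria 5 and 6 (marking scheme and assessment type compatibility, and the EXAMMDLE exam room code) are simplified away.

    from dataclasses import dataclass

    # Hypothetical sketch only; not the plugin's actual API.
    SUPPORTED_ACTIVITY_TYPES = {"assign", "quiz", "turnitin"}  # Turnitin: non-multipart only

    @dataclass
    class MarksTransferMapping:
        module_delivery_mapped: bool  # Portico enrolment block mapping exists
        sits_component_count: int     # SITS assessment components in the mapping
        moodle_activity_count: int    # Moodle assessment activities in the mapping
        activity_type: str            # e.g. "assign", "quiz", "turnitin"
        marks_numeric_0_100: bool     # marks are numerical on a 0-100 scale

    def can_transfer(m: MarksTransferMapping) -> bool:
        """One Moodle activity may map to one or more SITS assessment
        components (scenarios 2.1 and 2.2 above), never several activities."""
        return (
            m.module_delivery_mapped
            and m.sits_component_count >= 1
            and m.moodle_activity_count == 1
            and m.activity_type in SUPPORTED_ACTIVITY_TYPES
            and m.marks_numeric_0_100
        )

    # Example: one Moodle Assignment mapped to two SITS components is eligible.
    print(can_transfer(MarksTransferMapping(True, 2, 1, "assign", True)))  # True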

The Marks Transfer Wizard currently supports the transfer of marks from one of the following summative assessment activities in Moodle:

  • Moodle Assignment
  • Moodle Quiz
  • Turnitin Assignment (NOT multipart)

We intend to collect feedback on the new Marks Transfer Wizard from pilot participants, both to improve the interface and workflow for a general UCL-wide release in late March 2024 and to prioritise improvements and further development after the launch.

So far, informal feedback has been very positive: users say the assessment wizard works well and will save them a lot of time. The pilot has also been useful for exploring where issues might arise with Portico records or Moodle course administration, and for gathering frequently asked questions and best-practice advice that will feed into our guidance for the wider rollout.

So what are the next steps? We will continue to support our pilot participants until the end of February. In mid-February, the Marks Transfer Wizard will be updated with some interface improvements, so participants will be able to give feedback on these too. Towards the end of February, participants will be asked to complete a survey, and some will take part in a focus group to help us evaluate the success of the MVP integration and prioritise our plans for future developments. In addition, our Change Manager is working with us on a communications plan for the wider release on UCL Moodle Production and is currently recruiting a network of champions to cascade guidance and best practice on Moodle-SITS Marks Transfer across UCL, as well as to help us continue to gather feedback on the user experience. More information about this exciting new development will be available in the coming months!

Updating Our Academic Integrity Resources

By Marieke Guy and i.bowditch, on 10 October 2023

In the ever-evolving landscape of higher education, maintaining academic integrity is paramount. Educational institutions are tasked not only with upholding these standards but also with fostering a culture of academic honesty. At UCL the commitment to academic integrity has led to a revamp of existing resources, driven by a desire to offer the most effective support possible.

We recognise that when it comes to guiding students on academic integrity, a punitive approach falls short. Instead, we want to start with positive framing that taps into the broader motivations of students and positions them as valued contributors to an academic community of practice. The institution does not assume that students inherently understand these practices or that violations should always result in punishment. Rather we view the key causes of plagiarism as opportunities for learning and growth. For instance, Turnitin, a well-known plagiarism detection service, is seen as a tool to assist students in learning rather than merely as a plagiarism detector.

Review and Refresh

At the end of last year, the Digital Assessment Team carried out an audit of academic integrity resources at UCL, which uncovered the need for a refresh. This need became even more pronounced with the advent of Generative Artificial Intelligence (AI). We have now completed the review and refresh of our academic integrity resources for the academic year.

Turnitin Similarity Checker

One of the longstanding resources, the “Plagiarism and Academic Writing for Students” course, has served UCL for over a decade. This course primarily allows students to check their assignments for plagiarism by generating a similarity report through Turnitin. The assignments are not added to the institutional repository, and the course is reset regularly.

The course has now been streamlined to focus solely on explaining Turnitin's purpose and guiding students on how to create and use the similarity report. An introduction from Ayanna Prevatt-Goldstein, Head of UCL Academic Communication Centre, has been added to give context on how use of Turnitin relates to good academic practice. To provide a comprehensive experience, an additional section now offers links to other UCL resources related to academic integrity. These are:

  • Academic integrity hub – A student-facing hub area for all guidance on academic integrity including links to information on academic misconduct, academic misconduct panels and Frequently Asked Questions.
  • UCL Academic Communication Centre – The UCL Academic Communication Centre (ACC) supports UCL students to develop their academic language and literacies. We assist students of all language backgrounds, across faculties, at all levels of study, to communicate more effectively in their discipline.

Understanding Academic Integrity Course for Students

UCL has also recently released an updated version of the Understanding Academic Integrity course for students, now hosted on the primary UCL Moodle site (the course previously sat on the UCL Extend platform). This course aims to educate students about all aspects of academic integrity and covers:

  1. How much do I know about academic integrity?
  2. What is academic integrity?
  3. Acknowledging the work of others
  4. Using collaboration positively
  5. Contract cheating
  6. Artificial Intelligence and Academic Integrity
  7. Check your understanding of academic integrity and academic good practice

The revised course content has been built collaboratively with staff and students and incorporates insights from academic integrity and academic writing experts at UCL. It addresses emerging concerns such as the use of Generative AI in academia, and features short videos, reflective activities, quizzes, and a final certification quiz.

Students can self-enrol on the course and, on completing all required activities and achieving the required score in the final quiz, will receive a certificate of completion, which can serve as evidence of their commitment to academic integrity and be shared with their tutors.

At the start of the course, students are asked to post their responses to a Mentimeter activity asking 'Why do you think students don't always act with academic integrity?'. These are the results so far (mid October 2023, 1,011 participants, 2,547 votes):

[Mentimeter results chart]
To ensure that academic integrity remains current, UCL has devised a plan for annual course refreshers. Annual refreshers are particularly important in the evolving context of Generative AI. Course content on GenAI and its relation to academic integrity will need to be revised in line with both technological and policy developments in this area.

Course video on Artificial Intelligence and Academic Integrity

Older versions of the course are archived to maintain access to logs if needed for academic misconduct panels. In cases where students may still access the previous Extend version, a notice redirects them to the new version on Moodle.

As UCL continues to evolve its approach to academic integrity, it exemplifies a commitment to not just maintaining standards but enhancing the support and resources available to students. This proactive approach ensures that UCL students are well-equipped to navigate the complexities of academic integrity while upholding the institution’s values of learning and growth.

Moodle STACK Quiz question type: deploying variants to avoid quiz crashing

By Aurelie, on 4 May 2022

Questions in STACK can contain randomly generated elements. A student will be given a random variant of a question generated by a pseudo-random seed.

Why deploy variants?

The tutor is strongly advised to pre-generate and "deploy" variants of a question. Not pre-generating question variants forces Moodle to generate them on the fly, which for quizzes with larger numbers of participants can cause the quiz to crash or freeze. When a student attempts the question, they will be given a random selection from the deployed variants.

Other reasons for deploying variants of a question:

  • STACK runs all the question tests on each deployed variant to establish that each variant of the question works correctly. This aids quality control: with question tests in place, it is unlikely a student will be given a random variant which does not work correctly (see the sketch after this list).
  • The tutor can decide whether each deployed variant appears to be of equal difficulty, and can easily delete variants they do not like.
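
To illustrate these mechanics, here is a small Python sketch of the idea: a seed deterministically fixes a variant, deploying pins down an explicit list of seeds, and question tests can then be run against every deployed variant up front. This is not STACK's actual implementation (STACK defines its random values in Maxima, in the question variables field); all names and values here are hypothetical.

    import random

    def variant(seed: int) -> dict:
        """A seed deterministically fixes the question's random values,
        loosely mirroring how STACK generates a question variant."""
        rng = random.Random(seed)  # same seed -> same variant, every time
        a, b = rng.randint(2, 9), rng.randint(2, 9)
        return {"a": a, "b": b, "answer": a * b}

    # "Deploying" pins down an explicit list of allowed seeds (values hypothetical).
    deployed_seeds = [17, 42, 101, 256]

    def question_test(v: dict) -> bool:
        """A question test checks that a model answer is assessed as expected."""
        return v["answer"] == v["a"] * v["b"]

    # Quality control: test every deployed variant up front, instead of
    # discovering a broken variant during a live quiz.
    assert all(question_test(variant(s)) for s in deployed_seeds)

    # On attempt, a student receives a random choice among the deployed
    # variants rather than an arbitrary freshly generated one.
    print(variant(random.choice(deployed_seeds)))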

Caution

  • If an author does not deploy any variants (not advised!) then the student gets any random variant.
  • Questions that don’t use randomisation cannot be deployed explicitly. STACK automatically detects randomisation.

How to deploy question variants

The deployment interface can be found by editing a question and clicking on 'Question tests & deployed variants'.

  1. The easiest way to do so is to preview the question.
  2. Then click the 'Question tests & deployed variants' link in the top right corner.
  3. Click 'Deploy' if the question has not already been deployed.
  4. Next to 'Attempt to automatically deploy the following number of variants', enter the number of variants you would like and click 'Go'. (Depending on the question and its question note, the number of variants you can deploy will vary; if possible, deploy more than 30.) You can then preview the results and either exclude variants, or return to the question settings to revise the randomisation used in the question.
  5. Check the variants as required. The page shows the list of currently deployed variants, with links to undeploy all of them or a specific variant.
  6. Optionally, click 'Run all tests on all deployed variants (slow)' and undeploy any variants you do not want to use.

Limitations

There is currently no way to loop systematically over all variants and deploy them all.

Find more details and advice on using STACK question types in 'M57 – STACK online assessment for mathematics and science'.

Video assignments in Moodle

By Janice Kiugu, on 4 May 2020

To support alternative assessments, and in particular the use of video assignments, a new Moodle plugin that allows the submission of video/media files is now available. The plugin is accessed within a Moodle assignment; the key additional step in ensuring students can upload video files is to select the 'online text' submission option when setting up the assignment.

In the short term (May until late summer) the Lecturecast (Echo360) video submission plugin will be installed. Following on from that, the aim is to deploy the Mediacentral video plugin before the start of the 20/21 academic year; it will replace the Lecturecast plugin and provide a fuller, richer integration with Mediacentral, the UCL media platform.

The reason the Lecturecast/Echo360 plugin is being installed first is that the Mediacentral plugin is more complex in its integration with Moodle. It requires significantly more testing than the Echo360 plugin and cannot be deployed in time to support the forthcoming assessment period.

A key feature of the Echo360 plugin is that it facilitates the use of the Echo360 mobile application which can be used to:

  • record and upload material from portable devices such as tablets and mobile phones.
  • view lecture materials, but only if a user has first accessed the course and recordings via their Moodle course page.

Note: The Echo360 mobile application can only be used with UCL-registered email addresses.

Support documentation and guidance are available for staff and students:

Video assignment guides

Echo 360 Mobile app guides

Case study and additional resources

New E-Book on Assessment, Feedback and Technology

By Tim Neumann, on 1 November 2017

UCL Digital Education Advisory members contributed to a new Open Access e-book that provides valuable insight into the way technology can enhance assessment and feedback. The book was launched formally on 26th October by Birkbeck College Secretary Keith Harrison, with talks from the editors Leo Havemann (Birkbeck, University of London) and Sarah Sherman (BLE Consortium), three case study authors, and event sponsor Panopto.

Havemann, L. and Sherman, S. (eds.) (2017) Assessment, Feedback and Technology: Contexts and Case Studies in Bloomsbury. London: Bloomsbury Learning Environment.
View and download from: https://doi.org/10.6084/m9.figshare.5315224.v1

 

The Book

E-Book Cover

The book is a result of a two-year project on e-assessment and feedback run by the Bloomsbury Learning Environment (BLE), a collaboration between five colleges, including the UCL Institute of Education, on issues around digital technology in Higher Education. It contains three research papers which capture snapshots of current practice, and 21 case studies from the BLE partner institutions and a little beyond, thus including practice from wider UCL.

The three papers focus on

  • the use of technology across the assessment lifecycle,
  • the roles played by administrative staff in assessment processes, and
  • technology-supported assessment in distance learning.

The case studies are categorised under the headings:

  • alternative [assessment] tasks and formats,
  • students feeding back,
  • assessing at scale,
  • multimedia approaches, and
  • technical developments.

Seven of the 21 case studies were provided by UCL Digital Education colleagues Jess Gramp, Jo Stroud, Mira Vogel (2), and Tim Neumann (3), reporting on examples of blogging, group assessment, peer feedback, assessment in MOOCs, student presentations at a distance, and the UCL-developed My Feedback Report plugin for Moodle.

 

Why you should read the e-book

BLE E-Book Launch Event

As one of the speakers at the entertaining launch event, I suggested three reasons why everybody involved in Higher Education should read this book, in particular the case studies:

  1. Processes in context:
    The case studies succinctly describe assessment and feedback processes in context, so you can quickly decide whether these processes are transferable to your own situation, and you will get a basic prompt on how to implement the assessment/feedback process.
  2. Problems are highlighted:
    Some case studies don’t shy away from raising issues and difficulties, so you can judge for yourself whether these difficulties represent risks in your context, and how these risks can be managed.
  3. Practical tips:
    All case studies follow the same structure. If you are in a hurry, make sure to read at least the Take Away sections of each case study, which are full of tips and tricks, many of which apply to situations beyond the case study.

Overall, this collection of papers and case studies on assessment and feedback is easily digestible and contributes to an exchange of good practice.

 

View and Download the Book

The e-book is an Open Access publication freely available below.

For further information, see ble.ac.uk/ebook.html, and view author profiles at ble.ac.uk/ebook_contributors.html

 

About the BLE:
The Bloomsbury Learning Environment is a collaboration between Birkbeck, London School of Hygiene and Tropical Medicine (LSHTM), Royal Veterinary College (RVC), School of Oriental and African Studies (SOAS), UCL Institute of Education (IOE), and the University of London, with a focus on technologies for teaching and learning, including libraries and administration.
See www.ble.ac.uk for more information.

HeLF – Electronic Management of Assessment (EMA) – 18th June 2013, Falmer

By Martin Burrow, on 21 June 2013

Some thoughts and notes on Tuesday's meeting.

 

The first presentation was an overview of the HeLF EMA survey. This was a poll of HeLF members about where they thought their institutions are or will be.

(Available at http://www.slideshare.net/barbaranewland/ and the quantitative data is at http://w01.helfcms.wf.ulcc.ac.uk/projects.html)

It was noted that this type of survey only captures respondents' 'best guesses' about what is going on, so it is more a confirmation of expectations than hard data. The main point to note was that very few institutions had an institution-wide policy on e-assessment. The survey split e-assessment into its component parts (e-submission, e-marking, e-feedback, and e-return), and it was generally agreed that this was a good thing because each part has its own requirements and challenges.

There was not much discussion of the drivers for moving to EMA, but the predominant factor was student expectations (National Student Survey results were mentioned). There was no great clamour from the staff side, and I did get the feeling this was one of those things being pushed by the techies.

Those who were working on implementing EMA were doing some process mapping, to allow them to benchmark what was going on and to inform any policies that were written. The four areas mentioned above were split into constituent steps, and these were mapped to a range of ways and technologies that could be used to complete them, both for 'as it stands now' and for 'where we would like to move to'. This process mapping was generally done on a school-by-school basis. The resulting data looked pretty useful and would definitely be a starting point for anyone wanting to pilot or encourage EMA.

Discussion about institutional policy revolved around the level at which it is appropriate to set policy (institution, department, school, etc.), where a policy should sit on the restrictive/encouraging balance, how IT systems integrate with manual, paper-based systems, and, probably easiest of all, how a policy should deal with IT system failures (fall-back processes, extensions, etc.).

There was lots of talk about the difficulties in encouraging e-marking, with plenty of evidence of markers preferring paper-based marking. My personal take is that if you enforce e-submission, e-feedback, and e-return, you can leave the marking (notice I didn't say e-marking) as a 'black box' component, up to the personal preference of individual markers, with the caveat that however they choose to mark, their output (grades, feedback, etc.) has to be entered back into the system in electronic format. Ways mentioned to encourage e-marking were the allocation of hardware (iPads, large or second PC monitors) and extended time periods for marking. There was no evidence that any of these had a large or widespread effect on the uptake of e-marking.

Other points to note were that students were very keen on marking and feedback within a published rubric/schema system, and that using such a system also eased the burden on the markers' side. Some institutions (such as the University of the Arts) were introducing cross-department, generic marking criteria that could apply to different subjects.

Also, on the wish-list side, there was demand from staff and students for a tool that would show all of a student's feedback for their whole time at the institution, across all courses and submission points.

All in all, it was a nicely informative little session, well worth attending.
