Digital Education team blog

Ideas and reflections from UCL's Digital Education team

Archive for the 'e-Assessment' Category

Initial release of Marks Transfer available on UCL Moodle!

By Kerry, on 18 March 2024

What is it?

A new UCL Moodle integration is now available to transfer marks from Moodle to Portico, with the aim of improving the overall marks administration process. The integration has been tested across two pilot phases and has received very positive feedback.

You can complete marks transfer for the following assessment scenarios:

  • One Moodle assessment activity is linked to one assessment component in Portico
  • One Moodle assessment activity is linked to multiple assessment components in Portico

Using the following Moodle assessment activity types:

  • Moodle Assignment
  • Moodle Quiz
  • Turnitin Assignment (single submission)

In addition, the following conditions must be met:

  • The Portico enrolment block must be used to map a Module Delivery to your Moodle course.
  • An assessment component (or components) exists in Portico to map against.
  • Assessment marks are numerical and between 0 and 100 (a quick way to check this is sketched after this list).
  • The assessment component(s) in Portico are compatible with SITS Marking Schemes and SITS Assessment Types.
  • For exam assessments, the Portico assessment component has the exam room code EXAMMDLE.
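
If it helps, the "numerical and between 0 and 100" requirement can be sanity-checked against a Moodle gradebook export before you attempt a transfer. The sketch below is a minimal, hypothetical illustration (it is not part of the Marks Transfer tool, and the 'Grade' column name is an assumption you may need to adjust for your export):

```python
import csv

def find_invalid_marks(gradebook_csv, mark_column="Grade"):
    """Return rows whose mark is missing, non-numeric or outside 0-100.

    Assumes a Moodle gradebook export saved as CSV with a column named
    'Grade'; adjust mark_column to match your own export.
    """
    invalid = []
    with open(gradebook_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            raw = (row.get(mark_column) or "").strip()
            try:
                mark = float(raw)
            except ValueError:
                invalid.append((row, raw))  # missing or non-numeric mark
                continue
            if not 0 <= mark <= 100:
                invalid.append((row, raw))  # numeric but out of range
    return invalid
```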

The flowchart below provides a visual overview of when you can use this initial release of Marks Transfer.

Flowchart indicating when you can use the initial release of Marks Transfer.

How do I use it?

For guidance on how to use marks transfer, you can view our general overview, how-to guide and FAQs.

There will also be demonstration and drop-in support sessions, open to all, where you can find out how to use the wizard and ask questions (note – you are welcome to “drop in” with a question and do not need to stay for the whole session). Please use the links below on the specified date and time to join a session. Alternatively, if you would like to receive a calendar invitation to one of these sessions, please email currentstudent@ucl.ac.uk specifying which session.

What should I do if I require support?

Please review our detailed FAQs. If you cannot find your answer there, please use one of the following contact points:

  • For any issues with using the marks transfer tool, please contact digi-ed@ucl.ac.uk
  • For any issues with Portico data, e.g. incorrect assessment or student information, contact lifecycle@ucl.ac.uk
  • Please provide any feedback about the Marks Transfer Wizard to your Faculty Champion.

What next?

Following this initial release, we will continue to develop the integration by adding further marking scenarios and functionality.

Current development priorities are:

  • Gradebook items and categories, including external LTI resources
  • Handling of SoRAs, ECs and LSAs
  • Turnitin Multipart Assignments

This is a very exciting development for assessment administration at UCL. We hope you find the new Moodle Marks Transfer integration beneficial!

Many thanks,

Digital Learning Environments and the Student Records Team

Moodle-SITS Marks Transfer Pilot Update

By Kerry, on 9 February 2024

As some of you may be aware, a new Moodle integration is due to be released in the spring, which has been designed and developed by the DLE Team to improve the process for transferring marks from Moodle to Portico. It is called the Moodle-SITS Marks Transfer Integration and we are currently trialling it with around 40 course administrators across the institution.

The pilot kicked off on 8 January and will run until 29 February 2024. Its purpose is to test the Moodle-SITS Marks Transfer Integration using the newly designed Marks Transfer Wizard and the marks transfer functionality developed following the Phase 1 Pilot, which took place with a very small group of course administrators at the end of last year. The wizard provides a more streamlined experience for end users by putting the core assessment component information at the centre of the tool, from where it can be mapped to a selection of Moodle assessments.
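
Conceptually, each mapping links one Moodle assessment activity to one or more SITS assessment components. The sketch below is an illustrative data structure only, not the wizard's actual data model; the field names and example identifiers are assumptions made for the example.

```python
from dataclasses import dataclass, field

@dataclass
class MarksTransferMapping:
    """Illustrative only: one Moodle activity mapped to one or more
    SITS assessment components (all names here are assumptions)."""
    moodle_activity_id: int                                # e.g. a Moodle Assignment or Quiz
    sits_assessment_components: list[str] = field(default_factory=list)

# Scenario 1: one Moodle activity linked to one assessment component
coursework = MarksTransferMapping(1234, ["EXAMPLE-COMPONENT-001"])

# Scenario 2: one Moodle activity linked to multiple assessment components
exam = MarksTransferMapping(5678, ["EXAMPLE-COMPONENT-002", "EXAMPLE-COMPONENT-003"])
```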

Pilot Phase 2 is the last pilot phase before an initial MVP (Minimum Viable Product) release into UCL Moodle Production in late March 2024. Currently, users can take advantage of the integration if the following criteria are met:

  1. They have used the Portico enrolment block to create a mapping with a Module Delivery on their Moodle course.
  2. Either of the following assessment scenarios is true:
    1. Only one Moodle assessment activity is being linked to one assessment component in SITS.
    2. Only one Moodle assessment activity is being linked to multiple assessment components in SITS.
  3. An assessment component exists in SITS to map against.
  4. The Moodle assessment marks are numerical and between 0 and 100.
  5. The assessment component in SITS is compatible with SITS Marking Schemes and SITS Assessment Types.
  6. For exam assessments, the SITS assessment component has the exam room code EXAMMDLE.

The Marks Transfer Wizard currently supports the transfer of marks from one of the following summative assessment activities in Moodle:

  • Moodle Assignment
  • Moodle Quiz
  • Turnitin Assignment (NOT multipart)

We intend to collect feedback on the new Marks Transfer Wizard from pilot participants to improve the interface and workflow for a general UCL-wide release in late March 2024, and to prioritise improvements and further development following the launch.

So far informal feedback has been very positive: users say the assessment wizard works well and will save them a lot of time. The pilot has also been useful for exploring where issues might arise with Portico records or Moodle course administration, as well as for gathering frequently asked questions and advice on best practice, which will feed into our guidance for the wider rollout.

So what are the next steps? Well, we will continue to support our pilot participants until the end of February. In mid-February, the Marks Transfer Wizard will be updated with some interface improvements, so participants will be able to give feedback on these too. Towards the end of February, participants will be asked to complete a survey, and some will take part in a focus group to help us evaluate the success of the MVP integration and to prioritise our plans for future developments. In addition, our Change Manager is working with us on a communications plan for the wider release on UCL Moodle Production and is currently recruiting a network of champions to cascade guidance and best practice on Moodle-SITS Marks Transfer across UCL, as well as to help us continue to gather feedback on the user experience. More information about this exciting new development will be available in the coming months!

The Assessment Matrix Resurrected

By Claudia Cox, on 31 October 2023

Credit: Tobias_ET, 2017.

The Digital Assessment Team are pleased to announce a new version of the Assessment Matrix tool, which replaces and expands on a comparison table of the main technologies used for assignments at UCL.

Overview of the Matrix

The Assessment Matrix tool is designed to help users who are designing assessments decide which platform is best suited for delivery. In addition to a quick, visual online guide to what different platforms can offer, users can download an offline, interactive version of the resource and filter assessment options based on their submission, marking and feedback, and administrative needs.

View the Assessment Matrix.

The tool is expected to be relevant to both academics and administrative staff. It will serve as a valuable starting point for discussions that occur after assessments have been aligned with learning outcomes but before they have been fully developed and designed, and before the platform to be used has been finalised. The decision on which platform should be used will also need to consider faculty approach, tools used for formative assessment and other factors.

The new version contains information on the following platforms:

  • Moodle assignment
  • Turnitin Assignment (in Moodle)
  • Wiseflow – the AssessmentUCL digital assessment platform [NEW]
  • Reflect – the UCL version of the WordPress blogging service [NEW]
  • MyPortfolio – the UCL version of the Mahara eportfolio platform
  • Crowdmark – an assessment tool currently used by the maths department [NEW]

The matrix covers assessment submission options (such as the file formats that can be submitted, whether group work is possible, and text editor options), marking and feedback (such as whether double or blind marking is available, audio feedback options and inline annotation), and administrative settings (such as reporting, export of content and whether the platform is integrated with SITS).
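
To give a rough sense of what the offline filtering does, the matrix can be thought of as a table of platform capabilities that is narrowed down to the platforms meeting your requirements. The sketch below is a hypothetical, heavily simplified illustration; the feature flags are invented placeholders rather than the matrix's actual content.

```python
# Hypothetical, simplified stand-in for the kind of information the matrix holds.
platforms = {
    "Moodle Assignment":   {"group_work": True,  "inline_annotation": True,  "sits_integration": True},
    "Turnitin Assignment": {"group_work": False, "inline_annotation": True,  "sits_integration": True},
    "MyPortfolio":         {"group_work": True,  "inline_annotation": False, "sits_integration": False},
}

def matching_platforms(required_features, data=platforms):
    """Return the platforms that offer every feature in required_features."""
    return [name for name, features in data.items()
            if all(features.get(feature) for feature in required_features)]

print(matching_platforms(["group_work", "sits_integration"]))  # ['Moodle Assignment']
```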

A screenshot of the updated assessment matrix

Accessibility

Initial feedback on the Assessment Matrix highlighted the importance of testing resources to ensure they meet accessibility requirements. Originally the matrix was intended to have a ‘traffic light’ design to indicate whether tools met users’ needs for assessment; however, due to the limited cell colour range available on the Confluence wiki, this would not have met WCAG 2.2.

Using staff-recommended tools such as COBLIS and TPGi’s Colour Contrast Analyser is a great way to help ensure that resources and materials are WCAG compliant, and doing so allowed us to find a colour scheme that works for a broader audience of users.
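
For anyone curious about what these checkers measure: WCAG contrast is calculated from the relative luminance of the two colours, with a minimum ratio of 4.5:1 required for normal-size text at level AA. A minimal sketch of the WCAG 2.x calculation is below.

```python
def relative_luminance(hex_colour):
    """Relative luminance of an sRGB colour given as '#RRGGBB' (WCAG 2.x definition)."""
    r, g, b = (int(hex_colour.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4))

    def linearise(c):
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

    return 0.2126 * linearise(r) + 0.7152 * linearise(g) + 0.0722 * linearise(b)

def contrast_ratio(foreground, background):
    """Contrast ratio between two colours; ranges from 1:1 to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(foreground), relative_luminance(background)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

print(contrast_ratio("#000000", "#FFFFFF"))         # 21.0 (black on white)
print(contrast_ratio("#777777", "#FFFFFF") >= 4.5)  # False: mid-grey on white fails AA
```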

Feedback

The Digital Assessment Team has been cautious about overloading the matrix, to avoid overwhelming its users. However, the team is keen to ensure that the matrix remains as user-friendly as possible. If you spot anything that needs updating or editing, please contact the Digital Assessment Team (assessment-advisory@ucl.ac.uk).

If you wish to discuss your assessment approach in more detail, please refer to the education support contacts in your faculty and department.

Important Update: STACK upgrade and its impact on existing questions

By Aurelie, on 19 July 2023

The STACK Moodle plugin is getting an upgrade!

What is STACK?
STACK is a powerful system for creating and managing online assessments in mathematics, science, and related disciplines. It’s integrated into our Moodle platform, providing a seamless experience for creating complex, auto-graded questions.
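
STACK questions themselves are authored using Maxima syntax, but the core idea behind the auto-grading, checking a student's answer for mathematical equivalence rather than comparing strings, can be illustrated with a short SymPy sketch. This is an analogy only, not STACK's own code:

```python
from sympy import simplify
from sympy.parsing.sympy_parser import parse_expr

def algebraically_equivalent(student_answer, teacher_answer):
    """True if the two expressions are mathematically equal, e.g.
    '2*x + x' matches '3*x' even though the strings differ."""
    difference = simplify(parse_expr(student_answer) - parse_expr(teacher_answer))
    return difference == 0

print(algebraically_equivalent("2*x + x", "3*x"))            # True
print(algebraically_equivalent("x**2 - 1", "(x-1)*(x+1)"))   # True
print(algebraically_equivalent("x**2", "2*x"))               # False
```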

The new version, STACK 4.4, was initially released in July 2022 and is now being implemented in our Moodle platform on Tuesday 25th July. This major update focuses on enhancing the performance and overcoming limitations of the previous systems, particularly in the Potential Response Tree (PRT) and Computer Algebra System Text (CASText) systems.

However, it’s important to note that the new version of STACK has become more stringent in terms of logic and syntax. As a result, certain questions may not function as they did before. This is due to the enhanced logic/syntax checks incorporated in the new version, aimed at improving the accuracy and consistency of STACK assessments.

We have conducted tests and identified the questions that may be affected by this change. These are listed in a report on our wiki (you need to be logged in with your UCL account to view the report). We understand that this may require adjustments to your current questions and apologise for any inconvenience this may cause.
We recommend reviewing and updating your questions as necessary to ensure they function correctly after the upgrade.

To help you navigate these changes, you can review the list of key issues that might arise and their solutions in the Release Notes.

As always, we’re here to support you during this transition. If you have any questions or need assistance, please don’t hesitate to reach out to the Digital Education team.

Thank you for your understanding and cooperation as we work to improve our Moodle platform and assessment tools.

Managing mark release during the marking and assessment boycott

By Marieke Guy and Zaman Wong, on 13 June 2023

This post gives useful information for admin staff on how to manage mark release and mark upload during the Marking and Assessment boycott.

Using AssessmentUCL/Wiseflow

Step 1: Identifying students who have and have not been marked

1.1 Identify students who have been given a final grade:

Students that have been marked and given a final grade can be identified by Administrators (under the Manager role) by downloading the ‘grade export’ report (image below), which shows:

  • Student details (candidate number, student ID, names – these columns can be hidden/shown as desired)
  • Students that have submitted / not submitted
  • Students that have been given a final grade (if blank – no grade has been agreed, but marking may have taken place – please see section 1.2)

Guidance to download report.
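
If you prefer to check this in bulk, the downloaded report can also be processed with a short script. The sketch below is a minimal, hypothetical example rather than a Wiseflow feature: it assumes the grade export has been saved as a CSV with 'Candidate' and 'Grade' columns, and you may need to adjust those names to match your report.

```python
import csv

def split_by_final_grade(grade_export_csv, grade_column="Grade", id_column="Candidate"):
    """Split candidates into those with and without a final grade.

    Assumes the grade export has been saved as CSV with the column names
    above; change them to match the actual report.
    """
    graded, ungraded = [], []
    with open(grade_export_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            candidate = row.get(id_column, "")
            if (row.get(grade_column) or "").strip():
                graded.append(candidate)
            else:
                ungraded.append(candidate)  # blank grade: no final grade agreed yet
    return graded, ungraded
```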

1.2 Identify students that have been marked (by a first or second marker) but not given a final grade

Administrators should add themselves as Reviewers on their assessments, which will allow them to download a grade sheet displaying a list of candidates and any marks that have been submitted by individual markers (including the name of the marker). If you have issues with adding yourself as a Reviewer, please submit a staff query form to request access.

Once you have opened the assessment in the Reviewing tab, you should select the Offline marking option and follow the steps to export the grade sheet.

The downloaded grade sheet will show you a list of candidates and any marks that have been submitted by first or second markers (highlighted in red in image below):

Gradesheet

Please note that if the Grade column is empty, this means that no grades have been finalised and a Reviewer will need to submit a finalised grade for students that have been marked (this will allow administrators to complete the grade export to Portico in Step 3).

Guidance

Step 2: Allow students without grades/feedback to be marked after the original marking deadline has passed:

Student grades and feedback are released on the platform under two conditions: the marking end date has arrived, and the ‘Show final grades’ option has been enabled.

To allow remaining students to be marked, there are two methods (option b is preferable but may be time-consuming if dealing with a large number of students that have yet to be marked):

  a) An Administrator / Manager can extend the overall marking end date for all students (to allow further marking to take place). Caveat: this will mean that students who already have a final grade will not be able to view it on the platform until the extended marking end date has arrived.

Guidance to extend the overall marking end date.

  b) An Administrator / Manager can extend the individual marking end dates for only those students who have not yet been marked (this will mean students that have already been marked will be able to see their final grades on the platform, while allowing markers to continue marking those that have not been marked).

Guidance to extend individual marking end dates.

Step 3. Grade export to Portico

It is recommended to do this once, when there is a full set of grades; however, the grade export button can be pushed more than once. Caveat: if an administrator pushes the grade export more than once, you may encounter a ‘Fail’ message for students whose grades were previously exported – this error message can be ignored for those students.

Guidance to complete grade export to Portico.

Using Moodle

Student identities for Moodle assignments and Turnitin assignments cannot be revealed and then hidden again. Each activity type has a different process to achieve this, which is detailed in our Partial mark entry miniguide.

If you have any queries, please contact the Digital Education team with the assignment title and url at:
digi-ed@ucl.ac.uk

UCL and Jisc event: Reimagining Assessment and Feedback

By Marieke Guy, on 6 June 2023

Earlier this week UCL hosted a one-day event entitled Reimagining Assessment and Feedback, the second in a series of Jisc events on Demonstrating digital transformation. The event was held in Bentham House, the main building of the UCL Faculty of Laws. The purpose of these events is to share best practice from universities that have made significant advances in developing innovative approaches to taking forward their digital agenda. As with the Jisc framework for digital transformation, the events are designed to showcase and highlight the broad spectrum of activity needed across an institution to effectively support and implement a digital culture.

The event organising team: Simon Birkett (Jisc), Peter Phillips (UCL) and Sandra Lusk (UCL)

The event gave the 50+ delegates the opportunity to hear how UCL has evolved its assessment and feedback practices and processes, and the role technology plays. Here at UCL we have been at the forefront of the shift to digital assessment and have successfully implemented a digital assessment platform for all centrally managed assessments taken remotely. To achieve this we have needed to address other challenges, including assessment design, consistency across programmes, regulations and policy, and enhanced support for professional development.

Opening plenary

The event was opened by Pro-Vice-Provost Education (Student Academic Engagement) Kathryn Woods, who talked a little about our wider institutional change programme, including the UCL strategic plan 2022-27 consultation and the Education framework.

Simon Walker presents on assessment at UCL

Professor Simon Walker (previously Director of Programme Development, UCL, now an educational consultant) and I provided an overview of the UCL assessment journey. We discussed the implementation of Wiseflow/AssessmentUCL and the subsequent challenges we have faced regarding AI and academic integrity. Although we haven’t resolved all the problems, we have encountered numerous challenges and have a valuable story to share. You can see our slides below.

Breakout groups

There were two sets of breakout group sessions on core themes with lunch slotted in between. Each session featured a UCL facilitator to give an opening introduction, a Jisc scribe to lead the related Padlet board and a UCL student to give the student perspective.

Demonstrating digital transformation – assessment futures

This session considered the potential future of assessment in higher education. Participants looked at areas including AI, assessment technology, and new ideas and ways of working in assessment. The group discussed a whole range of challenges, from limited understanding of AI technology and capacity constraints to AI false alerts, digital inequality and ethical considerations. The solutions discussed included student co-design, an emphasis on assessment design, oral evaluations and better use of AI for formative assessments.

Demonstrating digital transformation – Academic integrity

Discussions in the academic integrity session

This session considered ways to ensure academic integrity is maintained across the institution through design, education and detection. It considered how policy and regulations need to change in the light of new challenges that technology brings. Much of the discussion covered current practices such as the use of AI proctoring for remote assessments, efforts to establish clear assessment requirements, and conducting fair assessment misconduct panels. The challenges include terminology clarification, legal concerns surrounding AI usage and the implications of more diverse assessment formats for identifying misconduct. Some of the effective strategies identified were additional training for students during the transition into higher education, varied assessment formats, technical approaches such as random question allocation and limited time allocation, and less punitive approaches to academic integrity such as hand-holding through academic practice and referencing requirements.

Demonstrating digital transformation – Institutional change

Kathryn Woods facilitates the breakout session on managing institutional change

This session considered how institutions manage change and encourage new academic practices. The group looked at areas including the framing of change and the balance between cultural and technological change. Some of the main challenges explored were around large cohorts, a diverse student body, the digital skills of academic staff and general change fatigue. Some successful practices highlighted were feedback from externals and industry experts, personalised feedback at scale, external engagement for formative feedback and audio feedback. The support needed to enable this includes surfacing assessment and feedback technologies, integrating professional services into curriculum development teams, and providing timely technical and pedagogic support for staff.

Demonstrating digital transformation – Pedagogy and assessment design

This session considered the full assessment design process and the focal points and drivers for the different staff involved in making changes. The group looked at areas including what contemporary assessment design looks like: authentic, for social justice, reusable etc. Interesting practice includes assessment processes that focus on the production and process of assessment, the use of student portfolios for employability, co-designing assessments with employers and utilising creative and authentic assessments with tools like Adobe Creative Suite. The main challenges might be the impact of high-stakes assessment and grades on students, clarifying what is actually being assessed, aligning institutional priorities with assessment innovation and supporting group work assessments. Future support needed could involve utilising the Postgraduate Certificate (PG Cert) programme to address assessment and curriculum design with technology and digital skills.

Demonstrating digital transformation – Larger cohorts and workloads

This session considered how you assess large cohorts in highly modularised programmes with students from different disciplines. It looked at areas including workload models, interdisciplinary assessment and integrated assessment. There are already examples of interdisciplinary group work, using contribution marks to evaluate individual efforts in group work, implementing peer assessment, utilising multiple-choice questions (MCQs), and employing marking teams for large cohorts. However, these face challenges including PSRB accreditation processes, modularisation of assessments, over-assessment and duplication, and scaling alternative assessment practices. Among the best approaches identified were programme-level assessment strategies and embracing the principle of “less is more” by focusing on quality rather than quantity.

Demonstrating digital transformation – Strategic direction

This session considered how you respond to the drivers of change and go about a co-design process across an institution, looking at environmental scanning, involving stakeholders and styles of leadership. The challenges identified involve ensuring continuity while managing future aspirations, considering student demographics and adopting an agile approach to strategy development. Clear communication about assessment and being agile in strategic thinking were identified as practices that work well. The support needed includes access to assessment platforms and curriculum mapping software, partnership support from industry organisations, and collaboration with Jisc and UCISA to advocate for change across the sector.

Panel session

The afternoon panel session on assessment, chaired by Sarah Knight, Head of learning and teaching transformation at Jisc, featured a diverse group of experts representing academia and student engagement, who all provided valuable insights.

Panel session: Mary McHarg, Dr Irene Ctori, Professor Sam Smidt, Dr Ailsa Crum, Marieke Guy, and Sarah Knight

  • Professor Sam Smidt, the Academic Director of King’s Academy, KCL;
  • Dr Irene Ctori, Associate Dean of Education Quality and Student Experience at City, University of London;
  • Mary McHarg, SU Activities and Engagement Sabbatical Officer at UCL;
  • Dr Ailsa Crum, Director of Membership, Quality Enhancement and Standards at QAA;
  • Marieke Guy, Head of Digital Assessment at UCL

Each panellist introduced themselves, explained their roles and organisations, and outlined their current work on assessment. They then shared their key takeaways from the discussions and presentations of the day. These included the need to work together collaboratively as a sector and to look at more fundamental areas, such as curriculum design, as places where change could originate. Some also noted the absence of discussion around feedback, which is interesting given that NSS scores are very dependent on successful approaches here. The panel addressed important questions, including how higher education providers can better support students’ assessment literacy, ways universities can enable staff to effectively use technology for assessment and feedback, methods to engage in dialogue with PSRBs regarding technology in assessments, and predictions for the future of assessment methods in five years’ time. One of the most interesting questions thrown at the panel was what they would do to assessment if they had a magic wand; much of the focus was on the current grading model, along with other areas of potential such as improving assessment for students with adjustments by adding in optionality and better support.

The day concluded with an overview of the support available from Jisc, provided by Simon Birkett, Senior Consultant (see slides). Tweets from the day can be accessed using the #HEdigitaltx tag. Many of the attendees were then treated to a bespoke UCL tour led by Steve Rowett, Head of the Digital Education Futures team. The highlight for many was Jeremy Bentham’s auto-icon.

It was great to bring together those working in strategic change across UK higher education in the area of assessment and feedback. Clearly there is much work to be done, but a sector-wide understanding and appreciation of the difficulties faced, and a unified approach to ensuring quality of student experience and learning benefit, can only be a good thing.

This article is reposted from the Digital Assessment Team blog.

Panel members