Digital Education team blog
  • We support staff and students using technology to enhance education at UCL.

    Here you'll find updates on institutional developments, projects we're involved in, updates on educational technology, events, case studies and personal experiences (or views!).


    Archive for the 'e-Assessment' Category

    Bug in duplicated Moodle assignments

    By Rod Digges, on 8 December 2016

We’ve recently come across a bug in Moodle (not Turnitin) assignments. The bug shows up when a blind marking/anonymous Moodle assignment that has been used, and whose student identities have been revealed, is then copied for re-use. The copy will look, from its settings, like a blind marking/anonymous assignment, but it will behave as if the ‘Reveal student identities’ link had been clicked: student names will be visible in both the grading interface and the course gradebook. The quickest way to check whether a blind marking/anonymous assignment is truly in an anonymous state is to click on its link and look for the ‘Reveal student identities’ link in the assignment’s settings block; if the link is present, the assignment is still anonymous.

    For the moment we advise that Moodle assignments are not created by duplication of old assignments but are created as completely new assignments.
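To make the bug’s logic concrete, here is a minimal illustrative sketch in Python (not actual Moodle code). The field names `blindmarking` and `revealidentities` mirror the settings Moodle stores for an assignment, but the records and the helper function are invented for illustration:

```python
# Illustrative sketch only -- the field names mirror Moodle's assignment
# settings ('blindmarking', 'revealidentities'); the records and the
# helper below are invented, not Moodle code.

def is_truly_anonymous(assign: dict) -> bool:
    """Names are only hidden while blind marking is on AND
    identities have not yet been revealed."""
    return bool(assign["blindmarking"]) and not bool(assign["revealidentities"])

# Original assignment, after 'Reveal student identities' was clicked:
original = {"name": "Essay 1", "blindmarking": 1, "revealidentities": 1}

# A duplicate made from it: its settings page still shows blind marking on,
# but (per the bug) it behaves as if identities were already revealed.
duplicate = {"name": "Essay 1 (copy)", "blindmarking": 1, "revealidentities": 1}

print(is_truly_anonymous(original))   # False - identities already revealed
print(is_truly_anonymous(duplicate))  # False - the copy inherits the revealed state
```

A freshly created assignment with blind marking enabled would not yet have identities revealed, so the check would return True — which is why creating assignments from scratch avoids the problem.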

    screenshot - assignment settings block

    Comparing Moodle Assignment and Turnitin for assessment criteria and feedback

    By Mira Vogel, on 8 November 2016

Elodie Douarin (Lecturer in Economics, UCL School of Slavonic and Eastern European Studies) and I have been comparing how assessment criteria can be presented to engage a large cohort of students with feedback in Moodle Assignment and Turnitin Assignment (report now available). We took a mixed-methods approach using a questionnaire, a focus group, and student screencasts recorded as they accessed their feedback and responded to our question prompts. Here are some of our key findings.

    Spoiler – we didn’t get a clear steer over which technology is (currently) better – they have different advantages. Students said Moodle seemed “better-made” (which I take to relate to theming issues rather than software architecture ones) while the tutor appreciated the expanded range of feedback available in Moodle 3.1.

    Assessment criteria

    • Students need an opportunity to discuss, and ideally practise with, the criteria in advance, so that they and the assessors can reach a shared view of the standards by which their work will be assessed.
    • Students need to know that criteria exist and be supported to use them. Moodle Assignment is good for making rubrics salient, whereas Turnitin requires students to know to click an icon.
    • Students need support to benchmark their own work to the criteria. Moodle or Turnitin rubrics allow assessors to indicate which levels students have achieved. Moreover, Moodle allows a summary comment for each criterion.
    • Since students doubt that assessors refer to the criteria during marking, it is important to make the educational case for criteria (i.e. beyond grading) as a way of reaching a shared understanding about standards, for giving and receiving feedback, and for self/peer assessment.

    Feedback

    • The feedback comments most valued by students explain the issue, make links with the assessment criteria, and include advice about what students should do next.
    • Digital feedback is legible and easily accessible from any web-connected device.
    • Every mode of feedback should be conspicuously communicated to students, along with suggestions on how to cross-reference the different modes. Some thought should be given to ways of facilitating access to, and interpretation of, all the elements of feedback provided.
    • Students need to know that digital feedback exists and how to access it. A slideshow of screenshots would allow tutors to hide and unhide slides depending on which feedback aspects they are using.

    Effort

    • The more feedback is dispersed between different modes, the more effortful it is for students to relate it to their own work and thinking. Where more than one mode is used, there is a need to distinguish between the purpose and content of each kind of feedback, signpost their relationships, and communicate this to students. Turnitin offers some support for cross referencing between bubble comments and criteria.
    • It would be possible to ask students to indicate on their work which mode (out of a choice of possibilities) they would like assessors to use.
    • The submission of formative assessment produced with minimal effort may impose a disproportionate burden on markers, who are likely to be commenting on mistakes that students could easily have corrected themselves. Shorter formative assessments, group work, and clearer statements of the benefits of submitting formative work may all help to limit the incidence of low-effort submissions.
    • If individual summary comments have a lot in common, consider releasing them as general feedback for the cohort, spending the saved time on more student-specific comments instead. However, this needs to be signposted clearly to help students cross-reference with their individual feedback.
    • As a group, teaching teams can organise a hands-on session with Digital Education to explore Moodle Assignment and Turnitin from the perspectives of students, markers and administrators. This exposure will help immeasurably with designing efficient, considerate processes and workflows.
    • The kind of ‘community work’ referred to by Bloxham and colleagues (2015) would be an opportunity to reach shared understandings of the roles of students and markers with respect to criteria and feedback, which would in turn help to build confidence in the assessment process.

     

    Bloxham, S., den-Outer, B., Hudson, J., Price, M., 2015. Let’s stop the pretence of consistent marking: exploring the multiple limitations of assessment criteria. Assessment & Evaluation in Higher Education 1–16. doi:10.1080/02602938.2015.1024607

     

    MyFeedback is now available to all UCL staff and students

    By Jessica Gramp, on 17 October 2016

    The MyFeedback dashboard is now available to all UCL students and staff.

    MyFeedback is a new tool in UCL Moodle allowing students to view grades and feedback for any assessed work across all their Moodle courses, in one place. Personal Tutors can view the dashboard for each student to allow them to track progress and to help to inform discussions in personal tutorials.

    Watch the video on how students can use the MyFeedback report:

The report helps students (supported by their personal tutors) to better understand the variety of feedback they receive, draw connections between different assessments and modules, and reflect on their feedback to see how they can improve in future assessments. It also allows module tutors, assessors and departmental administrators to see how their students are progressing within the modules they teach and support.

    MyFeedback Feedback Comments tab


    MyFeedback is available to students, personal tutors, course tutors and departmental administrators.

    • Students can view feedback and grades from their assessments across all their UCL Moodle courses. They can also add self-reflective notes and copy and paste feedback from Turnitin into their report.
    • Personal tutors can see their tutees’ full MyFeedback reports across all the modules their students are studying. Note: personal tutors will not be able to link through to assessments on courses they do not have tutor access to.
    • Module tutors can see MyFeedback reports for their students containing assessment information for any modules they teach. They will not see any assessments for modules they do not teach (unless they have been granted tutor access to those Moodle courses).
    • Departmental administrators can see MyFeedback reports for all the Moodle courses within the categories where they have been assigned departmental administrator access in Moodle. Categories in Moodle may cover an entire department, or may be broken down further into undergraduate and postgraduate modules. Staff requiring this access will need to ask their department’s current category-level course administrator to assign them this role.

    Sign up to the Arena Exchange MyFeedback workshop on 28th November 2016 to learn how to use this tool with your students.

    You can navigate to your own MyFeedback reports via the MyFeedback block on the UCL Moodle home page.

    Other institutions can download the plugin from Moodle.org.

    Find out more about MyFeedback…

     

    Understanding the essence(s) of portfolio-based learning

    By Domi C Sinclair, on 15 June 2016

Last week saw the first ever joint AAEEBL and CRA conference, hosted in Edinburgh between 6th–8th June 2016 and titled ‘Understanding the essence(s) of portfolio-based learning’. For those who don’t know, AAEEBL is a US-based global portfolio organisation; its name stands for the Association for Authentic, Experiential and Evidence Based Learning. CRA is a very similar UK-based organisation, its name standing for the Centre for Recording Achievement. So, as you can imagine, this was a portfolio conference.

There were three key themes that emerged from the conference. These themes kept popping up in presentations and discussions:

    • Scaffolding
    • Process not product
    • Cultural shift/change

Let’s look briefly at these themes below, but if you would like a more detailed look at them, please see the AAEEBL/CRA Conference 2016 on my personal blog.

The first theme, scaffolding, refers to the importance of having structure around portfolio activities. This predominantly broke down into conversations about templates and frameworks for guiding staff and students without restricting them. Templates can be useful for giving students a little direction without restricting their creative freedom (depending on the content and detail of the template). They are also useful because, anecdotally, students can find it overwhelming simply to be given a blank space to do with as they please; a template gives students a starting place. The discussion of frameworks was mostly about their usefulness for staff, to give them some scaffolding from which to build a portfolio activity into their module or course, either as a single assessment or as an ongoing activity to support learning via reflective practice. It was thought that such a framework should be fairly high level, meaning not too prescriptive and not software-dependent.

This actually leads quite nicely into the next theme: process not product. There was a strong emphasis on focusing on the process and pedagogy of portfolios, not the product (meaning either the final output or the technological product used to facilitate them). It is easy to become distracted by debating whether you are using the best online portfolio system. At the moment UCL uses MyPortfolio, which is based on the Mahara platform. As good practice, we will be reviewing the use of this platform in the near future; however, whether we use Mahara, WordPress or Office 365, the process of running a successful portfolio is the same, and the buttons are not as important as strong pedagogy.

The final theme is perhaps the one with the biggest impact for portfolio adoption at institutions, especially online portfolios, and that is the need for a cultural shift/change. This is perhaps best summarised by an analogy used by Trent Batson (President/CEO of AAEEBL) at the conference. He was talking about the American automobile and how it took 35 years to become fully part of US culture. First the automobile was invented, and it opened up a lot of possibilities, such as people being able to commute more easily for work. But even after this it still took time to build all the roads, parking spaces and petrol stations needed. The idea was proven, but it took a lot longer for the infrastructure to become part of daily culture. It is fairly easy to see how this relates to portfolios. There are a number of case studies out there that prove their potential; however, the infrastructure to support them is not yet fully part of the culture of universities. Portfolios tend to expose the learning process, which can be an intimidating prospect for students and staff alike. However, portfolios can offer a very useful reflective space, where you can use journals for written reflections and also reflect while curating the examples of your work that you are going to include in your portfolio. Reflection gives us the ability to stop and think about our thinking, and to understand how we can do better moving forward.

    Introducing the new E-Learning Baseline

    By Jessica Gramp, on 7 June 2016

The UCL E-Learning Baseline is now available as a printable colour booklet. This can be downloaded from the UCL E-Learning Baseline wiki page: http://bit.ly/UCLELearningBaseline

    The 2016 version is a product of merging the UCL Moodle Baseline with the Student Minimum Entitlement to On-Line Support from the Institute of Education.

    The Digital Education Advisory team will be distributing printed copies to E-Learning Champions and Teaching Administrators for use in departments.

    Please could you also distribute this to your own networks to help us communicate the new guidelines to all staff.

    Support is available to help staff apply this to their Moodle course templates via digi-ed@ucl.ac.uk.

We are also working on a number of ideas to help people understand the Baseline (via a myth-busting quiz) and a way for people to show their courses are Baseline (or Baseline+) compliant by way of a colleague-endorsed badge.

See ‘What’s new?’ for a quick overview of what has changed since the 2013 Baseline.

     

    An even better peer feedback experience with the Moodle Workshop activity

    By Mira Vogel, on 21 December 2015

    This is the third and final post in a series about using the Moodle Workshop activity for peer feedback, in which I’ll briefly summarise how we acted on recommendations from the second iteration which in turn built on feedback from the first go. The purpose is to interpret pedagogical considerations as Moodle activity settings.
    To refresh your memories, the setting is the UCL Arena Teaching Association Programme in which postgraduate students, divided into three cognate cohorts, give and receive peer feedback on case studies they are preparing for their Higher Education Academy Associate Fellowship application. Since the activity was peer feedback only, we weren’t exploiting the numeric grades, tutor grades, or grade weighting capabilities of Moodle Workshop on this occasion.
    At the point we last reported on Moodle Workshop there were a number of recommendations. Below I revisit those and summarise the actions we took and their consequences.

    Improve signposting from the Moodle course area front page, and maybe the title of the Workshop itself, so students know what to do and when.

    We changed the title to a friendly imperative: “Write a mini case study, give peer feedback”. That is how the link to it now appears on the Moodle page.

    Instructions: let students know how many reviews they are expected to do; let them know if they should expect variety in how the submissions display.

    Noting that participants may need to click or scroll for important information, we used the instructions fields for submissions and for assessment to set out what they should expect to see and do, and how. In instructions for Submission this included word count, how to submit, and that their names would appear with their submission. Then the instructions for Assessment included how to find the allocation, a rough word count for feedback, and that peer markers’ names would appear with their feedback (see below for more on anonymity). The Conclusion included how to find both the original submission and the feedback on it.
In the second iteration some submissions had been attachments while others had been typed directly into Moodle. This time we set attachments to zero, instead requiring all participants to paste their case studies directly into Moodle. We hoped that displaying the submission and its assessment on the same page would help with finding the submission and with cross-referencing. Later it emerged that there were mixed feelings about this: one participant reported difficulties with footnotes, and another said he would have preferred a separate document so he could arrange the windows in relation to each other rather than scrolling. In future we may allow attachments, and include a line in the instructions prompting participants to look for an attachment if they can’t see the submission directly in Moodle.
Since the participants were entirely new to the activity, we knew we would need to give more frequent prompts and guidance than if they were familiar with it. Over the two weeks we sent out four News Forum posts in total, at fixed times in relation to the two deadlines. The first launched the activity, let participants know where to find it, and reminded them of the submission deadline; the second, a couple of days before the submission deadline, explained that the deadline was hard and let them know how and when to find the work they had been allocated to give feedback on; the third reminded them of the assessment deadline; the fourth let them know where and when to find the feedback they had been given. When asked whether these emails had been helpful or a nuisance, the resounding response was that they had been useful. Again, if students had been familiar with the process, we would have expected to take a much lighter touch with the encouragement and reminders, but first times usually take more effort.

    Consider including an example case study & feedback for reference.

    We linked to one rather than including it within the activity (which is possible) but some participants missed the link. There is a good case for including it within the activity (with or without the feedback). Since this is a low-stakes, voluntary activity, we would not oblige participants to carry out a practice assessment.

    Address the issue that, due to some non-participation during the Assessment phase, some students gave more feedback than they received.

    In our reminder News Forum emails we explicitly reminded students of their role in making sure every participant received feedback. In one cohort this had a very positive effect with participants who didn’t make the deadline (which is hard for reasons mentioned elsewhere) using email to give feedback on their allocated work. We know that, especially with non-compulsory activities and especially if there is a long time between submitting, giving feedback and receiving feedback, students will need email prompts to remind them what to do and when.

    We originally had a single comments field but will now structure the peer review with some questions aligned to the relevant parts of the criteria.

    Feedback givers had three question prompts to which they responded in free text fields.

    Decide about anonymity – should both submissions and reviews be anonymous, or one or the other, or neither? Also to consider – we could also change Permissions after it’s complete (or even while it’s running) to allow students to access the dashboard and see all the case studies and all the feedback.

    We decided to even things out by making both the submissions and reviews attributable, achieving this by changing the permissions for that Moodle Workshop activity before it ran. We used the instructions for submissions and assessment to flag this to participants.
A lead tutor for one of the cohorts had been avoiding Moodle Workshop because she felt it was too private, shared only between a participant and their few reviewers. We addressed this after the closure of the activity by proposing to participants that we release all case studies and their feedback to all participants in the cohort (again by changing the permissions for that Moodle Workshop activity). We gave them a chance to raise objections in private, and after receiving none we went ahead with the release. We have not yet checked the logs to see whether this access has been taken up.

    Other considerations.

    Previously we evaluated the peer feedback activity with a questionnaire, but this time we didn’t have the opportunity for that. We did however have the opportunity to discuss the experience with one of the groups. This dialogue affirmed the decisions we’d taken. Participants were positive about repeating the activity, so we duly ran it again after the next session. They also said that they preferred to receive feedback from peers in their cognate cohort, so we maintained the existing Moodle Groupings (Moodle Groups would also work if the cohorts had the same deadline date, but ours didn’t, which is why we had three separate Moodle Workshop instances with Groupings applied).
    The staff valued the activity but felt that without support from ELE they would have struggled to make it work. ELE is responding by writing some contextual guidance for that particular activity, including a reassuring checklist.