Digital Education team blog
  • We support Staff and Students using technology to enhance education at UCL.

    Here you'll find updates on institutional developments, projects we're involved in, updates on educational technology, events, case studies and personal experiences (or views!).


    Archive for the 'Peer Feedback' Category

    MyFeedback is now available to all UCL staff and students

    By Jessica Gramp, on 17 October 2016

    The MyFeedback dashboard is now available to all UCL students and staff.

MyFeedback is a new tool in UCL Moodle that allows students to view grades and feedback for any assessed work across all their Moodle courses, in one place. Personal tutors can view the dashboard for each of their students, allowing them to track progress and helping to inform discussions in personal tutorials.

    Watch the video on how students can use the MyFeedback report:

The report helps students (supported by their personal tutors) to better understand the variety of feedback they receive, draw connections between different assessments and modules, and reflect on their feedback to see how they can improve in future assessments. It also allows module tutors, assessors and departmental administrators to see how their students are progressing within the modules they teach and support.

    MyFeedback Feedback Comments tab


    MyFeedback is available to students, personal tutors, course tutors and departmental administrators.

• Students can view feedback and grades from their assessments across all their UCL Moodle courses. They can also add self-reflective notes and copy and paste feedback from Turnitin into their report.
    • Personal tutors can see their tutees’ full MyFeedback reports across all the modules their students are studying. Note: personal tutors will not be able to link through to assessments on courses they do not have tutor access to.
    • Module tutors can see MyFeedback reports for their students containing assessment information for any modules they teach. They will not see any assessments for modules they do not teach (unless they have been granted tutor access to those Moodle courses).
• Departmental administrators can see MyFeedback reports for all the Moodle courses within categories where they have been assigned departmental administrator access in Moodle. Categories in Moodle will either be for the entire department, or might be broken down further into undergraduate and postgraduate modules. Staff requiring this access will need to ask their department’s current category-level course administrator to assign them this role.

    Sign up to the Arena Exchange MyFeedback workshop on 28th November 2016 to learn how to use this tool with your students.

    You can navigate to your own MyFeedback reports via the MyFeedback block on the UCL Moodle home page.

    Other institutions can download the plugin from Moodle.org.

    Find out more about MyFeedback…

     

    A good peer review experience with Moodle Workshop

    By Mira Vogel, on 18 March 2015

    Update Dec 2015: there are now three posts on our refinements to this peer feedback activity: one, two, and three.

    Readers have been begging for news of how it went with the Moodle Workshop activity from this post.

    Workshop is an activity in Moodle which allows staff to set up a peer assessment or (in our case) peer review. Workshop collects student work, automatically allocates reviewers, allows the review to be scaffolded with questions, imposes deadlines on the submission and assessment phase, provides a dashboard so staff can follow progress, and allows staff to assess the reviews/assessments as well as the submissions.
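To make the process easier to picture, the sequence of phases a Workshop moves through can be sketched as a tiny state machine. This is an illustration only, not Moodle code; the phase names mirror those shown on the Workshop dashboard:

```python
# Illustrative sketch only -- not Moodle source code.
# Phase names follow the Moodle Workshop dashboard.
PHASES = ["setup", "submission", "assessment", "grading evaluation", "closed"]


class WorkshopActivity:
    def __init__(self):
        self.phase = PHASES[0]

    def advance(self):
        """Switch to the next phase.

        In Moodle, the switch from submission to assessment can be
        scheduled, but the final switch to 'closed' (which releases the
        feedback to participants) must be made by a member of staff.
        """
        i = PHASES.index(self.phase)
        if i < len(PHASES) - 1:
            self.phase = PHASES[i + 1]
        return self.phase

    def feedback_released(self):
        return self.phase == "closed"
```

The point of the sketch is simply that a Workshop is a one-way sequence of phases, and feedback only becomes visible in the final one; that is why staff supervision of the phase switches matters so much.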

    However, except for some intrepid pioneers, it is almost never seen in the wild.

The reason for that is partly the daunting number and nature of the settings – there are several pitfalls to avoid which aren’t obvious on first pass – but also that, because it is a process, you can’t easily see a demo, and running a test instance is pretty time-consuming. If people try once and it doesn’t work well, they rarely try again.

    Well look no further – CALT and ELE have it working well now and can support you with your own peer review.

    What happened?

Students on the UCL Arena Teaching Associate Programme reviewed each other’s case studies. 22 then completed a short evaluation questionnaire in which they rated their experience of giving and receiving feedback on a five-point scale and commented on their responses. The students were from two groups with different tutors running the peer review activity. A third group leader chose to run the peer review on Moodle Forum, since it would allow students to easily see each other’s case studies and feedback.

    The students reported that giving feedback went well (21 respondents):

Pie chart – satisfaction with giving feedback

    This indicates that the measures we took – see previous post – to address disorientation and participation were successful. In particular we were better aware of where the description, instructions for submission, instructions for assessment, and concluding comments would display, and put the relevant information into each.

    Receiving feedback also went well (22 respondents) though with a slightly bigger spread in both directions:

Pie chart – satisfaction with receiving feedback

     

    Students appreciated:

    • Feedback on their work.
    • Insights about their own work from considering others’ work.
    • Being able to edit their submission in advance of the deadline.
    • The improved instructions letting them know what to do, when and where.

    Staff appreciated:

This hasn’t been formally evaluated, but from informal conversations I know that the two group leaders appreciated Moodle taking on the grunt work of allocation. However, this depends on setting a hard deadline with no late submissions (otherwise staff have to keep checking for late submissions and allocating those manually), and one of the leaders was less comfortable with this than the other. Neither found it too onerous to write diary notes to send reminders and alerts to students to move the activity along – in any case, this manual messaging will hopefully become unnecessary with the arrival of Moodle Events in the coming upgrade.

    For next time:

    • Improve signposting from the Moodle course area front page, and maybe the title of the Workshop itself, so students know what to do and when.
    • Instructions: let students know how many reviews they are expected to do; let them know if they should expect variety in how the submissions display – in our case some were attachments while others were typed directly into Moodle (we may want to set attachments to zero); include word count guidance in the instructions for submission and assessment.
    • Consider including an example case study & review for reference (Workshop allows this).
    • Address the issue that, due to some non-participation during the Assessment phase, some students gave more feedback than they received.
    • We originally had a single comments field but will now structure the peer review with some questions aligned to the relevant parts of the criteria.
    • Decide about anonymity – should both submissions and reviews be anonymous, or one or the other, or neither? These can be configured via the Workshop’s Permissions. Let students know who can see what.
• Also to consider: we could change Permissions after the activity is complete (or even while it’s running) to allow students to access the dashboard and see all the case studies and all the feedback.

    Have you had a good experience with Moodle Workshop? What made it work for you?

    Peer review with the Moodle Workshop activity – a close look

    By Mira Vogel, on 13 February 2015

Example of a Moodle Workshop activity dashboard

    Update Dec 2015: there are now three posts on our refinements to this peer feedback activity: one, two, and three.

    Working with E-Learning Environments, UCL Arena Teaching Associate Programme leaders in CALT have been trialling Workshop (Moodle’s peer assessment tool) to run a peer review activity with participants. We’re now on the second iteration – here are some opportunities and lessons learned. One group is using Moodle Forum, so in a month or so we’ll be able to compare the two platforms – but for now I’ll focus on the Workshop.

    The scenario

Participants write a 500-word case study about an aspect of learning, teaching and assessment, mapped to aspects of the UK Professional Standards Framework, and review three others. The review takes the form of summary comments (i.e. no numeric marks, no rubric, no structured questions to answer). They have roughly a week to prepare the submission and a week to carry out the assessments. Participation is strongly encouraged but not compulsory.

    From the evaluation of the first iteration

36 participants gave feedback at the end of 2014. 29 found the experience of giving assessment positive (fine, good or excellent: 12, 14 and 3 respectively), while 7 found it unsatisfactory or poor (5 and 2 respectively). Receiving assessment was rated less positively (fine, good or excellent: 6, 3 and 0 respectively), while 4 found it unsatisfactory and 3 found it poor.

The gist was that the concept was good and the software worked fine, but the management needed some attention. The first problem was one of disorientation – “finding my feedback was not straightforward”. We addressed this in the next iteration by using the instructions (in the Workshop settings) and announcements in person and via the News Forum. The second and related problem was a lack of notification – “it wasn’t very clear how to use the system and no emails were received”; “working fine but it needs to be improved – notification; instructions”; “I did not receive any alert or instructions on how to check if the feedback from my colleague was in”. We addressed this by putting diary entries in place for each group leader to notify, remind and instruct participants about what to do at each stage. The third problem was that several participants didn’t receive any reviews – this was because the activity was grouped, with a consequently smaller pool of reviewers for each submission, coupled with the fact that it wasn’t a compulsory activity, and exacerbated by the fact that Moodle doesn’t send out alerts when the phases switch, e.g. from submission to assessment. We straightforwardly addressed this by removing the groups setting and undertaking to notify students about what to expect and when.

    Decisions, decisions – settings and reasons

    Below are some of the less obvious settings, our rationale, and implications.

    • Grading strategy: Comments – this gives a single field into which participants type or paste summary comments.
    • Grades: none; neither for the submission nor the peer assessment.
    • Instructions for submission: as briefly as possible what participants need to do to make a successful submission.
    • Submissions after the deadline: we left this set to No (unchecked) because rather than manually allocating submissions to reviewers we wanted Moodle to handle this with a scheduled allocation at the submission deadline. Workshop (unlike Turnitin Peermark) does this once only, which means that unless somebody was prepared to go into the Workshop and manually make allocations for late submissions, those late submissions would go unreviewed. Disallowing late submissions would give a very hard cut-off for submissions but greatly reduce the admin burden. This is what we ultimately decided to do, hoping that we could increase participation through good instructions and some scheduled reminders.
• Instructions for assessment: since the activity required reviewers to leave just a single summary comment, all we did here was direct attention to the guidance on relating the case study to the UK Professional Standards Framework, and remind reviewers about the lack of autosave in Moodle form fields.
• Students may assess their own work: we left this set to No (unchecked), since one aim of the activity was to share practice.
    • Overall feedback mode: this is the setting that gives a text field for the summary comments; we set it to Enabled And Required.
    • Maximum number of feedback files: set to zero, since we wanted the experience of reading the feedback to be as seamless as possible.
    • Use examples: for this low stakes peer review we didn’t require participants to assess examples.
• Open for assessment / open for submission: we set the assessment phase to begin directly as the submission phase closed; this meant that we’d also need to set up Scheduled Allocation to run at that time.
    • Switch to the next phase after the submissions deadline: we set this to Yes (checked); in combination with Scheduled Allocation this would reduce the amount of active supervision required on the part of staff.
• Group mode: we left this set to No Groups. Groups of four (learning sets which we call Quartets) had been set up on Moodle, but the previous iteration had shown that, when applied to a Workshop set not to allow self-assessment, they would diminish the pool of possible submissions and reviewers and were vulnerable to non-participation.
• Grouping: contrasting with Groups, this allows a given activity or resource to be hidden from everyone except the chosen grouping. We’d set up Groupings in the Moodle area corresponding to UCL schools, because the sessions (and therefore the deadlines) for them happen at different times. So we set up Moodle Workshops which were duplicates in every respect except the dates.
    • Scheduled allocations: these can be set up via a link from the dashboard.
    • Enable scheduled allocations: Yes (checked) for the reasons above. This would happen once at the end of the Submission Phase.
    • Number of reviews: we set three per submission but if (rather than focusing on ensuring that each submission got three reviews) we wanted to shift the emphasis onto the reviewing process we could have set three per reviewer.
    • Participants can assess without having submitted anything: we left this set to No (unchecked) reasoning that participants were more likely to receive reviews if we kept the pool of reviewers limited to those who were actively participating. (That said, we could do with finding out more about how Workshop allocates reviews if they are set to allocate to reviewers rather than to submissions.)
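As a purely illustrative sketch of the trade-off in the last two settings (Moodle’s real allocator is PHP and more sophisticated; the function and names below are hypothetical), the difference between aiming for a fixed number of reviews per submission and a fixed number per reviewer can be expressed as:

```python
import random


def allocate_reviews(submissions, reviewers, n, per_submission=True):
    """Randomly allocate peer reviews, excluding self-review.

    Hypothetical sketch, not Moodle's algorithm.
    per_submission=True  -> aim for n reviews of each submission
    per_submission=False -> aim for n reviews by each reviewer
    Returns a list of (reviewer, submission_author) pairs.
    """
    allocations = []
    if per_submission:
        # Each submission is reviewed n times (if enough reviewers exist).
        for author in submissions:
            pool = [r for r in reviewers if r != author]
            for reviewer in random.sample(pool, min(n, len(pool))):
                allocations.append((reviewer, author))
    else:
        # Each reviewer performs n reviews; some submissions may get
        # more or fewer reviews than others.
        for reviewer in reviewers:
            pool = [a for a in submissions if a != reviewer]
            for author in random.sample(pool, min(n, len(pool))):
                allocations.append((reviewer, author))
    return allocations
```

The sketch also shows why non-participation hurts: if the pool of active reviewers shrinks (for example through grouping, as in our first iteration), `min(n, len(pool))` falls and some submissions simply receive fewer reviews.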

    Dates for diaries (it doesn’t run itself…)

Where participants are unfamiliar with the process, any peer review activity needs quite active supervision. For this reason, CALT staff (who have many other commitments) put dates in their diaries to monitor participation and send reminders, as well as to maintain awareness of which phase the activity was in. Of particular note: to release the feedback to participants, a staff member needs to actively close the activity by clicking something in the Workshop dashboard.

    What happened next?

    It went well – see this March 2015 follow-up post including evaluation.

    UCL Engineering’s learning technologist initiative – one year on

    By Jessica Gramp, on 9 October 2014

    UCL Engineering’s Learning Technologists have been supporting rapid changes within the faculty. Changes include the development of several new programmes and helping the uptake of technology to improve the turnaround of feedback.

    In late 2013, the UCL Engineering faculty invested in a Learning Technologist post in order to support the Integrated Engineering Programme (IEP), as well as the other programmes within Engineering departments. Since then two Engineering departments, Science, Technology, Engineering and Public Policy (STEaPP) and Management Science and Innovation (MS&I) have both employed Learning Technologists to help develop their e-learning provision. These posts have had a significant impact on the e-learning activities. To evaluate impact on the student learning experience we are collecting information and feedback from students throughout the academic year.

These three roles complement the UCL-wide support provided by the E-Learning Environments (ELE) team, and the Learning Technologists work closely with the central ELE team. This relationship is facilitated by Jess Gramp, the E-Learning Facilitator for BEAMS (Built Environment, Engineering, Maths and Physical Sciences), who co-manages these roles with a manager from each faculty/department. This arrangement enables both formal input from ELE into departmental activities and plans, and central mentoring and assistance for the learning technologists. Without this structure in place it would be difficult to keep these roles aligned with the many central e-learning initiatives and for the learning technologists to liaise with the technical teams within ISD.

    The initiatives developed by these staff include: designing and implementing Moodle course templates; ensuring adherence to the UCL Moodle Baseline; running training needs analysis and developing staff training plans; delivering e-learning workshops; working with staff to redesign courses, as well as developing them from the ground up, to incorporate blended learning principles; delivering one-to-one support; and working with academics on e-learning projects.

    Moodle Templates
    Engineering now have a Moodle template that provides a consistent experience for students using UCL Moodle to support their learning. This template is now being used on all IEP, MS&I and STEaPP courses and all new Engineering Moodle courses from 2014/15 onwards will also use this template. In some cases the template has been modified to meet departmental requirements.

Engineering Faculty Moodle template

    See how MS&I have modified this template and described each feature in their MS&I Moodle Annotated Template document.

    Moodle Baseline course audit
In MS&I all Moodle courses have been audited against the UCL Moodle Baseline. This has enabled the department’s Learning Technologist to modify courses to ensure every course in the department now meets the Baseline. The template document that was used to audit the courses has been shared on the UCL E-Learning Wiki, so other departments may use it if they wish to do something similar. You can also download it here: Baseline Matrix MSI-template.

    Training Needs Analysis
    In STEaPP a Training Needs Analysis was conducted using both a survey and interviews with academics to develop individual training plans for academics and run training workshops specific to the department’s needs. The survey used for this has been shared with colleagues on the UCL E-Learning Wiki.

    Staff e-learning training and support
    In STEaPP a Moodle Staff Hub course has been developed to support staff in their development of courses, including links to e-learning support materials; curriculum development advice; and links to professional development resources. This course has now been duplicated and modified to assist staff across Engineering and within MS&I. If any other UCL faculties or departments would like a similar course set up they can request this be duplicated for them, so they may tailor it to their own requirements. This and other courses are being used to induct new staff to departments and are supported by face to face and online training programmes. The training is delivered using a combination of central ELE training courses and bespoke workshops delivered by Engineering Learning Technologists.

    E-assessment tools to improve the speed of feedback to students
In MS&I the average turnaround for feedback to students is now just one week, significantly shorter than the four-week target set by UCL. In order to support this initiative, the department has adopted a fully online assessment approach. This has been achieved predominantly using Turnitin, a plagiarism prevention tool that also provides the ability to re-use comments; use weighted grading criteria to provide consistent feedback to students (in the form of rubrics and grading forms); and mark offline using the iPad app. The use of this tool has helped staff to reach the one-week feedback target and to streamline the administrative processes that were slowing the feedback process. The Learning Technologist in MS&I has recently arranged workshops with the majority of MS&I staff (including those who are new to UCL) to demonstrate how Turnitin can be used to deliver feedback quickly to students. Several modules within the IEP are also using Moodle’s Workshop tool to enable peer assessment to be managed automatically online. The use of this and other e-assessment tools is saving academics and support staff significant time that used to be spent manually managing the submission, allocation and marking of assessments.

    Technical e-learning support
While the ELE Services team continues to be the main point of contact for technical e-learning support within Engineering, the local Learning Technologists are able to provide just-in-time support for staff working on local projects. They can also provide assistance beyond what is supported by the central E-Learning team, including development work such as setting up specific tools within Moodle courses (like the Workshop tool for peer assessment) and setting up groups in MyPortfolio; work like this falls outside the remit of the central E-Learning Environments team. Also, because the Engineering Learning Technologists are based within the faculty, they gain a better knowledge of local working practices, and are therefore better equipped to understand and support department-specific requirements than the central team can be.

    Project support and funding
    The local Learning Technologists have worked with academics within Engineering to develop bids for Engineering Summer Studentships and other projects, including the E-Learning Development Grants that are distributed yearly by ELE. The successful project proposals have been supported by the local Learning Technologists, which has meant a greater level of support has been provided to the grant winners than has been possible in previous years.

    Using technology to support scenario-based learning
    The Learning Technologist for STEaPP had a unique opportunity to work with staff during the development of their curriculum to ensure that technology was considered at the very outset of the programme’s development. In MS&I the local Learning Technologist has helped to develop a scenario-based, blended-learning course that is now being used as an exemplar of how other academics may redesign their own courses to empower students in their own learning (both electronically and face to face) and provide authentic learning experiences. Many Engineering programmes are already using project-based work to provide students with authentic learning experiences and assessments and this is something the Learning Technologists can work with academics to develop and enhance further.

Trialling new technologies
Several e-learning systems have been trialled within Engineering with significant input from the Engineering Learning Technologists, including the mobile e-voting system (TurningPoint ResponseWare) for up to 1000 students; and peer assessment of upwards of 700 student videos within the IEP. The successful implementation of such large-scale trials would have been difficult without the support of the Learning Technologists.

    E-Learning equipment loans
    One of the common problems with technology uptake is ensuring staff have access to it. Engineering have invested in a number of devices to enable (amongst other things) offline marking; video capture and editing; and presentation of hand drawn figures during lectures. Equipment is available for loan across Engineering and also within STEaPP and MS&I. These include laptops, video recording and editing kit (such as cameras, tripods, microphones and editing software) and iPads. The maintenance and loaning of these are managed by the local Learning Technologists. They are also able to provide advice and assistance with the use of these devices, especially in terms of multimedia creation, including sound recording and filming, and editing of videos to enhance learning resources.

    Working closely with E-Learning Environments and each other
    One important aspect of these roles is that they have close ties to the ELE team, allowing for important two way communication to occur. The Engineering Learning Technologists are able to keep abreast of changes to centrally supported processes and systems and can obtain support from the central E-Learning Environments Services team when required, including receiving train-the-trainer support in order to run workshops locally within Engineering departments. Similarly, ELE benefit by an improved understanding of the activities occurring within faculties and departments, and accessing the materials that are developed and shared by the Learning Technologists.

    Each week the Engineering Learning Technologists share any developments, issues, and updates with each other and the E-Learning Facilitator for BEAMS. The result is a strong network of support for helping to problem solve and resolve issues. It also enables resources, such as the staff hub Moodle course and Moodle auditing matrix, to be shared across the Faculty and more widely across UCL, enabling the re-use of materials and avoiding duplication of effort. The importance of the strong working relationship between the Engineering Learning Technologists became apparent during UCL Engineering’s How to change the world series. During an important final-day session all three Learning Technologists were involved in resolving technical issues to ensure the voting system operated correctly in a venue with incompatible wireless provision.

    Conclusion
UCL staff and students today operate within a rapidly changing educational environment. Both staff and students are expected to understand how to use technology in order to operate within an increasingly digital society. There is a huge number of self-directed online learning resources available (such as MOOCs and YouTube videos), and increasingly flexible work and study arrangements are being supported by enhanced technology use. As more staff see the benefits that technology can bring to the classroom, and true blended learning becomes the norm in many areas, it is going to be more important to implement appropriate support structures so staff have the resources to understand and work with these emerging technologies. It is equally important that students are supported in their use of these tools.

    The Learning Technologists within Engineering are in a unique position to understand the opportunities and issues arising in the classroom, and react to these quickly and effectively. We have already seen numerous outputs from these roles. These include a video editing guide to help academics produce professional looking videos for their students; the use of tools within Moodle and MyPortfolio on a scale not seen before with large cohorts of over 700 IEP students; and an exemplar of how scenario-based learning can be supported by technology in MS&I. While these outputs have been developed in reaction to local needs, they have been shared back for others to use and reference, and therefore they benefit the wider UCL community.

    As we see more of these roles implemented across UCL, we will begin to see more dramatic change than has been achievable in the past. One of the plans for the future involves running student focus groups and interviews to better understand how Moodle and other e-learning systems are helping students with their studies and how provision can be improved. The Engineering Learning Technologists will continue their work with local staff to help their departments to use technology more effectively and improve the student experience.