Digital Education team blog
  • We support staff and students using technology to enhance education at UCL.

    Here you'll find updates on institutional developments, projects we're involved in, updates on educational technology, events, case studies and personal experiences (or views!).


    Archive for the 'Evaluation' Category

    Jisc student digital tracker 2017 and BLE consortium

    By Moira Wright, on 10 August 2017

    UCL participated in the 2017 Jisc Digital Student Tracker Survey as part of a consortium with the Bloomsbury Learning Environment (BLE), made up of SOAS, Birkbeck, LSHTM and RVC. Altogether, 74 UK institutions ran the tracker, collecting 22,593 student responses, while 10 international universities collected an additional 5,000 responses.

    We were the only consortium to participate in the survey. We came together because competing institutional surveys, such as the National Student Survey, left only a short window (a month) in which to run the tracker, and we felt that our individual sample sizes would be too small. We treated the survey as a pilot, advertising a link to it on each College’s Moodle landing page alongside some promotion via social media and the Students’ Unions. The survey generated 330 responses which, given our constraints, was many more than we expected.

    The survey covers broad areas such as digital access, digital support and digital learning. Most questions were quantitative, but four open questions produced qualitative data. We were also able to add two questions of our own: one on e-assessment, since that was a previous shared enhancement project (see www.bloomsbury.ac.uk/assessment), and one on Moodle, since all members of the consortium use the platform for their Virtual Learning Environment (VLE).

    Once the survey closed and we had access to the benchmarking report, we ran a workshop in July 2017 for representatives from each of the Colleges, at which the responses to the survey’s open questions were analysed in institutional groups, prompting interesting discussions about commonalities and potential implications.

    Sarah Sherman, the BLE Manager, and I have been working to produce a report that examines our collective responses to the survey in comparison with the national survey population, with a recommendation that individual Colleges independently analyse their own results in more detail. For confidentiality, each College will be presented with a version of the document containing the relevant data for their institution only, not the complete BLE data set. A disadvantage of the consortium approach was that we could not benchmark individual Colleges against the survey population, as resources did not allow for this. In future, the participating Colleges may wish to run the survey individually rather than as part of a collective, since deep analysis was not possible with this data set.


    Although the sample collected by the Bloomsbury Colleges was small and not statistically robust, there is much we can extract and learn from this exercise. For the most part, our collective responses fell within the margins set by the national survey population, which suggests we are all at a similar stage in our students’ digital capability and development.

    You will have to wait for the full report for more information on the UCL data collected, but to whet the appetite you can see the key findings from Jisc in this two-page report: Student digital experience tracker at a glance.

    Finally, you can see this collection of case studies, which features the Bloomsbury Colleges consortium, here.

    Please get in touch with me if you would like to get involved (moira.wright @ ucl.ac.uk)

    Sarah Sherman and Moira Wright

    Jisc/NUS student digital experience benchmarking tool

    Jisc guide to enhancing the digital student experience: a strategic approach

     

    Accessible Moodle wishlist

    By Jessica Gramp, on 20 June 2017

    The following outlines recommendations from the Accessible Moodle project to improve the accessibility of UCL Moodle for disabled students and staff, as well as improve usability for all users. These have been informed by focus groups with disabled students and staff; analysis of how UK websites adhere to accessibility guidelines; and research of relevant journal articles and accessibility guidelines.

    Our primary aim is to ensure Moodle is technically accessible via assistive technologies, including ZoomText, the JAWS screen-reader, Read & Write and Dragon NaturallySpeaking voice recognition software, as well as others commonly used at UCL. In addition, keyboard-only access should be fully supported. It is also important that UCL Moodle is usable for those with disabilities, as well as the wider student and staff community.

    In order to develop these recommendations, the project team ran focus groups with UCL students and staff with disabilities, to find out what they found difficult to use within Moodle and what suggestions they had for improvements. I have blogged previously about the background to the project and the outcomes of these focus groups.

    A number of sources were also referenced to see how Moodle could be made to better adhere to accessibility guidelines. The most important of these are the following three guidelines from the World Wide Web Consortium (W3C):

    • Web Content Accessibility Guidelines (WCAG) 2.0 Level AA for making Moodle and its content more accessible.
    • Web Accessibility Initiative – Accessible Rich Internet Applications Suite (WAI-ARIA) for designing Moodle so users of assistive technologies, like screen-readers, can navigate and read its pages.
    • Authoring Tool Accessibility Guidelines (ATAG) for making the Moodle rich text editors more accessible.

    A number of websites were also analysed to compare how each of them implemented W3C guidelines.

    The list that follows is a wish list; not all of it may be implemented, but it gives us a guide for how we might improve Moodle. Many other important elements are not mentioned below, but the following makes a start on improving the interface for disabled and non-disabled users alike.

    We are taking a multi-faceted approach to resolve the issues identified, and work is likely to be ongoing, but here’s a list of changes we’d like to see made to make Moodle more accessible.

    Assistive technology compatibility

    The following recommendations are likely to require implementation at multiple levels, so they don’t easily fit under any single development area below. The project aims to achieve the following:

    • Content and editing features are available to screen-readers, or suitable alternatives are available – e.g. offline marking in Word enables in-line marking for assessments.
    • Navigation is straightforward, with content appearing before menus, and appropriate headings, links and lists used to enable easy navigation via common screen-reader features. For example, the list of module tutor names under every Moodle course name in the search results means hundreds of links are read out to screen-reader users, while sighted users are overwhelmed by irrelevant information that must be scrolled past; this is particularly problematic for those with dyslexia.
    • All images have alt tags (even if these are empty), or in the case of icons that supplement text, they use ARIA tags to tell screen-readers to ignore them.
    • Moodle accepts user input via voice recognition software, like Dragon NaturallySpeaking.
    • Magnification is supported, with pages displaying well when the browser is zoomed in or when zooming software is used.
    • Keyboard focus is visible when using the tab, space, enter and arrow keys to navigate.
    • The OpenDyslexic font, available as a browser plugin to help those with dyslexia read text, is supported.
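    The alt-text and ARIA recommendations above follow standard HTML conventions, sketched below. The file names and class names are illustrative only, not Moodle's actual markup:

    ```html
    <!-- Informative image: alt text describes it to screen-reader users -->
    <img src="forum-icon.png" alt="Discussion forum">

    <!-- Purely decorative image: an empty alt lets screen-readers skip it -->
    <img src="divider.png" alt="">

    <!-- Icon that merely repeats the adjacent link text: aria-hidden stops
         assistive technology announcing it twice -->
    <a href="grades.php"><span class="icon-grades" aria-hidden="true"></span> Grades</a>
    ```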

    A multi-faceted approach

    The following five areas outline the different ways in which accessibility improvements can be made to UCL Moodle.

    1. A new, more accessible UCL Moodle theme for use on desktop and mobile devices.
      • Minimise clutter, by enabling blocks to be hidden and removing extraneous information.
      • Position elements for optimal access. E.g. ensure the login is prominent and important course features are easy to access.
      • Simplify the menus, by showing relevant links only to relevant users. E.g. staff see different links from students.
      • Improve the course icons by making them larger and clearer. E.g. the maximise block link is not intuitive.
      • Show alerts to users – e.g. explaining that editors can drag and drop files, warnings of Moodle outage periods.
      • Improve navigation, e.g. by enabling links to key areas that users expect.
      • Use high contrasting colours on a pale background that is easy to read for those with dyslexia (e.g. not white).
    2. Changes to Moodle configuration.
      • Configure text editors so they encourage accessible content design, e.g. offering heading styles 3–5 only, since heading levels 1 and 2 are already used at higher levels within Moodle pages.
      • Enable global search (assuming this does not negatively impact performance).
      • Allow students and staff to personalise the interface: move courses up and down on the My Home page, hide and show blocks, maximise the screen (or use a default width better suited to reading) and dock blocks.
    3. Enhanced Moodle features.
      A number of plugins to Moodle exist that make Moodle more usable and improve accessibility.

      • Implement and configure user tours to help users understand how to use Moodle and point to help with accessibility features.
      • Install the course checks plugin to help staff create an accessible Moodle course – e.g. checks for assignment due dates in past, courses not reset, broken links.
      • Implement a Moodle course site map so students can easily see what is available on a course on one page.
      • Enable importing content from Word, which some users find easier to edit in than Moodle’s own editor.
      • Pilot the Blackboard Ally plugin to help in the creation of more accessible learning resources and course structures.
      • Install the Office 365 plugin to make it easier to author, organise and link or embed content into Moodle (coming to Moodle core in v3.3).
      • Enable staff to add icons to help signpost particular areas of their course and help people who prefer these visual cues, as opposed to having to read excessive text.
    4. Improved training, staff development and support.
      • Develop a course for Moodle editors so they understand how to develop accessible Moodle resources and activities.
      • Develop an online course to explain how assistive technologies can be used with Moodle (e.g. regions for JAWS, browser plugins to show a reading ruler, change fonts to the OpenDyslexic font, improve colour contrast).
    5. Improved interfaces by proposing enhancements to Moodle HQ and iParadigms (who provide Turnitin).
      • Automatically signpost link behaviour (new window, document, external/internal, etc.).
      • Enable users to personalise their experience by allowing them to choose their own course format and set blocks to particular colours.
      • Improve assessment interfaces, such as the Moodle Assignment rubric functionality and display.
      • Flag new items on the Moodle course page (allow this to be enabled/disabled in user preferences).
      • Improve the Moodle calendar – e.g. size, reliance on colour, clicking month to access full screen.
      • Improve the discussion forums – e.g. showing the entire thread when replying, the accessibility of the email alerts it sends.
      • Fix Moodle heading tags.
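    The “high contrasting colours” recommendation in area 1 can be checked programmatically. The following is a minimal sketch of the WCAG 2.0 contrast-ratio calculation; the colour values used are illustrative, not UCL’s actual palette:

    ```python
    def relative_luminance(hex_colour: str) -> float:
        """Relative luminance of an sRGB colour, per WCAG 2.0."""
        r, g, b = (int(hex_colour.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4))

        def linearise(c: float) -> float:
            # Undo the sRGB gamma curve before weighting the channels.
            return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

        return 0.2126 * linearise(r) + 0.7152 * linearise(g) + 0.0722 * linearise(b)


    def contrast_ratio(fg: str, bg: str) -> float:
        """WCAG contrast ratio (L1 + 0.05) / (L2 + 0.05), lighter over darker."""
        l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
        return (l1 + 0.05) / (l2 + 0.05)


    # Black on white is the maximum possible ratio of 21:1; WCAG 2.0 Level AA
    # requires at least 4.5:1 for normal-sized text.
    print(round(contrast_ratio("#000000", "#FFFFFF"), 1))  # 21.0
    ```

    A pale (non-white) background, as suggested above, can still meet Level AA provided the text colour keeps the ratio at or above 4.5:1.
    
    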

    The UCL Digital Education team, staff in Disability Support teams and staff from IT for IoE are slowly working through each of these five strands to improve virtual learning experiences at UCL for those with disabilities. Many of these improvements will also benefit other Moodle users, since accessibility cannot be considered in isolation from usability, so this means an enhanced user experience for everyone!

    Comparing Moodle Assignment and Turnitin for assessment criteria and feedback

    By Mira Vogel, on 8 November 2016

    Elodie Douarin (Lecturer in Economics, UCL School of Slavonic and Eastern European Studies) and I have been comparing how assessment criteria can be presented to engage a large cohort of students with feedback in Moodle Assignment and Turnitin Assignment (report now available). We took a mixed-methods approach using a questionnaire, a focus group and student screencasts recorded as they accessed their feedback and responded to our question prompts. Here are some of our key findings.

    Spoiler – we didn’t get a clear steer over which technology is (currently) better – they have different advantages. Students said Moodle seemed “better-made” (which I take to relate to theming issues rather than software architecture ones) while the tutor appreciated the expanded range of feedback available in Moodle 3.1.

    Assessment criteria

    • Students need an opportunity to discuss, and ideally practise with, the criteria in advance, so that they and the assessors can reach a shared view of the standards by which their work will be assessed.
    • Students need to know that criteria exist and be supported to use them. Moodle Assignment is good for making rubrics salient, whereas Turnitin requires students to know to click an icon.
    • Students need support to benchmark their own work to the criteria. Moodle or Turnitin rubrics allow assessors to indicate which levels students have achieved. Moreover, Moodle allows a summary comment for each criterion.
    • Since students doubt that assessors refer to the criteria during marking, it is important to make the educational case for criteria (i.e. beyond grading) as a way of reaching a shared understanding about standards, for giving and receiving feedback, and for self/peer assessment.

    Feedback

    • The feedback comments most valued by students explain the issue, make links with the assessment criteria, and include advice about what students should do next.
    • Giving feedback digitally is legible and easily accessible from any web connected device.
    • Every mode of feedback should be conspicuously communicated to students, along with suggestions on how to cross-reference the different modes. Some thought should be given to ways of facilitating access to, and interpretation of, all the elements of feedback provided.
    • Students need to know that digital feedback exists and how to access it. A slideshow of screenshots would allow tutors to hide and unhide slides depending on which feedback aspects they are using.

    Effort

    • The more feedback is dispersed between different modes, the more effortful it is for students to relate it to their own work and thinking. Where more than one mode is used, there is a need to distinguish between the purpose and content of each kind of feedback, signpost their relationships, and communicate this to students. Turnitin offers some support for cross-referencing between bubble comments and criteria.
    • It would be possible to ask students to indicate on their work which mode (out of a choice of possibilities) they would like assessors to use.
    • The submission of formative assessment produced with minimal effort may impose a disproportionate burden on markers, who are likely to be commenting on mistakes that students could easily have corrected themselves. Shorter formative assessments, group work, and clearer statements of the benefits of submitting formative work may all help to limit the incidence of low-effort submissions.
    • If individual summary comments have a lot in common, consider releasing them as general feedback for the cohort, spending the saved time on more student-specific comments instead. However, this needs to be signposted clearly to help students cross-reference with their individual feedback.
    • As a group, teaching teams can organise a hands-on session with Digital Education to explore Moodle Assignment and Turnitin from the perspectives of students, markers and administrators. This exposure will help immeasurably with designing efficient, considerate processes and workflows.
    • The kind of ‘community work’ referred to by Bloxham and colleagues (2015) would be an opportunity to reach shared understandings of the roles of students and markers with respect to criteria and feedback, which would in turn help to build confidence in the assessment process.

     

    Bloxham, S., den-Outer, B., Hudson, J., Price, M., 2015. Let’s stop the pretence of consistent marking: exploring the multiple limitations of assessment criteria. Assessment & Evaluation in Higher Education 1–16. doi:10.1080/02602938.2015.1024607

     

    What IT Directors care about

    By Fiona Strawbridge, on 30 October 2016

    I heard about the Campus Computing survey for the first time at Educause 2016 – but this survey has been around since 1990, before, I suspect, the term e-learning had even been coined. It surveys CIOs’ (IT Directors’) perspectives on e-learning, amongst other things, and I was intrigued to find out what they thought, so I went to hear about it from Casey (Kenneth) Green, the Founding Director of CampusComputing.net. I haven’t managed to find the actual survey report, so what follows is a bit patchy, but in essence CIOs have ‘great faith in the benefits of e-learning’, but learning analytics keeps them up at night.

    Their top five priorities are:

    1. hiring and retaining skilled staff;
    2. assisting academics with e-learning;
    3. the network and data security;
    4. providing adequate user support;
    5. leveraging IT resources to support student success.

    The trouble with learning analytics:

    CIOs are consistently bothered about their institutions’ ability to deliver learning analytics capabilities and cited concerns with:

    • the infrastructure to deliver them;
    • the effectiveness of investment to date;
    • satisfaction with what has been delivered.

    There was a general sense that their ‘reach exceeded their grasp’ in this area.

    What we do vs what we buy:

    An interesting observation was that CIOs rated bought-in or outsourced services and facilities more highly than those developed in house: ‘what we buy works better than what we do’. Which is perhaps unsurprising, but rather depressing. The service that CIOs were happiest about was wifi!

    If I manage to get a link to the report or presentation I will link to it here.

    What (American) Students Want

    By Fiona Strawbridge, on 30 October 2016

    ECAR infographic: https://library.educause.edu/resources/2016/6/~/media/files/library/2016/10/eig1605.pdf

    One motivation for enduring the jet lag and culture shock of a long haul conference is the chance to find out what the big issues are in a different HE environment; Educause is a very good opportunity to do that as it reports on a number of surveys in the world’s largest higher education sector.

    So, at this year’s Educause in LA, I went to sessions reporting the results of two very different surveys. One – the ECAR (Educause Center for Analysis & Research) Student Survey – asks students themselves about their attitudes to, experiences of and preferences for using technology in HE – a bit like a tech-focused NSS. The second – CampusComputing.net – surveys IT Directors’ views on e-learning; this seemed, to me, to be a rather odd perspective (why ask CIOs and not heads of e-learning who are closer to the area?).  This post looks at the ECAR student view. To find out what the directors want I’ve written a more sketchy post…

    The student survey was completed by a staggering 71,641 students from 183 institutions across 37 US states and 12 countries. The survey is a good benchmarking tool for participating institutions, which are able to compare their results against those from other institutions. D. Christopher Brooks and Jeffrey Pomerantz from Educause presented a whistle-stop tour of the main findings. The full report is at the survey hub, and the infographic shown on the right is a nice summary. There weren’t too many surprises; in a nutshell, students own a lot of devices, and they view them as very important for their learning.

    Their devices

    In terms of devices, 93% own laptops and a further 3% plan to purchase one, and almost all say laptops are very or extremely important for their studies. 96% own smartphones. Tablet ownership is much lower at 57%, and students rated tablets as less important to their studies than their smartphones. 61% of students have two or three devices, and 33% own four or more. Challenging for wifi, as we know…

    Techiness

    ECAR looked at ‘techiness’ as measured by students’

    1. disposition to technology (sceptic vs cheerleader, technophobe vs technophile etc.);
    2. attitude to technology (distraction vs enhancement, discontented vs contented etc.); and
    3. actual usage of technology (peripheral vs central, never vs always connected etc.).

    Since 2014 all three measures have increased – so students are more techie now, and men are more techie than women. As I said, no great surprises.

    Students’ experiences of technology

    We were told that there was good news about students’ experiences of technology – 80% rated their overall technology experience as good or excellent. Now, it strikes me that if our score for question 17 in the National Student Survey, which asks about technology, had been this low (we score 87%), we’d be very seriously concerned – but of course the questions are different, so a direct comparison isn’t valid. A good question is what is actually meant by “students’ experience of technology”. We were told that the main determinants were wifi in halls of residence and on campus, ease of login, having skilled academics and students’ own attitudes to technology; it also helped if technology used in class was perceived as relevant to their career.

    Technology in teaching

    Around 69% of students said that their teachers had adequate technical skills. More than half reported that technology was being used to share materials (61%) and collaborate (57%). There was less use that encouraged critical thinking (49%), and only 34% of students said they were encouraged to use their own technology in the classroom.

    82% of students reported preferring a blended learning environment over a fully online or fully offline one. Since 2013, the percentage of students who don’t want any online education has halved from around 22% to 11%. The number wanting a fully online experience has dropped slightly, but the number wanting a ‘nearly fully online’ experience has increased; the number wanting a more traditionally blended approach is stable at around 60%. Those with previous experience of fully online courses are more likely to want a more fully online experience, and women were more likely than men to want to learn online – it was suggested this was due to a reluctance to speak up in a face-to-face environment.

    Students found technology helped them with engagement with academics, with one another, and with content. There were some other interesting demographic effects. Women, first generation students, and non-white students were more likely to say that technology had a positive impact on the efficacy of their learning – it empowered them; it was helpful for communication, for helping them with basic terminology, and for getting swift feedback from others. It was found to enrich the learning experience in many ways.

    And finally, students want more:

    • Lecture capture – this mirrors experience at UCL
    • Free, supplemental online content
    • Search tools to find references – this has digital literacy implications, since such tools already exist; perhaps students are simply unaware of them.

    But, I guess, not more engaging or challenging online learning experiences. Ah well…

    Introducing the new E-Learning Baseline

    By Jessica Gramp, on 7 June 2016

    The UCL E-Learning Baseline is now available as a printable colour booklet. This can be downloaded from the UCL E-Learning Baseline wiki page: http://bit.ly/UCLELearningBaseline

    The 2016 version is a product of merging the UCL Moodle Baseline with the Student Minimum Entitlement to On-Line Support from the Institute of Education.

    The Digital Education Advisory team will be distributing printed copies to E-Learning Champions and Teaching Administrators for use in departments.

    Please could you also distribute this to your own networks to help us communicate the new guidelines to all staff.

    Support is available to help staff apply this to their Moodle course templates via digi-ed@ucl.ac.uk.

    We are also working on a number of ideas to help people understand the Baseline (via a myth-busting quiz) and a way for people to show their courses are Baseline (or Baseline+) compliant by way of a colleague-endorsed badge.

    See ‘What’s new?’ to quickly see what has changed since the 2013 Baseline.