Digital Education team blog
  • We support Staff and Students using technology to enhance education at UCL.

    Here you'll find updates on institutional developments, projects we're involved in, updates on educational technology, events, case studies and personal experiences (or views!).

  • Subscribe to the Digital Education blog


    Jisc student digital tracker 2017 and BLE consortium

    By Moira Wright, on 10 August 2017

    UCL participated in the 2017 Jisc Digital Student Tracker Survey as part of a consortium with the Bloomsbury Learning Environment (BLE), made up of SOAS, Birkbeck, LSHTM and RVC. 74 UK institutions ran the tracker, collecting 22,593 student responses, while 10 international universities collected an additional 5,000 student responses.

    We were the only consortium to participate. The Colleges came together because institutional surveys, such as the National Student Survey, left only a month to run the tracker, and we felt our individual sample sizes would be too small on their own. We treated the survey as a pilot, advertising a link to it on each College’s Moodle landing page alongside some promotion via social media and the Students’ Unions. The survey generated 330 responses, which, given our constraints, was far more than we expected.

    The survey comprises several broad areas, including digital access, digital support and digital learning. Most questions produced quantitative data, but there were four open questions, which produced qualitative data. We were also able to add two questions to the survey: one on e-assessment, since that was a previous shared enhancement project (see www.bloomsbury.ac.uk/assessment), and one on Moodle, since all members of the consortium use the platform as their Virtual Learning Environment (VLE).

    Once the survey closed and we had access to the benchmarking report, we ran a workshop in July 2017 for representatives from each of the Colleges, at which responses to the survey’s open questions were analysed in institutional groups. This prompted interesting discussions about commonalities and potential implications.

    Sarah Sherman, the BLE Manager, and I have been working on a report that examines our collective responses in comparison with the national survey population, with a recommendation that individual Colleges analyse their own results in more detail. For confidentiality, each College will receive a version of the document containing only the data for their institution, not the complete BLE data set. A disadvantage of the consortium approach was that we could not benchmark individual Colleges against the survey population, as resources did not allow for this. In future, the participating Colleges may wish to run the survey individually rather than as part of a collective, as deep analysis was not possible with this data set.


    Although the sample collected by the Bloomsbury Colleges was small and not statistically robust, there is much we can extract and learn from this exercise. For the most part, our collective responses fell within the margins set by the national survey population, suggesting we are all at a similar phase in our students’ digital capability and development.

    You will have to wait for the full report for more information on the UCL data collected, but to whet the appetite you can see the key findings from Jisc in this two-page report: Student digital experience tracker at a glance.

    Finally, you can see this collection of case studies, which features the Bloomsbury Colleges consortium, here.

    Please get in touch with me if you would like to get involved (moira.wright @ ucl.ac.uk)

    Sarah Sherman and Moira Wright

    Jisc/NUS student digital experience benchmarking tool

    Jisc guide to enhancing the digital student experience: a strategic approach

     

    Lynda.com Tips and Tricks webinar

    By Caroline Norris, on 1 August 2017

    Sara from LinkedIn Learning will be hosting two webinars for UCL over the summer. These sessions are aimed at UCL staff who are involved in promoting Lynda.com to others and who want to gain a better understanding of its key features. Sara will also share some tips on how to engage learners and maintain their interest.

    Places on these webinars are strictly limited. If you are interested in joining, please use the links below. Note that the content will be the same for both sessions:

    15th August 12 p.m. – 1 p.m.

    13th September 11 a.m. – 12 p.m.


    Turnitin assignments not resetting properly – issue now resolved!

    By Janice K M Kiugu, on 31 July 2017

    At the start of the academic year and in preparation for the next cohort of students on a course, staff are required to ‘Reset’ their Moodle courses. This removes students’ work and grades but leaves course resources and activities in place.

    An issue affecting Turnitin assignments when a Moodle course was ‘Reset’ was identified at the end of July. This has now been resolved, and staff can now reset their Moodle courses. HOWEVER, the process of resetting courses has changed slightly, so please read through the guidance provided via this link carefully, paying particular attention to step 4 of the process.

    If you reset a course containing Turnitin assignments before 11th August, the Digital Education team suggests resetting it again to ensure the issue described below does not occur.

    *********************************************************************

    An issue has been identified that is affecting Turnitin assignments when a Moodle course is ‘Reset’.

    At the start of the academic year and in preparation for the next cohort of students on a course, staff are required to ‘Reset’ their Moodle courses. This removes students’ work and grades but leaves course resources and activities in place.

    Issue

    The ‘Reset’ function in Moodle normally creates a new class ID for a Turnitin assignment, and staff should then be able to edit the assignment settings accordingly. The issue that has been identified is that resetting the course seems to ‘lock’ the anonymous marking setting to ‘Yes’, making it uneditable. However, even if the Post date is edited and no submission has been made, students’ names are visible.

    We have reported the issue to Turnitin; they have acknowledged that there is a problem and indicated that they hope to have a solution we can implement by the end of the week.

    Action Required (temporary workaround)

    We recommend, where feasible, that you wait until we have a fix in place and refrain from ‘resetting’ your course until we advise otherwise.

    Staff who have reset their courses in preparation for the next cohort of students or any staff planning to reset their courses before the issue is resolved should take the following steps:

    After resetting your Moodle course:

    • Delete the Turnitin assignment(s) that currently exist
    • Create new Turnitin assignment(s) with the required settings

      Guidance and instructions on creating Turnitin assignments are available from our Moodle Resource Centre: https://wiki.ucl.ac.uk/display/MoodleResourceCentre/M20+-+Turnitin+Assignment

    We apologise for the inconvenience caused and will advise when the issue has been resolved.

    If you have any questions or concerns, please contact the Digital Education team by emailing digi-ed@ucl.ac.uk.

     

    Assessment in Higher Education conference, an account

    By Mira Vogel, on 25 July 2017

    Assessment in Higher Education is a biennial conference which this year was held in Manchester on June 28th and 29th. It is attended by a mix of educators, researchers and educational developers, along with a small number of people with a specific digital education remit of one kind or another (hello Tim Hunt). Here is a summary, organised around the speakers, so there are some counter-currents. The abstracts are linked from each paragraph, and for more conversation see the Twitter hashtag.

    Jill Barber presented on adaptive comparative judgement (ACJ): assessment by comparing algorithmically-generated pairs of submissions until saturation is reached. This is found to be easier than judging on a scale, allows peer assessment, and its reliability bears up favourably against expert judgement. I can throw in a link to a fairly recent presentation on ACJ by Richard Kimbell (Goldsmiths), including a useful Q&A which considers extrapolating grades, finding grade boundaries and giving feedback. The question of whether it helps students understand the criteria is an interesting one. At UCL we could deploy this for formative, but not credit-bearing, assessment – here’s a platform which I think is still free. Jill helpfully made a demonstration of the platform she used available – username: PharmEd19, password: Pharmacy17.
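    The core mechanic of comparative judgement can be sketched in a few lines of Python. This is a hypothetical illustration only, not the platform Jill used: real ACJ tools pair submissions adaptively and fit a Bradley-Terry-style model to the judgements, whereas this sketch pairs randomly and ranks by win rate, and the judge function, essay names and hidden qualities are invented.

```python
import random
from collections import defaultdict

def rank_by_comparative_judgement(submissions, judge, rounds=100, seed=0):
    """Rank submissions from repeated pairwise judgements.

    judge(a, b) returns whichever of the pair the judge prefers.
    Scores items by win rate; real ACJ software would instead fit
    a Bradley-Terry-style model and choose informative pairings.
    """
    rng = random.Random(seed)
    wins = defaultdict(int)   # times each submission was preferred
    seen = defaultdict(int)   # times each submission was shown
    for _ in range(rounds):
        a, b = rng.sample(submissions, 2)
        wins[judge(a, b)] += 1
        seen[a] += 1
        seen[b] += 1
    # Sort best-first by win rate, guarding against unseen items.
    return sorted(submissions,
                  key=lambda s: wins[s] / max(seen[s], 1),
                  reverse=True)

# Toy usage: judges always prefer the submission with higher hidden quality.
quality = {"essay_A": 3, "essay_B": 9, "essay_C": 6}
ranking = rank_by_comparative_judgement(
    list(quality),
    judge=lambda a, b: max(a, b, key=quality.get),
    rounds=200,
)
print(ranking)  # ['essay_B', 'essay_C', 'essay_A']
```

    With enough comparisons the win-rate ordering recovers the underlying quality ordering, which is essentially the "saturation" point ACJ aims for.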

    Paul Collins presented on assessing a student-group-authored wiki textbook using Moodle wiki. His assessment design anticipated many pitfalls of wiki work, such as the tendency to fall back on task specialisation, leading to cooperation rather than collaboration (where members influence each other – and he explained at length why collaboration was desirable in his context), and reluctance to edit others’ work (which leads to additions that are not woven in). His evaluation asked many interesting questions, which you can read more about in this paper to last year’s International Conference on Engaging Pedagogy. He learned that delegating induction entirely to a learning technologist led students to approach her with queries, meaning the responses took on a learning-technology perspective rather than a subject-oriented one. She also encouraged students to keep a word-processed copy, which led them to draft in Word and paste into the Moodle wiki, losing much of the drafting process that the wiki history could have revealed. He recommends letting students know whether you are more interested in the product, the process, or both.

    Jan McArthur began her keynote presentation (for slides see the AHE site) on assessment for social justice by arguing that SMART (specific, measurable, agreed-on, realistic, and time-bound) objectives in assessment overlook precisely the kinds of knowledge which are ‘higher’ – that is, reached through inquiry; dynamic, contested or not easily known. She cautioned about over-confidence in rubrics and other procedures. In particular she criticised Turnitin, calling it an “instrumentalisation/industrialisation of a pedagogic relationship” which could lead students to change something they were happy with because “Turnitin wasn’t happy with it”, and calling its support for academic writing “a mirage”. I don’t like Turnitin, but felt it was mischaracterised here. I wanted to point out that Turnitin has pivoted away from ‘plagiarism detection’ in recent years, to the extent that it is barely mentioned in the promotional material. The problems arise where it is deployed for policing plagiarism – it doesn’t work well for that. Meanwhile its Feedback Studio is often appreciated by students, especially where assessors give feedback specific to their own work, and comments which link to the assessment criteria. In this respect it has developed in parallel with Moodle Assignment.

    Paul Orsmond and Stephen Merry summarised the past 40 years of peer assessment research as ’80s focus on reliability and validity, ’90s focus on the nature of the learning, and a more recent focus on the inseparability of identity development and learning – a socio-cultural approach. Here they discussed their interview research, excerpting quotations and interpreting them with reference to peer assessment research. There were so many ideas in the presentation I am currently awaiting their speaker notes.

    David Boud presented his and Philip Dawson’s work on developing students’ evaluative judgement. Their premise is that the world runs on evaluative judgement, and that understanding ‘good’ is a precondition for producing ‘good’, so it follows that assessment should be oriented to informing students’ judgements rather than “making unilateral decisions about students”. They identified two aspects of this approach – calibrating quality through exemplars, and using criteria to give feedback – and urged more use of self-assessment, especially for high-stakes work. They also urged starting early, cautioning against waiting until “students know more”.

    Teresa McConlogue, Clare Goudy and Helen Matthews presented on UCL’s review of assessment in a research intensive university. Large, collegiate, multidisciplinary institutions tend to have very diverse data corresponding to diverse practices, so reviewing is a dual challenge of finding out what is going on and designing interventions to bring about improvements. Over-assessment is widespread, and often students have to undertake the same form of assessment. The principles of the review included focusing on structural factors and groups, rather than individuals, and aiming for flexible, workload-neutral interventions. The work will generate improved digital platforms, raised awareness of pedagogy of assessment design and feedback, and equitable management of workloads.

    David Boud presented his and others’ interim findings from a survey investigating effective feedback practices at Deakin and Monash. They discovered that halfway through a semester nearly 90% of students had not had an assessment activity, and 70% had received no staff feedback on their work before submitting – more were getting it from friends or peers. They also found scepticism about feedback: 17% of staff responded that they could not judge whether feedback improved students’ performance, while students tended to be less positive about feedback the closer they were to completion – this has implications for how feedback is given to more advanced undergraduate students. 80% of students recognised that feedback was effective when it changed them. Respondents perceived differences between individualised and personalised feedback. When this project makes its recommendations, they will be found on its website.

    Head of School of Physical Science at the OU Sally Jordan explained that for many in the assessment community, learning analytics is a dirty word: if you go in for analytics, why would you need separate assessment points? Yet analytics and assessment are likely to paint very different pictures – which is right? She suggested that, having taken a view of assessment as ‘of’, ‘for’ and ‘as’ learning, the assessment community might consider the imminent possibility of ‘learning as assessment’. This is already happening as ‘stealth assessment’ when students learn with adaptable games.

    Denise Whitelock gave the final keynote (slides on the AHE site) asking whether assessment technology is a sheep in wolf’s clothing. She surveyed a career working at the Open University on meaningful automated feedback which contributes to a growth mindset in students (rather than consolidating a fixed mindset). The LISC project aimed to give language learners feedback on sentence translation – immediacy is particularly important in language learning to avoid fossilisation of errors. Another project, Open Mentor, aimed to imbue automated feedback with emotional support, using Bales’ interaction process categories to code feedback comments. The SAFeSEA project generated Open Essayist, which aims to interpret the structure and content of draft essays; it identifies key words, phrases and sentences, identifies the summary, conclusion and discussion, and presents these to the author. If Open Essayist has misinterpreted the ideas in the essay, the onus is on the author to make amendments. How it would handle more avant-garde essay forms I am not sure – and this also recalls Sally Jordan’s question about how to resolve inevitable differences between machine and human judgement. The second part of the talk set out and gave examples of the qualities of feedback which contribute to a growth mindset.

    I presented Elodie Douarin’s and my work on enacting assessment principles with assessment technologies – a project to compare the feedback capabilities of Moodle Assignment and Turnitin Assignment for engaging students with assessment criteria.

    More blogging on the conference from Liz Austen, Richard Nelson, and a related webinar on feedback.

    Live Captioning and Video Subtitling Service Available

    By Michele Farmer, on 18 July 2017

    Live Captioning for Lectures and Events

    121 Captions is now available to use for these services – they have been added to MyFinance as a service provider.

    For live captioning the process is as follows:

    Book the number of hours you wish from the company – they will only charge you for these hours.

    You will need a laptop with a Skype account set up to connect to their system (the Skype-to-Skype call is free), along with a Skype mic (the Disability IT Support Analyst has one for loan if needed). The laptop should preferably be hardwired to our network via an Ethernet port – you may need to get a port patched for this, but ISD Network Services will be able to help; please book a job through the Service Desk. Client Platform Services can also help with setting this up; if needed, please give both services advance warning and log a job through the Service Desk to make sure they are available. In many cases a wireless internet connection will work, but it is best to run a test beforehand to make sure the signal is strong enough.

    The speaker will need to be informed of the setup so they do not wander too far away from the mic whilst giving their speech.

    The end user(s) will need a device (laptop, tablet) and the link (which the company will provide) to be able to view the captions.

    The end user’s screen can be modified (font, colour, etc) and will also have a chat window to feed back to the typist if there are any issues.

    Captioning screen before any font or colour changes.

    The image above shows an example of the basic layout before colour and font modifications.

    The company will also provide you with a transcript afterwards.

    They also have a new service called 1Fuzion, which allows captions and PowerPoint slides to be displayed on the same screen.

    Subtitle your videos for YouTube, Vimeo, staff / college / university intranet

    Upload your video to YouTube (using a private or unlisted setting if necessary). They will download the video directly, subtitle it, and email back a professional subtitle file (with colours, positioning, emphasis, sound effects, etc.). They charge only for the subtitling and email you the subtitle file, which you upload to YouTube in your Video Manager (it takes 30 seconds!). They can send instructions on how to upload to YouTube or Vimeo. Your video can also be embedded on an internal intranet site. The viewer can turn closed-caption subtitles on and off as needed, without any burning-in process.

    You would just need to tell them whether you need a closed caption file or a video file with open captions burnt in, whether you want them to use UK or US English spellings, or if you’d like a version with each.
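    The closed-caption file you upload is typically a timed-text format such as SRT or WebVTT (both accepted by YouTube). As a purely illustrative sketch – the timings and cue text here are invented – a minimal WebVTT file looks like this:

```
WEBVTT

00:00:01.000 --> 00:00:04.000
Welcome to today's lecture.

00:00:04.500 --> 00:00:08.000
[audience applause]
First, a quick overview of the schedule.
```

    Each cue pairs a start and end time with the text to display; the player shows the cues in sync with the video, and the viewer can toggle them on or off.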

    For an example of how it works in practice, take a look at the RSA shorts on YouTube, switching on the “English captions” using the icon third from right on the toolbar.

    They will email your file usually within 24 hours, Monday to Friday. Orders received after midday on Friday will be delivered by midday on Monday. Weekend delivery can often be arranged, so please contact them if you’re in a rush: bookings@121captions.com

    Closed Captions

    • Subtitles can be turned on and off, as the viewer requires.
    • Suitable for YouTube, Vimeo, staff intranet.
    • Text is searchable by major search engines.
    • Useful to provide viewers with an option to choose subtitles if they struggle to hear your soundtrack or want to watch with the sound off.
    • Least expensive option: They simply provide you with a professional-format timed subtitle file, which you upload to your video.
    • On YouTube, viewers can set the font and size of subtitles which they find easiest to read.
    Open Captions

    • Subtitles are burnt into your video and are permanently visible.
    • Suitable for all web video platforms.
    • Text is not searchable by Google or YouTube.
    • Useful to provide open access to your video for all, or if your sound isn’t the best quality.
    • More expensive option: as burning in subtitles involves additional production processes, including the creation and transfer of a new video file, there is an additional cost.
    • You have control over the font and size of the subtitles, which the viewer can’t change.

    Applying Universal Design for Learning (UDL) Principles to VLE design

    By Jessica Gramp, on 16 July 2017

    Universal Design for Learning (UDL) principles describe how educators can cater for students with differing needs, including those with disabilities (CAST 2011). UDL stems from the social model of disability, which places the problem within the environment, rather than with the individual who has the disability (Collins 2014).
    Technology enables the quick modification of learning materials to meet the specific needs of students (Pisha & Coyne 2001) and online communication can even hide a disability from others. For example, a deaf student who participates in an online discussion forum does not need to reveal they are deaf in order to communicate with peers. This can lower the social and communication barriers that may be experienced when communicating in person. Also, there are many modern technologies specifically developed to help people with disabilities engage with online environments. This means online learning environments are particularly well placed to address the goal of Universal Design for Learning. It is the responsibility of the institutions and developers who maintain these environments to ensure they can be accessed by all.
    While most of the UDL guidelines apply to curriculum design, some of them are relevant to the design of the broader virtual learning environment (VLE).

    UDL principles (CAST 2011) mapped to how a VLE might meet relevant checkpoints

    To learn more, click on one of the Guidelines in the boxes below.

    I. Provide Multiple Means of Representation

    • Perception
    • Language, expressions, and symbols
    • Comprehension

    II. Provide Multiple Means of Action and Expression

    • Physical action
    • Expression and communication
    • Executive function

    UDL Principle 1 aims to ‘provide multiple means of representation’ by ‘providing options for perception’, which includes ‘offer[ing] ways of customizing the display of information’ (CAST 2011). This means the VLE should offer the ability to do things like resize text and enable screen readers to read text aloud to those who have visual impairments or dyslexia.

    Within UDL Principle 2, guideline 4 aims to ‘provide options for physical action’, which includes ‘vary[ing] the methods for response and navigation’ (CAST 2011). This means ensuring all navigation and interaction can occur via a keyboard and via assistive technologies such as voice-activated software like Dragon NaturallySpeaking, which recognises speech and converts it to text.
    UDL Principle 3 seeks to ‘provide multiple means of engagement’ by ‘recruiting interest’, including enabling the learner to choose colours and layouts (CAST 2011). There are a number of tools that enable users to change the fonts and colours on a webpage and it is important these are able to be applied. The VLE should also offer the ability to customise the interface, in terms of re-ordering frequently accessed items, placement of menus and temporarily hiding extraneous information that may distract from the task at hand.
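    As a concrete illustration of why customisable colours matter, here is a hypothetical Python sketch of the WCAG 2.x contrast-ratio calculation that accessibility checkers and theme developers use to verify text/background colour pairs. The function names are invented, and this is a sketch of the published formula, not code from Moodle or any checker.

```python
def _channel(c8: int) -> float:
    """Linearise one 8-bit sRGB channel (WCAG 2.x formula)."""
    c = c8 / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb) -> float:
    """Relative luminance of an (r, g, b) colour, 0.0 (black) to 1.0 (white)."""
    r, g, b = (_channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg) -> float:
    """Contrast ratio between two colours; WCAG AA asks for >= 4.5 for body text."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white gives the maximum possible ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

    A theme that lets users swap colour schemes can run a check like this against every text/background pair, which is one way the ‘choose colours and layouts’ checkpoint translates into something testable.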
    These three principles and the specific checkpoints mentioned above are being addressed as part of the Accessible Moodle project, which aims to make UCL Moodle more accessible. The main ways these are being addressed are through the development of a more accessible Moodle theme, as well as the development of Moodle code itself. Although the project has limited ability to develop this code, suggestions for improvements are being raised with the Moodle development community via the Moodle Tracker. You can sign up and vote for accessibility enhancements to help these get prioritised, and therefore resolved more quickly, by Moodle HQ and other developers within the community.
    The remaining UDL principles are intended to guide the development of more accessible content and curriculum designs. These will therefore inform the Universal Design for Learning course being developed at UCL, which aims to help educators design accessible learning tasks, environments and materials.
     
    You can read more about the Accessible Moodle project on the UCL Digital Education blog.
     
    References
    CAST (2011). Universal Design for Learning Guidelines version 2.0. [online]. Available from: http://www.udlcenter.org/sites/udlcenter.org/files/UDL_Guidelines_Version_2.0_(Final)_3.doc [Accessed 16 July 2017].
    Collins, B. (2014). Universal design for learning: What occupational therapy can contribute? [Online]. Occupational Therapy Now, 16(6), 22-23. Available from: http://eprints.bournemouth.ac.uk/21426/1/Collins.pdf [Accessed 16 July 2017].
    Pisha, B. & Coyne, P. (2001) Smart From the Start: The Promise of Universal Design for Learning. Remedial and Special Education. [Online] 22 (4), 197–203. Available from: doi:10.1177/074193250102200402.