
Digital Education team blog


Ideas and reflections from UCL's Digital Education team


Archive for the 'Accessibility' Category

Introducing the new E-Learning Baseline

By Jessica Gramp, on 7 June 2016

The UCL E-Learning Baseline is now available as a printable colour booklet. This can be downloaded from the UCL E-Learning Baseline wiki page: http://bit.ly/UCLELearningBaseline

The 2016 version is the product of merging the UCL Moodle Baseline with the Institute of Education's Student Minimum Entitlement to On-Line Support.

The Digital Education Advisory team will be distributing printed copies to E-Learning Champions and Teaching Administrators for use in departments.

Please could you also distribute this to your own networks to help us communicate the new guidelines to all staff.

Support is available to help staff apply this to their Moodle course templates via digi-ed@ucl.ac.uk.

We are also working on a number of ideas to help people understand the Baseline (via a myth-busting quiz) and a way for people to show their courses are Baseline (or Baseline+) compliant by way of a colleague-endorsed badge.

See ‘What’s new?’ for a quick overview of what has changed since the 2013 Baseline.

 

Meet Jess, Jack, Stuart & Heather – realistic voices for free* download

By Jessica Gramp, on 3 March 2015

I have recently started listening to my books and papers, rather than reading them. This frees me up to do other things while I listen, such as cook, take a bath or do some tidying up. It also gives my eyes a much-needed break from staring at a computer screen or paper.

As part of an online e-learning course I am helping to develop, I am using the TechDis Jess voice to provide audio files of the commentary as an alternative to reading. I have had to tweak some of the text – for example, UCL needs to be written with spaces between each letter in order for Jess to pronounce each letter individually, and I needed to add a hyphen to CMALT (C-MALT) for it to be pronounced correctly. But for the most part I can leave the text much as it is typed. I then run it through Balabolka, a free, open-source program, to produce an audio file that participants on the course can download and listen to.
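Tweaks like these can be scripted rather than applied by hand each time. A minimal sketch of the idea (the substitution table and function name are illustrative, not part of Balabolka or the TechDis voices):

```python
import re

# Illustrative substitution table: words the voice mispronounces, mapped
# to spellings that force the intended pronunciation.
PRONUNCIATION_FIXES = {
    "UCL": "U C L",      # spaces make the voice read each letter separately
    "CMALT": "C-MALT",   # the hyphen splits the "C" from "MALT"
}

def prepare_for_tts(text: str) -> str:
    """Rewrite known problem words so a TTS voice reads them correctly."""
    for word, spoken in PRONUNCIATION_FIXES.items():
        # \b word boundaries avoid touching words that merely contain the acronym
        text = re.sub(rf"\b{re.escape(word)}\b", spoken, text)
    return text

print(prepare_for_tts("Welcome to UCL's CMALT course."))
# → Welcome to U C L's C-MALT course.
```

The preprocessed text can then be pasted into Balabolka (or any TTS tool) as usual, keeping the original document unchanged for readers.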

TechDis Jess and other UK voices (including Scottish and Welsh options) are available from www.heacademy.ac.uk/jisc-techdis-voices.

Balabolka is available from: www.cross-plus-a.com/balabolka.htm.
Listen to a sample:

Listen on SoundCloud…

*Staff and learners at England’s HE and FE institutions can download the voices free of charge, and those at Scottish and Welsh institutions can download the local voices.

MyPortfolio upgrade on 12.08.14

By Domi C Sinclair, on 5 August 2014

MyPortfolio will be unavailable on 12 August 2014 from 8 AM to 10 AM whilst we carry out a routine upgrade.

On 12 August 2014 we will upgrade MyPortfolio to version 1.9.2. There are many benefits to this upgrade, including improved accessibility, support for Creative Commons 4.0 and sorting of files within a folder.

Improved accessibility – W3C WCAG 2.0 level AA

Creative Commons 4.0 licence support – the new generation of CC licences offers improved global protection for your work; read more on their website.

Sorting of files within a folder – when you include a folder block on a page you can now choose how to sort the files, in either ascending or descending order.

If you have any questions or concerns about the upgrade, please email ele@ucl.ac.uk and we will be happy to help.

All times are UK times (GMT or BST); for other locations, please convert: http://www.timeanddate.com/worldclock/converter.html

The first Electronic Bluebook exam runs at UCL Qatar

By Jessica Gramp, on 15 May 2014

The first electronic examination utilising the Electronic Bluebook secure software program ran yesterday at UCL Qatar.

The three-hour exam was taken by the class of nine students, all of whom chose to complete their exam electronically using the computer, rather than writing their answers by hand. Students were provided with instructions beforehand, attended a short briefing explaining how the system worked, and were able to ask questions about the electronic format immediately before the start of the exam.


The software was launched in “blocked” mode, meaning no other software could be launched for the duration of the exam, apart from the secure examination system itself. That means students could not access the Internet, any files on the computer, or other programs like the Windows calculator.


Staff launched the program (which required Windows administrator access) and chose the BLOCKED mode; students then entered their candidate numbers, chose their module from the drop-down list and selected the number of questions they were answering.

Where possible, students were seated with a spare computer between them and the next candidate. Partitions sat between each desk, and staff confirmed in earlier tests that text could not be read by neighbouring students, even when the text was zoomed to the maximum size.

Once the exam questions were handed out, students were permitted to turn over and read the questions, then start the examination software by clicking Start Exam and confirming the number of questions they were to answer.

Students were asked to type around 3,000 words during the exam, answering 3 questions chosen by each student from a total of 9. The word count for each tab was visible in the left-hand column of the software to help students manage the time they spent on each question.

A large electronic countdown timer was displayed on monitors in 3 corners of the room, and students were asked to disregard the timer shown in the software, since each student’s timer differed slightly depending on when they stopped reading the questions and clicked the [Start Exam] button.

Remaining time was announced verbally with one hour and then half an hour to go (at which point students were no longer permitted to leave the room), and these times were also written on a flip chart for students to reference.

The interface of the software is similar to that of a simple text editor like Windows Notepad. Students can cut and paste (but not copy) to move elements of their text around. The software saves each student’s work every minute, both locally and to the exam server on the network. This means that in the event of a computer failure the exam can be retrieved either locally or from the server, and the student can continue with the exam, with additional time granted to compensate for the disruption.
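The save-twice behaviour described above follows a common resilience pattern: write locally first, then mirror to the network, so a network fault never costs the student their latest work. A minimal sketch of the pattern (file paths and function are illustrative, not Electronic Bluebook's actual implementation):

```python
import shutil
from pathlib import Path

# Illustrative locations; the real software uses its own paths and encrypts the file.
LOCAL_COPY = Path("exam_script.local.txt")
SERVER_COPY = Path("//exam-server/scripts/exam_script.txt")

def autosave(answer_text: str,
             local: Path = LOCAL_COPY,
             server: Path = SERVER_COPY) -> None:
    """Save the student's work locally first, then copy it to the network share.

    Writing the local copy first means a server or network failure can
    never lose the most recent minute of work.
    """
    local.write_text(answer_text, encoding="utf-8")
    try:
        shutil.copyfile(local, server)
    except OSError:
        # Network copy failed; the local copy still holds the work and the
        # next autosave (a minute later) will retry the upload.
        pass
```

A scheduler in the exam client would call `autosave` once a minute; recovery after a crash is then a matter of reading back whichever copy is newest.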

In the event of a power failure, or another event preventing the entire cohort from completing the exam electronically, students would have been permitted to continue by writing on paper, with special consideration granted for their not being able to reference their work up to that point. Staff would have attempted to print the electronic part of each student’s answers and provide these to them as quickly as possible, to reference and modify by hand.

Students were asked to indicate the question they were about to answer at the top of each tab. Each question was answered on a new tab. Students were provided a notepad and pen for taking handwritten notes during the exam and were also permitted to write on the question sheet. All of these materials remained in the room after the exam. The exam adhered as closely as possible to traditional handwritten examination procedures as per UCL’s e-examination guidance.

One student asked whether they could write notes electronically within the software, and was advised that any notes should be deleted before submitting, as everything remaining within the exam script would be considered part of their answer and marked accordingly.

During the exam students had some questions about how to begin their exam, how to reference the question they were answering within each tab and how to save and move to the next question, which were all quietly answered by the invigilators. One student was unfamiliar with the UK keyboard layout and needed help locating the quotation marks.

Some of the benefits expected by the staff marking the exam scripts included better legibility of answers, compared to handwritten exams, and less strain on students’ hands given they are no longer used to writing for extended periods. Another expected benefit was the possibility of being able to move text around within each answer.

After the exam, the majority of students reported that they liked being able to type their answers, and that this was faster and more effective than handwriting. Two students said they would prefer to handwrite the exam in future, as they felt they were faster at handwriting and could therefore write more in the time, and one student was unsure which she preferred. A survey will be used to gather further feedback from these students.

As an observer I noted the noise in the room of tapping keyboards was less distracting than I expected, although there were a small number of students in this case. Some students hand wrote notes and others opted to type directly into the software. I observed many of the students taking advantage of the ability to readily edit their previous writing. Most students appeared to type their answers out directly as they would likely do in a handwritten exam, although I noted one student who appeared to plan the answer in summary form on the computer first and then filled in the details later.

At the end of the exam, students were asked to stop typing, click the FINISH EXAM button and confirm. The system was then expected to send their encrypted responses to the exam server, with each screen turning bright green to indicate the exam had been successfully submitted. Although extensive testing was undertaken prior to the exam, an as-yet-undiagnosed network failure on the day prevented the exam scripts from automatically submitting to the exam server. Despite this issue, the software failed gracefully, providing a descriptive error message (on a blue screen) and explaining the next steps. Hitting the [CANCEL] button automatically launched the default web browser, which loaded a page describing how to locate the encrypted exam script file. The file could then be manually selected and uploaded to the examination server. As a precaution, each encrypted exam script was also saved to a USB stick.

The exam scripts were then retrieved by the technical team, decrypted using the Electronic Bluebook decryption tool, and printed for marking.

Overall I was impressed with the simple interface, the customer support we received and the relatively straightforward technical implementation of this secure exam system.

The full-size screen, the ability to change background colours for visually impaired and dyslexic students (via built-in Windows accessibility tools) and the ability to zoom text to make it larger mean the software is accessible to the majority of students. However, further tests will need to be carried out to see whether it is compatible with screen readers and other enabling technologies.

Small improvements that I would like to see in future versions include: an option to turn off the system’s timer; having the number of questions pre-populated for each exam (perhaps with the option to override it); the ability to allow particular, pre-defined programs to run (e.g. the Windows calculator, or Excel); and a drop-down option on each question tab, so a student can indicate which question they are answering (e.g. tab 1 answering question 7).

Overall I think this tool is intuitive, accessible and simple enough to be used effectively by students to complete essay style examinations electronically and I was especially impressed by the graceful way it failed when it encountered network connectivity issues.

New UCL Moodle baseline

By Jessica Gramp, on 12 November 2013

The UCL Moodle Baseline, approved by Academic Committee in June 2009, has now been updated after wide consultation on current best practice at UCL. The aim of the Baseline is to provide guidelines for staff to follow when developing Moodle courses, so that UCL students have a consistently good e-learning experience. The guidelines are intended to be advisory rather than prescriptive or restrictive, and these recommendations may be covered within a combination of module, programme and departmental courses.

Changes include the addition of a course usage statement explaining how students are expected to use their Moodle course. A communications statement is also now a requirement, in order to explain to students how they are expected to communicate with staff, and how often they can expect staff to respond. It is now a recommendation for staff to add (and encourage their students to add) a profile photograph or unique image, to make it easier to identify contributors in forums and other learning activities.

New guidelines for including assessment detail and Turnitin guidance have been added for those who use these technologies.

See the new UCL Moodle Baseline v2

Find out more about this and other e-learning news in the monthly UCL E-Learning Champions’ Newsletter.

Jots from BETT 2013

By Mira Vogel, on 11 February 2013

BETT is a gargantuan annual learning technology trade show.

With digital literacies in mind I spent a few minutes at the Lynda stand – Lynda is a library of short, focused courses for different technologies and practices. Did they offer courses for open source applications? GIMP as well as Photoshop? Yes. OpenOffice as well as Microsoft Office? Yes. Audacity? Yes – and these are just a sample. Nice one.

Avoiding lock-in is a tic of mine, to do with fears of obsolescence and conservatism. Adaptable, generative technologies have the most appeal for me – smartboards that work as ordinary whiteboards and projection surfaces, voting handsets which allow natural language. Narrower business models are prone to fail or be overtaken. For example, I spent an interesting ten minutes looking at some audio note-taking technology aimed at students with dyslexia and/or English as a second or other language. It was explained to me as primarily designed to work with PowerPoint. It responds to natural pauses in speech to visualise talk as strips which a student can attach to a lecturer’s slides, adding colour codes for future reference. The audio annotation looked very helpful, but I wondered about the orientation to slideware – without a concept map, slides often fail as representations of complicated subjects (an important capability of concept-mapping presentation software like Prezi or Sozi). Along with seemingly taking slideware for granted, this software anticipates the kind of didactic real-time lectures which may be prevalent now but are increasingly challenged by lecture-flipping pioneers – it depends on one person speaking to a silent audience and would struggle to handle the ambient hubbub of several discussion groups. The niche for this technology is shrinking and I wondered if it could easily pivot to realign itself.

That said, the lecture remains a feature of higher education institutions. Depending on how it’s conceived, it can be an event where students are required to turn up together in person – but then listen in isolation to a presentation where their intervention and contribution, if invited at all, requires unusual levels of self-confidence. Alternatively, a large in-person group can be an opportunity for contact and exchange. While there are plenty of low-tech opportunities for the latter, there are huge benefits to involving students’ own devices. Examples include persuading reticent or self-conscious students to ask questions or contribute ideas, and electronic voting which allows on the fly visualisation and tutor response. I think it’s worth trying to incorporate the wealth of technologies students own, on the basis that students actually have them on their person, they look after them carefully, and can operate, bend and modify them as the inspiration takes them.

At the same time, students (and staff) who do not own technologies should not be disadvantaged. It’s true that resistance on students’ part to institutional exploitation of their personal technologies is strong and understandable – but it may be worth investigating this desire for separation. If there’s a perception that an institution is imposing a hidden cost, might this be assuaged by offering a plentiful power supply, for example, and avoiding appropriating students’ free SMS allowances? User groups exploring some of the less obvious applications, efficiencies and other benefits of different smartphone models, helping each other upgrade and so on, would be another avenue. And thinking about wear and tear, one stand at BETT – the only one of its kind, which is pretty telling – was called MendIT. If institutions could offer students coverage for speedy repairs, a good deal on replacement batteries, memory upgrades and so on for the duration of their course, the arrangement would be more reciprocal.

Due to meetings back at work I think I missed most of the best presentations from BETT’s Higher Education Conference and LiveLearn. I was sorry to have to duck out of Sarah Sherman’s talk on the celebrated cooperation between the e-learning specialists in the smaller specialist University of London colleges which comprise the Bloomsbury Learning Environment, who on their own would be spread unbearably thin. There was a utopian 25 minutes from IBM about the benefits of wrap-around student monitoring, which prompted the chair, Claire Bolderson, to raise the question of ethics. Relatedly, there was one I was particularly glad to have caught – Simon Buckingham-Shum (the Open University’s Knowledge Media Institute) on Learning analytics: unlocking student data for 21st century learning? Learning analytics is predictive modelling applied to student data (e.g. on assessment, attendance and social integration, drawn from online learning environments, student record systems and other institutional services) to flag students who “may not be performing to the best of their abilities”. The futures of Coursera and EdX are predicated on turning these kinds of data into patterns for course and activity design, and for this reason I’d say that a turn to analytics to appraise and reform institutional practices, not least teaching, is in the offing. The implications are quite momentous, so it’s worth taking notice and involving staff and students from the start.