Digital Education team blog
  • We support staff and students using technology to enhance education at UCL.

    Here you'll find updates on institutional developments, projects we're involved in, updates on educational technology, events, case studies and personal experiences (or views!).


    Innovating Pedagogy 2016 report

    By Clive Young, on 2 December 2016

    Innovating Pedagogy 2016 is the fifth annual report from the Open University (this year in collaboration with the Learning Sciences Lab at the National Institute of Education, Singapore), highlighting new forms of teaching, learning and assessment with an aim to “guide educators and policy makers”.

    The report proposes ten innovations that are “already in currency but have not yet had a profound influence on education”. In other words they are at an early phase of the Gartner Hype Cycle. Whether any will become, in the current idiom, ‘normalised’ remains to be seen and some scepticism would be advised. However, as I noted when the 2015 version was published, such reports often frame the discussion around technology in education, even if initially only at the level of “buzz-word bingo” for enthusiasts.

    The current list, “in an approximate order of immediacy and timescale to widespread implementation”, is:

    • Learning through social media – Using social media to offer long-term learning opportunities
    • Productive failure – Drawing on experience to gain deeper understanding
    • Teachback – Learning by explaining what we have been taught
    • Design thinking – Applying design methods in order to solve problems
    • Learning from the crowd – Using the public as a source of knowledge and opinion
    • Learning through video games – Making learning fun, interactive and stimulating
    • Formative analytics – Developing analytics that help learners to reflect and improve
    • Learning for the future – Preparing students for work and life in an unpredictable future
    • Translanguaging – Enriching learning through the use of multiple languages
    • Blockchain for learning – Storing, validating and trading educational reputation

    A fascinating mix of familiar ideas and novel concepts, the report gives a quick overview of why each innovation may be important and includes handy links to further reading if you are interested.

    An exciting start of a new academic year!

    By Alan Y Seatwo, on 2 December 2016

    STEaPP ABC Workshop


    The Department of Science, Technology, Engineering and Public Policy (STEaPP) kicks off the new academic year with an ABC (Arena, Blended, Connected) workshop, one of the planned activities to enhance the use of learning technology across our teaching programmes.

    In the department, our teaching philosophy is that the use of technology in the learning experience must be driven by pedagogical considerations, and not the demands and availability of the various technologies themselves. To enhance the use of learning technologies, we must first reflect upon curriculum design.

    With support from the Digital Education team, teaching staff from the MPA programme attended the ABC (Arena Blended Connected) curriculum design workshop. We used a paper card-based approach, in the style of a storyboard, to help participants reflect on structure, modes of delivery, learning outcomes, assessment methods and more.

    Our colleagues loved the simplicity of the approach and the effectiveness of the workshop model. Although learning technologies were not explicitly ‘called out’, they were firmly embedded in all six common types of learning activity during the exercise: acquisition, inquiry, practice, production, discussion and collaboration.

    Following on from the workshop, Dr Ann Thorpe (the department’s E-learning Champion) and I set up a series of meetings with individual module leaders to further explore the use of learning technologies in their teaching programmes.

    We have been excited to see that consideration of how technologies can enhance learning is already embedded in their design processes. For example, producing videos as part of a ‘flipped classroom’ approach, streaming guest speakers to present to and engage with students in the classroom, and using audio feedback on assessments are some of the ideas currently in development following the workshop.

    While providing continuing support for the above-mentioned activities, we’re also scheduling some bespoke workshops throughout the academic year. Since the department leads the organisation of How to Change the World (HtCtW) (part of the UCL Global Citizenship Programme for undergraduate engineering students), we are interested in exploring new ways of presenting engineering ideas during HtCtW. Augmented Reality (AR) has been identified as one of the emerging learning technologies over the past few years, and the popularity of Pokemon Go helped persuade us to choose AR as our first lunchtime workshop topic. Watch this space for an update report soon.

    Alan Seatwo

    Learning Technologist, STEaPP

    UCL’s new HEFCE-funded curriculum enhancement project

    By Clive Young, on 1 December 2016

    Following our successful bid to the HEFCE Catalyst Fund, which aims to drive innovation in the higher education sector, Digital Education and CALT launch a new project today called UCL Action for Curriculum Enhancement (ACE).

    UCL ACE is one of 67 new HEFCE-funded projects which will develop and evaluate small-scale, experimental innovations with specific cohorts of learners and will run for a period of 18 months.

    The project links to our commitment in the UCL Education Strategy 2016-21 to the development and implementation of the Connected Curriculum and the ABC learning design process. It aims to develop and evaluate UCL’s innovative rapid-development approaches to blended curriculum design, which focus on a framework for research-based education (the Connected Curriculum), in order to make a curriculum development pack available to all HEIs interested in improving programme design and engaging students in research-based learning.

    The project will evaluate the impact of our ABC rapid-development approaches to programme development on student outcomes and experience via case studies, produce an online and downloadable pack which can be adapted and used by any higher education institution and establish a supportive community of practice around its implementation.  

    Across UCL, programmes of study are being re-designed and developed to engage students much more actively in enquiry-based learning, with the Connected Curriculum (CC) framework introduced to facilitate these changes. In parallel, we have seen growing use of digital resources and approaches to support new modes of study such as blended learning.

    UCL aims to ensure that educational intentions, outcomes, activities and assessments are aligned to form a cohesive, connected and effective learning experience for our students, and that programmes of study enable students to connect more effectively with researchers, with the workplace, with each other, and with local and wider communities.

    However, we recognise that planning rich and complex learning environments requires a structured, dialogic approach to effecting change in programme and module design. UCL has therefore piloted an integrated set of ‘light touch’ but focused learning design approaches, including workshops, CC guides, digital benchmarks and online support.

    One key component is ABC, our effective and engaging hands-on workshop, trialled with great success across a range of programmes. In just 90 minutes, using a game format, teams work together to create a visual ‘storyboard’ outlining the type and sequence of learning activities, and the assessment and feedback opportunities (both online and offline), required to meet the module’s learning outcomes. ABC is particularly useful for new programmes or those changing to an online or more blended format. The approach generates high levels of engagement, creative informed dialogue and group reflection about curriculum design, even among time-poor academics. It is a highly transferable methodology, already trialled at the Universities of Glasgow and Aarhus (DK), and there are versions in Spanish and Dutch following workshops run in Chile and Belgium.

    In addition, we are introducing workshops to enable programme leaders and teams to work with students to benchmark their programmes in line with the descriptors of the Connected Curriculum framework, using a published Guide.

    For this project, we aim to continue to deliver this range of dialogic workshops but track their effects and impacts carefully, using a combination of focus groups (with staff and with students), individual semi-structured interviews with key stakeholders, and analysis of programme-level and module-level metrics. We will use this focused analysis to develop a resource pack to enable these developmental activities to be scaled up, both with and beyond UCL.

    Clive Young (UCL Digital Education) will lead the project team, which will include ABC co-developer Natasa Perovic (UCL Digital Education) and CALT colleagues.

    HEFCE press release: HEFCE supports experimental innovation in learning and teaching

    6 top tips to help you build your Twitter following

    By Jessica Gramp, on 14 November 2016

    Last week, as part of the UCL Doctoral School’s Digital Identity and Scholarship course, Jessica Gramp from the Digital Education team ran a Tweet for a Week activity to help staff learn to use Twitter (see #ucldias). One of the questions asked by participants was how to build a strong Twitter following.

    Here are 6 top tips to help you build your Twitter following:

    1. Upload a picture and fill in your Twitter bio with a bit about yourself – a mix of professional and personal interests is usual. You might link these to hashtags.
    2. Follow those with interests similar to your own.
    3. Use hashtags to attract more followers.
    4. Link to your Twitter from your other networks, e.g. LinkedIn, Facebook, your email signature, business cards and websites.
    5. Tweet media, such as video and images.
    6. Track your most popular tweets using Twitter Analytics.

    See 10 ways to build a large, quality Twitter following…

     

    Have you got questions, ideas or experience here?

    If so, please do share them, either via the Twitter hashtag #elearningUCL or (for UCL staff and students) via the UCL Moodle Users forum.

    Comparing Moodle Assignment and Turnitin for assessment criteria and feedback

    By Mira Vogel, on 8 November 2016

    Elodie Douarin (Lecturer in Economics, UCL School of Slavonic and Eastern European Studies) and I have been comparing how assessment criteria can be presented to engage a large cohort of students with feedback in Moodle Assignment and Turnitin Assignment (report now available). We took a mixed methods approach using a questionnaire, a focus group and student screencasts recorded as they accessed their feedback and responded to our question prompts. Here are some of our key findings.

    Spoiler – we didn’t get a clear steer over which technology is (currently) better – they have different advantages. Students said Moodle seemed “better-made” (which I take to relate to theming issues rather than software architecture ones) while the tutor appreciated the expanded range of feedback available in Moodle 3.1.

    Assessment criteria

    • Students need an opportunity to discuss, and ideally practise with, the criteria in advance, so that they and the assessors can reach a shared view of the standards by which their work will be assessed.
    • Students need to know that criteria exist and be supported to use them. Moodle Assignment is good for making rubrics salient, whereas Turnitin requires students to know to click an icon.
    • Students need support to benchmark their own work to the criteria. Moodle or Turnitin rubrics allow assessors to indicate which levels students have achieved. Moreover, Moodle allows a summary comment for each criterion.
    • Since students doubt that assessors refer to the criteria during marking, it is important to make the educational case for criteria (i.e. beyond grading) as a way of reaching a shared understanding about standards, for giving and receiving feedback, and for self/peer assessment.

    Feedback

    • The feedback comments most valued by students explain the issue, make links with the assessment criteria, and include advice about what students should do next.
    • Digital feedback is legible and easily accessible from any web-connected device.
    • Every mode of feedback should be conspicuously communicated to students, along with suggestions on how to cross-reference the different modes. Some thought should be given to ways of facilitating access to, and interpretation of, all the elements of feedback provided.
    • Students need to know that digital feedback exists and how to access it. A slideshow of screenshots would allow tutors to hide and unhide slides depending on which feedback aspects they are using.

    Effort

    • The more feedback is dispersed between different modes, the more effortful it is for students to relate it to their own work and thinking. Where more than one mode is used, there is a need to distinguish between the purpose and content of each kind of feedback, signpost their relationships, and communicate this to students. Turnitin offers some support for cross-referencing between bubble comments and criteria.
    • It would be possible to ask students to indicate on their work which mode (out of a choice of possibilities) they would like assessors to use.
    • The submission of formative work produced with minimal effort may impose a disproportionate burden on markers, who are likely to be commenting on mistakes that students could easily have corrected themselves. Shorter formative assessments, group work, and clearer statements of the benefits of submitting formative work may all help limit the incidence of low-effort submissions.
    • If individual summary comments have a lot in common, consider releasing them as general feedback for the cohort, spending the saved time on more student-specific comments instead. However, this needs to be signposted clearly to help students cross-reference with their individual feedback.
    • As a group, teaching teams can organise a hands-on session with Digital Education to explore Moodle Assignment and Turnitin from the perspectives of students, markers and administrators. This exposure will help immeasurably with designing efficient, considerate processes and workflows.
    • The kind of ‘community work’ referred to by Bloxham and colleagues (2015) would be an opportunity to reach shared understandings of the roles of students and markers with respect to criteria and feedback, which would in turn help to build confidence in the assessment process.

     

    Bloxham, S., den-Outer, B., Hudson, J., Price, M., 2015. Let’s stop the pretence of consistent marking: exploring the multiple limitations of assessment criteria. Assessment & Evaluation in Higher Education 1–16. doi:10.1080/02602938.2015.1024607

     

    8th Jisc Learning Analytics Network

    By Stephen Rowett, on 7 November 2016

    The Open University was the venue for the 8th Jisc Learning Analytics Network. I’d not been there before, and it was slightly eerie to see a clearly recognisable university campus without the exciting, if slightly claustrophobic, atmosphere that thousands of students provide. I won’t report on everything, but will give some highlights most relevant to me. There’s more from Niall Sclater on the Jisc Learning Analytics blog.

    The day kicked off with Paul Bailey and Michael Webb giving an update on Jisc’s progress. Referring back to their earlier aims they commented that things were going pretty much to plan, but the term ‘learner monitoring’ has thankfully been discarded. Their early work on legal and ethical issues set the tone carefully and has been a solid base.

    Perhaps more clearly than I’ve seen before, Jisc have set their goal as nothing less than sector transformation. By collecting and analysing data across the sector they believe they can gain insights that no one institution could alone. Jisc will provide the central infrastructure, including a powerful learning records warehouse, along with some standardised data transformation tools, to provide basic predictive and alerts functionality. They will also manage a procurement framework for institutions who want more sophistication.

    The learning records warehouse is a biggie here – currently with 12 institutions on board and around 200 million lines of activity. Both Moodle and Blackboard have plug-ins to feed live data in, and there is code for manipulating historic data into the right format.
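    Activity data of this kind is typically expressed as xAPI-style statements: JSON records built around an actor, a verb and an object. The sketch below shows the general shape of such a statement; the names, email address and URIs are illustrative assumptions, not the exact profile used by Jisc's warehouse or the Moodle/Blackboard plug-ins.

```python
import json

# A minimal xAPI-style activity statement: "a learner viewed a Moodle resource".
# All identifiers below (learner name, mailbox, verb and activity URIs) are
# illustrative placeholders, not Jisc's actual recipe.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",
        "mbox": "mailto:learner@example.ac.uk",
    },
    "verb": {
        "id": "http://id.tincanapi.com/verb/viewed",
        "display": {"en-GB": "viewed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://moodle.example.ac.uk/mod/resource/view.php?id=123",
        "definition": {"name": {"en-GB": "Week 1 lecture notes"}},
    },
    "timestamp": "2016-11-07T10:15:00Z",
}

# Serialise for sending to a learning record store.
payload = json.dumps(statement, indent=2)
print(payload)
```

    Hundreds of millions of such records, pooled across institutions, are what make sector-wide analysis possible.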

    Paul and Michael launched a new on-boarding guide for institutions at https://analytics.jiscinvolve.org/wp/on-boarding – a 20-step checklist for getting ready for learning analytics. Step 1 is pretty easy though, so anyone can get started!

    Bart Rienties from the Open University showed again how important learning analytics is to them and how powerfully they can use it. Mapping all of the activities students undertake into seven different categories (assimilative, finding and handling information, communication, productive, experiential, interactive/adaptive, assessment) gives dashboards allowing course designers to visualise their courses. Add in opportunities for workshops and discussion and you have a great way of encouraging thinking about course design.
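    The raw numbers behind such a dashboard are simply planned study hours aggregated per category. A minimal sketch, using a hypothetical module plan (the activities and hours are invented for illustration, and only some of the seven categories appear):

```python
from collections import defaultdict

# Hypothetical module plan: each learning activity tagged with one of the
# seven activity categories and its planned study hours.
activities = [
    ("Read chapter 1", "assimilative", 3.0),
    ("Literature search", "finding and handling information", 2.0),
    ("Forum discussion", "communication", 1.5),
    ("Write summary", "productive", 2.5),
    ("Quiz", "assessment", 1.0),
    ("Watch lecture video", "assimilative", 2.0),
]

# Aggregate planned hours per category - the data a workload dashboard
# would visualise for course designers.
hours = defaultdict(float)
for _name, category, planned_hours in activities:
    hours[category] += planned_hours

# Print categories from most to least time-consuming.
for category, total in sorted(hours.items(), key=lambda kv: -kv[1]):
    print(f"{category}: {total:.1f} h")
```

    A real pipeline would read the activity mapping from the module design data rather than a hard-coded list, but the aggregation step is the same.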

    Interestingly, Bart reported that there was no correlation between retention and satisfaction: happy students fail and unhappy students pass, and vice versa. This raises the question – do we design courses for maximum retention or for maximum satisfaction, given that we can’t have both?

    Andrew Cormack, Chief Regulatory Advisor at Jisc, gave an update on legal horizons. The new General Data Protection Regulation (GDPR) is already on the statute books and comes into force on 25 May 2018. For a complex issue, his presentation was wonderfully straightforward. I shall try to explain more, but you can read Andrew’s own version at http://www.learning-analytics.info/journals/index.php/JLA/article/view/4554 [I am not a lawyer, so please do your own due diligence].

    Much of the change in this new legislation involves the role of consent, which is downplayed somewhat in favour of accountability. The logic runs as follows:

    • We have a VLE that collects lots of data for its primary purpose – providing staff and students with teaching and learning activities.
    • We have a secondary purpose for this data, namely improving our educational design and helping and supporting learners, and we make this explicit upfront. We might also state things that we won’t do, such as selling the data to third parties.
    • We must balance any legitimate interest we have in using the data collected against any risks that the data subject might face. Note that this risk does not need to be zero for us to go ahead.
    • Andrew distinguished between improvements (general and impersonal changes, e.g. the way a course is designed or when we schedule classes) and interventions (which go to an individual student to suggest a change in behaviour). The latter need informed consent; the former can be based on legitimate interest. He also suggested that consent is better asked for later in the day, when you know the precise purpose of the consent.
    • So, for example, in a learning analytics project we might only obtain consent at the first point where we intervene with a given student. This might be an email inviting them to discuss their progress with the institution, with the act of doing so giving consent at the same time.

    You can follow Andrew as @Janet_LegReg if you want to keep up with the latest info.

    Thanks to Jisc for another really good event, and apologies to those I haven’t written about – there was a lot to take in!