Digital Education team blog

Ideas and reflections from UCL's Digital Education team

Archive for the 'Case Studies' Category

ABC has reached 21

By Natasa Perovic, on 24 March 2016

(For latest news about ABC LD, visit ABC LD blog)

Digital Education has now run 21 of our popular rapid learning design workshops. ABC uses an effective and engaging paper card-based method in a 90-minute hands-on workshop. It is based on research from JISC and the UCL IoE and, over the last year, has helped 70 module and course teams design and sequence engaging learning activities. It has proved particularly useful for new programmes or those changing to an online or more blended format.

To find out whether ABC is for you, watch this short video, captured at one of our workshops earlier this year.

Participants’ feedback remains encouragingly positive:

“I thought the ABC session was really helpful.  I had been a little unsure ahead of the session what it would achieve – but I genuinely got a lot from it.  Going back to the basics of methods etc really helped focus on the structure and balance of the module.  I thought the output was very useful.”

“Thank you for convening the ABC workshop today, I found it thought provoking and it challenged the way we think about our teaching. It is too easy to stick to what we have done previously and I found today gave me different ways to think about how to evaluate our current teaching and to bring in different approaches. It will definitely improve my thinking and I will continue with the approach to incorporate some of the ideas into the modules.”

“Thank you for the workshop today- it was an eye opener. I found it really useful to think about categorising how the learning objectives will be delivered and assessed, and examining the variety of ways that these can be achieved. It made me think more deeply about what skills the students can develop by making them responsible for their learning journey and not simply the content that needs to be delivered to them. We will let you know how it goes!”

“It was great and many initiatives have emerged from it.”

ABC workshop group work

For questions and workshops, contact Clive and Nataša.

For more information see:

ABC Curriculum Design 2015 Summary
https://blogs.ucl.ac.uk/digital-education/2015/12/02/abc-curriculum-design-2015-summary/

ABC workshop resources and participants’ feedback https://blogs.ucl.ac.uk/digital-education/2015/09/30/9169/

ABC beginnings https://blogs.ucl.ac.uk/digital-education/2015/04/09/abc-arena-blended-connected-curriculum-design/


ABC News:

We are currently developing an online toolkit to support the workshop, have been working closely with CALT to embed the Connected Curriculum in designs, and are developing collaborative projects with the University of Glasgow, Aarhus University (Denmark), the University of Leiden (Netherlands) and Universidad Adolfo Ibáñez (Chile) to look at the learning impact of this method. Our colleagues in Chile are even translating the workshop into Spanish.

ABC has also featured on the UCL Teaching and Learning portal as a case study: Designing programmes and modules with ABC curriculum design http://www.ucl.ac.uk/teaching-learning/case-studies-news/e-learning/designing-abc-curriculum-design

UCL lecturers on video

By Clive Young, on 10 September 2015

Once confined to a few teaching enthusiasts and specific disciplines, over the last decade video, audio and interactive media have become an increasingly mainstream part of UCL’s academic repertoire.

Media has definitely become part of many of our students’ study processes.

Students consistently report that video content assists their learning, either as a revision tool or as a new way of engaging with material. Student demand, for example, has largely driven the growth of lecture capture. More broadly, the success of the Khan Academy, of video-based MOOCs and, especially at UCL, of Lynda.com has helped digital video become recognised as a means to support high-quality academic learning. Key to this is integration with Moodle, enabling any media to be enhanced by other online resources and support.

Media itself has become easier and cheaper to produce, edit, store and deliver, enabling both our academics and students to become producers, with ‘media literacy’ increasingly identified as a valuable education and research asset.

Tony Slade and Clive Young from the ISD Learning, Teaching & Media Services team have been working on a project this year to develop a UCL Educational Media service. The research project investigates how and why lecturers use video and what their future video requirements are for successful student teaching. Interviews have been compiled with staff project examples to form case studies. An education producer, Mike Howarth, was commissioned to produce the content for the research project.

The team has found widespread use of media changing the way we design programmes. Media seems to act as a catalyst, enabling new blends of virtual learning and conventional delivery to create rich media and face-to-face learning experiences. ‘Flipping’ is also increasingly considered at UCL as a way to maximise the educational opportunity of face-to-face learning.

For examples of these ideas, follow the links below to six short video case studies on UCL’s T&L Portal.

As a bonus, if you are asking yourself “Can using free online video tutorials through lynda.com enhance my teaching?”, try this additional case study.

Students’ intellectual property, open nitty gritty

By Mira Vogel, on 19 May 2015

Brass tacks, by MicroAssist on Flickr

What happened when staff on one module encouraged students to openly license the online products of their assessed group work?

Object Lessons is a module on the Bachelor of Arts and Sciences at UCL. In keeping with its object-based nature and emphasis on inquiry and collaboration, part of the assessment is a group research project to produce a media-rich online exhibition. Because the exhibitions are lovely and shine a light on multimodal assessment, the teaching team are frequently approached by colleagues across UCL with requests to view them. In considering how to get students’ permission for this, Leonie Hannan (now at QUB), Helen Chatterjee and I quickly realised a few things. One, highlighted by an exchange with UCL’s Copyright specialist Chris Holland, was that the nature of the permission was hard to define and therefore hard to get consent for, so we needed to shift the emphasis away from staff and the nuances of their possible use scenarios, and onto the status of the work itself. Another was that, since the work was the product of a group and could not be decomposed into individual contributions without breaking the whole, consent would need to be unanimous. Then there was the question of administrative overhead related to obtaining consent and actually implementing what students had consented to – potentially quite onerous. And finally, the matter presented us with some opportunities we shouldn’t miss, namely to model taking intellectual property seriously and to engage students in key questions about contemporary practices.

We came up with four alternative ways for students to license their work ranging incrementally from open to private. We called these:

1. Open;
2. Publish;
3. Show;
4. Private.

You can read definitions of each alternative in the document ‘Your groupwork project – requesting consent for future use scenarios’ which we produced to introduce them to students. As part of their work students were required to discuss these, reach a unanimous consensus on one, and implement it by publishing (or selectively, or not at all) the exhibition and providing an intellectual property notice on its front page. That way staff would not have to collect consent forms nor gate-keep access.

Before we released it to students I circulated the guidance to two Jiscmail discussion groups (Open Educational Resources and Association for Learning Technology) and worked in some of their suggestions. A requirement that students include a statement within the work itself reduces the administrative overhead and, we hoped, would be more future-proof than staff collecting, checking off and filing paper records. While making it clear that students would not be at any disadvantage if they chose not to open their work, we also took a clear position in favour of Creative Commons licensing – the most open of our alternatives – since, as well as flexibility and convenience, it would potentially lend the work more discoverability and exposure.

What did the students choose? In the first iteration, out of ten groups:

  • Five opted for Open. Between them they used three different varieties of Creative Commons licence, and one submitted their work to Jorum;
  • Two opted for Publish;
  • None opted for Show;
  • Three opted for Private (including one which didn’t make a statement; since the group kept the work hidden this defaults to Private).

We haven’t yet approached the students to ask about their decision-making processes, but from informal conversations and reading some of the intellectual property statements we know that there are different reasons why half the students decided not to make their work open. One was the presence of elements which were not themselves open, and therefore could not be opened in turn. From evaluations of a number of other modules, we know that the students were not generally all that enthusiastic about the platform they were asked to use for their exhibition (Mahara, which is serviceable but vanishingly rare outside educational settings). This may have contributed to another factor, which was that not all group members felt the work reflected well on them individually.

Then there’s the matter of deciding to revoke consent, which is something individual students can do at any time. In the context of group work we decided that what this would mean is that if any group member decides at a later date that they want to reduce openness, then this effectively overrides other group members’ preferences. That doesn’t work in reverse though – a student can’t increase openness without the consent of all other group members. So here we are privileging individuals who want to close work, although we do encourage them to consider instead simply ending their association with it. We have yet to find out how this state of affairs works out, and it may take quite a while to find out. But so far it seems stable and viable.
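
To make that rule concrete, here is a minimal sketch of the ‘most restrictive preference wins’ logic described above – purely illustrative, with invented names; nothing like this actually runs on the module:

```python
# Illustrative sketch only - not a system used on the module.
# The group's effective licence is the most restrictive current preference,
# so an individual can reduce openness later but cannot increase it alone.
OPENNESS = ["Private", "Show", "Publish", "Open"]  # least open to most open

def effective_licence(preferences):
    """Return the most restrictive licence among group members' preferences."""
    return min(preferences, key=OPENNESS.index)

print(effective_licence(["Open", "Open", "Publish"]))   # -> 'Publish'
print(effective_licence(["Open", "Private", "Open"]))   # -> 'Private'
```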

We would be very interested in your views, suggestions and any experiences you have had with this kind of thing – please do comment below.

Particular thanks to Pat Lockley and Javiera Atenas for their input.

Image source: MicroAssist, 2012. Brass tacks. Work found at https://www.flickr.com/photos/microassist/7136725313/. Licensed as CC BY-SA.

A good peer review experience with Moodle Workshop

By Mira Vogel, on 18 March 2015

Update Dec 2015: there are now three posts on our refinements to this peer feedback activity: one, two, and three.

Readers have been begging for news of how it went with the Moodle Workshop activity from this post.

Workshop is an activity in Moodle which allows staff to set up a peer assessment or (in our case) peer review. Workshop collects student work, automatically allocates reviewers, allows the review to be scaffolded with questions, imposes deadlines on the submission and assessment phase, provides a dashboard so staff can follow progress, and allows staff to assess the reviews/assessments as well as the submissions.
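
To give a flavour of the allocation step, the sketch below shows the kind of round-robin peer allocation Workshop automates. It is an illustration only, not Moodle’s actual algorithm, and the function and parameter names are invented:

```python
import random

def allocate_reviewers(students, reviews_each=2, seed=None):
    """Give each student `reviews_each` peers to review, never themselves.

    Illustration only - not Moodle Workshop's own scheduler.
    Assumes reviews_each is smaller than the number of students.
    """
    order = students[:]
    random.Random(seed).shuffle(order)
    n = len(order)
    allocation = {s: [] for s in order}  # reviewer -> peers whose work they review
    for i, reviewer in enumerate(order):
        for offset in range(1, reviews_each + 1):
            # Walking around the shuffled circle guarantees no self-review and
            # gives every submission exactly `reviews_each` reviewers.
            allocation[reviewer].append(order[(i + offset) % n])
    return allocation

print(allocate_reviewers(["Ana", "Ben", "Caz", "Dev"], reviews_each=2, seed=42))
```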

However, except for some intrepid pioneers, it is almost never seen in the wild.

The reason for that is partly to do with the daunting number and nature of the settings – there are several pitfalls to avoid which aren’t obvious on first pass – but also the fact that, because it is a process, you can’t easily see a demo, and running a test instance is pretty time-consuming. If people try once and it doesn’t work well, they rarely try again.

Well look no further – CALT and ELE have it working well now and can support you with your own peer review.

What happened?

Students on the UCL Arena Teaching Associate Programme reviewed each other’s case studies. Twenty-two then completed a short evaluation questionnaire in which they rated their experience of giving and receiving feedback on a five-point scale and commented on their responses. The students were from two groups with different tutors running the peer review activity. A third group leader chose to run the peer review on Moodle Forum, since it would allow students to easily see each other’s case studies and feedback.

The students reported that giving feedback went well (21 respondents):

Pie chart: satisfaction with giving feedback (reviewing others’ work)

This indicates that the measures we took – see previous post – to address disorientation and participation were successful. In particular we were better aware of where the description, instructions for submission, instructions for assessment, and concluding comments would display, and put the relevant information into each.

Receiving feedback also went well (22 respondents) though with a slightly bigger spread in both directions:

Pie chart: satisfaction with receiving reviews


Students appreciated:

  • Feedback on their work.
  • Insights about their own work from considering others’ work.
  • Being able to edit their submission in advance of the deadline.
  • The improved instructions letting them know what to do, when and where.

Staff appreciated:

This hasn’t been formally evaluated, but from informal conversations I know that the two group leaders appreciate Moodle taking on the grunt work of allocation. However, this depends on setting a hard deadline with no late submissions (otherwise staff have to keep checking for late submissions and allocating those manually) and one of the leaders was less comfortable with this than the other. Neither found it too onerous to write diary notes to send reminders and alerts to students to move the activity along – in any case this manual messaging will hopefully become unnecessary with the arrival of Moodle Events in the coming upgrade.

For next time:

  • Improve signposting from the Moodle course area front page, and maybe the title of the Workshop itself, so students know what to do and when.
  • Instructions: let students know how many reviews they are expected to do; let them know if they should expect variety in how the submissions display – in our case some were attachments while others were typed directly into Moodle (we may want to set attachments to zero); include word count guidance in the instructions for submission and assessment.
  • Consider including an example case study & review for reference (Workshop allows this).
  • Address the issue that, due to some non-participation during the Assessment phase, some students gave more feedback than they received.
  • We originally had a single comments field but will now structure the peer review with some questions aligned to the relevant parts of the criteria.
  • Decide about anonymity – should both submissions and reviews be anonymous, or one or the other, or neither? These can be configured via the Workshop’s Permissions. Let students know who can see what.
  • Also to consider: we could change Permissions after it’s complete (or even while it’s running) to allow students to access the dashboard and see all the case studies and all the feedback.

Have you had a good experience with Moodle Workshop? What made it work for you?

Meet Jess, Jack, Stuart & Heather – realistic voices for free* download

By Jessica Gramp, on 3 March 2015

I have recently started listening to my books and papers, rather than reading them. This frees me up to do other things while I listen, such as cook, take a bath or do some tidying up. It also gives my eyes a much-needed break from staring at a computer screen or paper.

As part of an online e-learning course I am helping to develop, I am using the TechDis Jess voice to provide audio files of the commentary, as an alternative to reading. I have had to tweak some of the text – for example, UCL needs to be written with spaces between each letter in order for Jess to pronounce each letter individually, and I needed to add a hyphen to CMALT (C-MALT) for it to be pronounced correctly. But for the most part I can leave the text much as it is typed. I then run it through a free, open-source program called Balabolka to produce an audio file that participants on the course can download and listen to.
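
For anyone wanting to automate those tweaks, here is a minimal sketch of the pre-processing step described above. The function name and acronym list are invented for illustration – Balabolka itself simply reads whatever text you give it:

```python
import re

# Spell out acronyms so a text-to-speech voice reads them letter by letter.
# The mapping below mirrors the manual tweaks described in the post.
ACRONYM_FIXES = {
    "UCL": "U C L",      # spaces make the voice pronounce each letter
    "CMALT": "C-MALT",   # a hyphen fixes the pronunciation
}

def prepare_for_tts(text: str) -> str:
    for acronym, spoken_form in ACRONYM_FIXES.items():
        # \b stops us rewriting words that merely contain the acronym
        text = re.sub(rf"\b{acronym}\b", spoken_form, text)
    return text

print(prepare_for_tts("Welcome to the UCL CMALT support course."))
# -> "Welcome to the U C L C-MALT support course."
```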

TechDis Jess and other UK voices (including Scottish and Welsh options) are available from www.heacademy.ac.uk/jisc-techdis-voices.

Balabolka is available from: www.cross-plus-a.com/balabolka.htm.
Listen to a sample:

Listen on SoundCloud…

*Staff and learners studying at England’s HE and FE institutions can download the voices free of charge, and those at Scottish and Welsh institutions can download local voices.

HEA Senior Fellowship Case Study Series: 4 – Researching learner interaction and engagement with in-class response systems

By Matt Jenner, on 15 August 2014

As a four-part series, I am openly publishing my case studies previously submitted for my Senior Fellowship of the Higher Education Academy. I submitted my application in February 2014. If you’re interested in this professional recognition programme, please visit their webpages and look through the Professional Standards Framework (PSF). UCL runs an institutional model for fellowships called ARENA; your institution may run one too – speak to people!

Case Study 4 – Researching learner interaction and engagement with in-class response systems

In 2012 I conducted research, in parallel with my job at UCL, focusing on increasing student interaction and staff engagement with an in-class question and response system colloquially known as ‘clickers’. Evidence suggests clickers provide interaction opportunities to stimulate and engage learners[1] and have a benign or positive effect on student performance[2]. Clickers are popular across many disciplines, in particular the physical sciences, but interest is particularly low in the medical sciences.

I wanted to directly address this shortcoming, so I enlisted two academics in the UCL Medical School. I assimilated the current method of teaching and the materials used (K1). From here we adapted a learning activity to align with the new tool being applied (A1). I underpinned the use of the technology with existing literature and the evidence for realigning the ‘sage on the stage’ to the ‘guide on the side’[3] (K2), which evidence suggests is an effective method for learning and teaching (K3, V3). I provided pre-lecture technical support to reduce technical barriers and was on hand in the lecture to support as and when needed (A2). Questions were designed into the lectures and the clickers provided immediate feedback (A3). Staff reacted to clicker data with an approach called ‘contingent teaching’[4], dynamically responding to the answers and feedback provided (A3).
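
As a rough illustration of the contingent-teaching idea – the lecturer decides what to do next based on the live response data – the toy sketch below uses invented thresholds; in the lectures these decisions were made by the lecturers themselves, not by code:

```python
# Toy sketch of a contingent-teaching decision rule (invented thresholds).
def next_step(correct_responses, total_responses, threshold=0.7):
    share = correct_responses / total_responses
    if share >= threshold:
        return "move on to the next topic"
    if share >= threshold / 2:
        return "ask students to discuss in pairs, then re-poll"
    return "re-teach the concept before re-polling"

print(next_step(18, 30))  # -> 'ask students to discuss in pairs, then re-poll'
```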

I designed evaluation questions for each lecture based on Bloom’s Taxonomy[5] for learner-based evaluation of the teaching approach and learning outcomes (A4). Questions were derived by categorising Bloom into three sub-categories: remember or understand; apply or analyse the topic; and evaluate or create new knowledge (K5). When questioned, 74% of students agreed or strongly agreed that the clickers and the related teaching approach encouraged interaction and helped to achieve metacognitive learning (K5). I integrated these data with post-lecture interviews with the lecturers. Using this analysis, we designed next steps for future use and identified gaps and areas for improvement (A5).

I conducted evidence-based research and followed best practice around clickers to ensure inclusion was academically merited (V3). Measuring (and increasing) engagement within the traditional lecture was aiming to promote participation for learners (V2). It was understood that clickers do not directly enhance learning but can lead to higher-order learning. I used my understanding of the wider field of evidence to define their most appropriate use within the lectures (V1, V3).

By implementing a technology which was new to staff and guiding them with appropriate techniques known to increase interaction and engagement, I provided an evidence-informed approach which could be used to transform didactic content delivery into something more engaging. My research adds to a disproportionately small body of knowledge about clickers in medical education, and the study overall was positive. The staff involved still use the clickers, and the impact I measured, plus the evidence collected, can be further used to promote clickers within UCL, the Medical School and beyond. It earned me a Distinction in my MSc Learning Technologies and furthered my ambition to make a lasting, positive difference to higher education.

(493 words)

HEA Professional Standards Framework links referenced in this case study:

Areas of Activity

  • A1 Design and plan learning activities and/or programmes of study
  • A2 Teach and/or support learning
  • A3 Assess and give feedback to learners
  • A4 Develop effective learning environments and approaches to student support and guidance
  • A5 Engage in continuing professional development in subjects/disciplines and their pedagogy, incorporating research, scholarship and the evaluation of professional practices

Core Knowledge

  • K1 The subject material
  • K2 Appropriate methods for teaching, learning and assessing in the subject area and at the level of the academic programme
  • K3 How students learn, both generally and within their subject/disciplinary area(s)
  • K5 Methods for evaluating the effectiveness of teaching

Professional Values

  • V1 Respect individual learners and diverse learning communities
  • V2 Promote participation in higher education and equality of opportunity for learners
  • V3 Use evidence-informed approaches and the outcomes from research, scholarship and continuing professional development


[1] Bligh, D. A. (2000). What’s the Use of Lectures? London/San Francisco: Jossey-Bass.

[2] http://w.lifescied.org/content/6/1/9.short

[3] King, A. (1993). From Sage on the Stage to Guide on the Side. College Teaching, Vol. 41, No. 1, pp. 30–35. Taylor & Francis Ltd.

[4] Beatty, I. D., Gerace, W. J., Leonard, W. J. and Dufresne, R. J. (2006). Designing effective questions for classroom response teaching. American Journal of Physics, Vol. 74, pp. 31–39.

[5] Bloom, B. S. (1956). Taxonomy of Educational Objectives, Handbook I: The Cognitive Domain. New York: David McKay Co Inc.