
Digital Education team blog


Ideas and reflections from UCL's Digital Education team


HEA Senior Fellowship Case Study Series: 4 – Researching learner interaction and engagement with in-class response systems

By Matt Jenner, on 15 August 2014

In this four-part series I am openly publishing the case studies I previously submitted for my Senior Fellowship of the Higher Education Academy. I submitted my application in February 2014. If you’re interested in this professional recognition programme, please visit the HEA webpages and look through the Professional Standards Framework (PSF). UCL runs an institutional model for fellowships called ARENA; your institution may run one too – speak to people!

Case Study 4 – Researching learner interaction and engagement with in-class response systems

In 2012 I conducted research, in parallel with my job at UCL, focusing on increasing student interaction with, and staff engagement in, an in-class question and response system colloquially known as ‘clickers’. Evidence suggests clickers provide interaction opportunities that stimulate and engage learners[1] and have a benign or positive effect on student performance[2]. Clickers are popular across many disciplines, in particular the physical sciences, but interest is notably low in the medical sciences.

I wanted to address this shortcoming directly, so I enlisted two academics in the UCL Medical School. I reviewed the current method of teaching and the materials used (K1). From there we adapted a learning activity to align with the new tool being applied (A1). I underpinned the use of the technology with existing literature and the evidence for realigning the ‘sage on the stage’ into the ‘guide on the side’[3] (K2), which evidence suggests is an effective method for learning and teaching (K3, V3). I provided pre-lecture technical support to reduce technical barriers and was on hand in the lecture to support as and when needed (A2). Questions were designed into the lectures and the clickers provided immediate feedback (A3). Staff reacted to clicker data with an approach called ‘contingent teaching’[4], responding dynamically to the answers and feedback provided (A3).

I designed evaluation questions for each lecture, based on Bloom’s Taxonomy[5], for learner-based evaluation of the teaching approach and learning outcomes (A4). Questions were derived by categorising Bloom into three sub-categories: remember or understand; apply or analyse the topic; and evaluate or create new knowledge (K5). When questioned, 74% of students agreed or strongly agreed that the clickers and the related teaching approach encouraged interaction and helped to achieve metacognitive learning (K5). I integrated these data with post-lecture interviews with the lecturers. Using this analysis, we designed next steps for future use and identified gaps and areas for improvement (A5).

I conducted evidence-based research and followed best practice around clickers to ensure their inclusion was academically merited (V3). Measuring (and increasing) engagement within the traditional lecture aimed to promote participation for learners (V2). It was understood that clickers do not directly enhance learning but can lead to higher-order learning. I used my understanding of the wider field of evidence to define their most appropriate use within the lectures (V1, V3).

By implementing a technology which was new to staff and guiding them with appropriate techniques known to increase interaction and engagement, I provided an evidence-informed approach which could be used to transform didactic content delivery into something more engaging. My research adds to a disproportionately small body of knowledge about clickers in medical education, and the study overall was positive. The staff involved still use the clickers, and the impact I measured, together with the evidence collected, can be used to promote clickers within UCL, the Medical School and beyond. It earned me a Distinction in my MSc Learning Technologies and furthered my ambition to make a lasting, positive difference to higher education.

(493 words)

HEA Professional Standards Framework links referenced in this case study:

Areas of Activity

  • A1 Design and plan learning activities and/or programmes of study
  • A2 Teach and/or support learning
  • A3 Assess and give feedback to learners
  • A4 Develop effective learning environments and approaches to student support and guidance
  • A5 Engage in continuing professional development in subjects/disciplines and their pedagogy, incorporating research, scholarship and the evaluation of professional practices

Core Knowledge

  • K1 The subject material
  • K2 Appropriate methods for teaching, learning and assessing in the subject area and at the level of the academic programme
  • K3 How students learn, both generally and within their subject/disciplinary area(s)
  • K5 Methods for evaluating the effectiveness of teaching

Professional Values

  • V1 Respect individual learners and diverse learning communities
  • V2 Promote participation in higher education and equality of opportunity for learners
  • V3 Use evidence-informed approaches and the outcomes from research, scholarship and continuing professional development


[1] Bligh, D. A. (2000). What’s the Use of Lectures? London/San Francisco: Jossey-Bass.

[2] http://w.lifescied.org/content/6/1/9.short

[3] King, A. (1993). From Sage on the Stage to Guide on the Side. College Teaching, Vol. 41, No. 1, pp. 30–35. Taylor & Francis Ltd.

[4] Beatty, I. D., Gerace, W. J., Leonard, W. J. and Dufresne, R. J. (2006). Designing effective questions for classroom response system teaching. American Journal of Physics, Vol. 74, pp. 31–39.

[5] Bloom, B. S. (1956). Taxonomy of Educational Objectives, Handbook I: The Cognitive Domain. New York: David McKay Co Inc.

Interactive lectures in Management Science & Innovation: A pilot evaluation study of LectureTools

By Vicki Dale, on 12 August 2013

Jane Britton and Matt Whyndham recently piloted LectureTools with a small group of 17 students in a short course in project management. LectureTools is a cloud-based electronic voting system which students and their teachers can access via their laptops or mobile devices. The system works by importing an existing PowerPoint presentation and then adding interactivity to it, through varied question formats. LectureTools allows students to take notes on their devices alongside each slide; they can also flag when they are confused about a particular slide, or submit questions, which are displayed on the tutor ‘dashboard’ (see the screenshots below).

 

LectureTools presenter screenshot

LectureTools presenter interface  (the ‘dashboard’), showing an activity slide at the top, student responses and comprehension on the right, and a panel displaying student questions on the middle left. A preview of the adjacent slides is shown at the bottom of the screen.

 

LectureTools student screenshot

LectureTools student interface, showing the PowerPoint slides on the left with the interactive options above and the note-taking area on the right.

 

As E-Learning Evaluation Specialist within ELE, I carried out an evaluation, gathering data from a range of sources. These included an observation of one of Jane’s interactive lectures and a student questionnaire followed by a focus group discussion with a sample of participants. Both educators were also interviewed at the end of the course. Students rated the system positively overall for stimulating their engagement in the subject, allowing them to measure their understanding, fostering discussion in the classroom and facilitating easy note-taking. In addition, they perceived that it helped them prepare for their forthcoming examination. Student comments included:

“I liked the LectureTools a lot.  I’m really impressed by it. It’s so easy to use and so helpful and most of us nowadays work on computers anyway during the lecture so it just makes it easier not to write everything in Word, copy the slides, we have everything on one screen.”

“We haven’t really asked a question to a lecturer but I think that’s great, that you can write a question and then the lecturer looks there and then they can answer it.”

 

Both Jane and Matt felt it was helpful to know what students were thinking and to be able to provide timely feedback, although having a class of students all staring at their laptops at various points was initially disconcerting:

“I think I notice that you get a lot of heads down working all of a sudden and it looks very disconcerting at first … you need to just be aware that they are working and they’re thinking about your stuff but they’re not looking at your face.”

 

One potential issue that came out of the observation and the survey of students was the opportunity for distraction; this generally happened when students had typed in their responses to open questions and were waiting for other students to ‘catch up’:

“I do think that the multiple choice questions, or putting the order questions, those are very good ones because all of us answered relatively quickly … so we had no time for distractions but the written ones … when you don’t have anything to do you start to do other things.”

 

Learning activities need to be carefully structured in order to give students enough time and opportunities to think about their topic, but not so much that they use the laptop to access resources unrelated to their studies. For this reason, the students, as well as Jane and Matt, considered that closed questions such as multiple choice questions might be better than open questions for large lectures.

A working paper of this study will shortly be uploaded to UCL Discovery.

E-Learning Environments is working with other staff in various departments around the university to explore the potential of LectureTools to facilitate interactive lectures. If you would like more information, or would like to pilot LectureTools or a comparable electronic voting system, please contact me or Janina Dewitz, our Innovations Officer.

A new perspective on electronic voting

By Steve Rowett, on 12 June 2012

As part of UCL’s involvement in the Cheltenham Science Festival, someone from our team goes down to Cheltenham to support the use of electronic voting in some of the events there. My colleague Matt has already blogged about the kinds of things we do.

This year, one of the groups we are supporting is the Festival of the Spoken Nerd – Helen, Matt and Steve – who are doing a show in Cheltenham on Thursday 14 June.

Last night they did a try-out of some of their material in the upstairs room at the Green Man in London. They certainly had a packed house; although the venue seats only about 30 at a squeeze, it is very cosy and the audience really get to interact with the performers.

Part of the performance was a remake of Bruce Forsyth’s Generation Game, with teams loosely led by each of the performers and a series of science and maths questions to answer. The voting handsets let each member of the audience have their say, with a number of trick questions to add to the fun.

So far, so fairly ordinary in the world of voting. But what made it more interesting was another part of the performance where teams used their mobiles to play Pong against each other, using crowd-sourcing to aggregate the individual commands to move the bat up or down.
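The aggregation step described above is simple to sketch. Here is a minimal, hypothetical illustration of crowd-sourced control – the function name and scaling rule are my own assumptions, not taken from any real voting-system software – where each audience member submits ‘up’ or ‘down’ and the bat moves by the net agreement of the crowd:

```python
from collections import Counter

def aggregate_moves(votes, step=1):
    """Turn a batch of 'up'/'down' votes into one bat movement.

    Positive result moves the bat up, negative moves it down.
    A split crowd produces a small movement; unanimity moves a full step.
    """
    if not votes:
        return 0
    counts = Counter(votes)
    net = counts.get("up", 0) - counts.get("down", 0)
    # Scale by crowd size, so the result is the *average* command.
    return step * net / len(votes)

# Three 'up' votes against one 'down' nudges the bat upwards:
print(aggregate_moves(["up", "up", "up", "down"]))  # 0.5
```

Run once per polling interval, this turns hundreds of individual button presses into a single smooth control signal.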

We started talking about how the voting handsets might be used within this. TurningPoint do provide an SDK, and with this it should be possible to use the handsets as controllers for pretty much any application. It turns out that a colleague, Daniel Richardson, has already done this, using voting handsets to control a crowd tightrope walking game.

So, what else could we do with the handsets? Well, lots. For Economics, how about a simulation where different teams play the Treasury, the Bank of England, the banks and so on in a simulation of the economy? Controlling machinery in Engineering. Determining the functioning of the human body in Medicine.

We could take it further. Rather than having each handset be an equal partner in the crowd-sourcing effort, we could plant catalysts or decoys to simulate real-world psychology or group behaviour, or disease or system failure.
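To make the ‘planted handset’ idea concrete, here is a small hypothetical sketch (the function and its name are illustrative assumptions, not part of any handset software): a handful of decoy handsets ignore the question and always push one answer, so you can see how a planted minority skews the crowd’s result.

```python
from collections import Counter

def tally_with_plants(genuine_votes, n_plants, plant_answer):
    """Tally genuine votes plus n_plants decoys that all vote plant_answer."""
    tally = Counter(genuine_votes)
    tally[plant_answer] += n_plants
    return dict(tally)

# Two planted 'B' handsets flip a 2-1 majority for 'A' into a win for 'B':
print(tally_with_plants(["A", "B", "A"], 2, "B"))  # {'A': 2, 'B': 3}
```

Varying the number of plants between rounds would let you demonstrate, live, how few catalysts it takes to tip apparent group consensus.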

There is lots of potential here to go beyond simple multiple choice questions and involve the audience in dynamic live simulations, games and experiments. I’m quickly discovering that there’s much more to these simple handsets than I ever realised.

Provost announces this year’s teaching awards – using learning technology?

By Matt Jenner, on 8 February 2011

This week the Provost has requested nominations for this year’s teaching awards to ‘recognise those who make an outstanding contribution to teaching at UCL.’ This year there will be ten awards handed out and the winners receive some well-deserved respect for the hard work they put in. The deadline for nominations is 15th March.

But how does this relate to learning technology? Well, in addition to the innovative thinking, hard work, extra time, dedication and commitment to excellent teaching at UCL, many of the winners from past years have been leading the way by incorporating e-learning or learning technology into the heart of education. Past winners have made particularly good use of technologies such as Moodle or electronic voting handsets and embedded them right into the curriculum.

The Learning Technology Support Service would always welcome anyone from the UCL community to get in touch with us and see how we can work together to try new things, or perhaps even try old things which we know work well – but still might be new to you!

Links:

LTSS

Provost Teaching Awards

Moodle

Electronic Voting Handsets