E-Learning Environments team blog
ELE Group
We support staff and students using technology to enhance teaching & learning.

Here you'll find updates on developments at UCL, links and events, as well as case studies and personal experiences. Let us know if you have any ideas you want to share!


    Archive for the 'Electronic voting systems' Category

    HEA Senior Fellowship Case Study Series: 4 – Researching learner interaction and engagement with in-class response systems

    By Matt Jenner, on 15 August 2014

In this four-part series I am openly publishing the case studies I previously submitted for my Senior Fellowship of the Higher Education Academy. I submitted my application in February 2014. If you’re interested in this professional recognition programme, please visit the HEA’s webpages and look through the Professional Standards Framework (PSF). UCL runs an institutional model for fellowships called ARENA; your institution may run one too – speak to people!

    Case Study 4 – Researching learner interaction and engagement with in-class response systems

In 2012 I conducted research, in parallel with my job at UCL, focusing on increasing student interaction and staff engagement with an in-class question and response system colloquially known as ‘clickers’. Evidence suggests clickers provide interaction opportunities to stimulate and engage learners[1] and have a benign or positive effect on student performance[2]. Clickers are popular across many disciplines, in particular the physical sciences, but uptake in the medical sciences is notably low.

I wanted to address this shortcoming directly, so I enlisted two academics in the UCL Medical School. I reviewed the current method of teaching and the materials used (K1). From here we adapted a learning activity to align with the new tool being applied (A1). I underpinned the use of the technology with existing literature and the evidence for realigning the ‘sage on the stage’ into the ‘guide on the side’[3] (K2), an approach which evidence suggests is effective for learning and teaching (K3, V3). I provided pre-lecture technical support to reduce technical barriers and was on hand in the lecture to help as and when needed (A2). Questions were designed into the lectures, and the clickers provided immediate feedback (A3). Staff responded to the clicker data with an approach called ‘contingent teaching’[4], dynamically adjusting their teaching to the answers and feedback provided (A3).

I designed evaluation questions for each lecture, based on Bloom’s Taxonomy[5], for learner-based evaluation of the teaching approach and learning outcomes (A4). Questions were derived by categorising Bloom’s levels into three sub-categories: remember or understand the topic; apply or analyse it; and evaluate or create new knowledge (K5). When questioned, 74% of students agreed or strongly agreed that the clickers and the related teaching approach encouraged interaction and helped to achieve metacognitive learning (K5). I integrated these data with post-lecture interviews with the lecturers. Using this analysis, we designed next steps for future use and identified gaps and areas for improvement (A5).
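As a purely illustrative aside, the sketch below shows how Likert-style responses might be tallied per Bloom sub-category to produce figures like the 74% above. It is a minimal sketch in Python; the sub-category labels and responses are hypothetical, not the study's data.

from collections import Counter

# Each response pairs a Bloom sub-category with a Likert answer.
# These values are made up for illustration only.
responses = [
    ("remember/understand", "strongly agree"),
    ("remember/understand", "agree"),
    ("apply/analyse", "agree"),
    ("apply/analyse", "neutral"),
    ("evaluate/create", "strongly agree"),
    ("evaluate/create", "disagree"),
]

POSITIVE = {"agree", "strongly agree"}  # answers counted as agreement

# Tally answers per sub-category.
by_category = {}
for category, answer in responses:
    by_category.setdefault(category, Counter())[answer] += 1

# Report the proportion of positive answers per sub-category.
for category, counts in by_category.items():
    total = sum(counts.values())
    positive = sum(counts[a] for a in POSITIVE)
    print(f"{category}: {positive}/{total} ({positive / total:.0%}) agreed or strongly agreed")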

I conducted evidence-based research and followed best practice around clickers to ensure their inclusion was academically merited (V3). Measuring (and increasing) engagement within the traditional lecture aimed to promote participation for learners (V2). It was understood that clickers do not directly enhance learning but can lead to higher-order learning; I used my understanding of the wider field of evidence to define their most appropriate use within the lectures (V1, V3).

By implementing a technology which was new to staff, and guiding them with appropriate techniques known to increase interaction and engagement, I provided an evidence-informed approach which could be used to transform didactic content delivery into something more engaging. My research adds to a disproportionately small body of knowledge on clickers in medical education, and the study overall was positive. The staff involved still use the clickers, and the impact I measured, together with the evidence collected, can be used to promote clickers within UCL, the Medical School and beyond. It earned me a Distinction in my MSc Learning Technologies and furthered my ambition to make a lasting, positive difference to higher education.

    (493 words)

    HEA Professional Standards Framework links referenced in this case study:

    Areas of Activity

    • A1 Design and plan learning activities and/or programmes of study
    • A2 Teach and/or support learning
    • A3 Assess and give feedback to learners
    • A4 Develop effective learning environments and approaches to student support and guidance
    • A5 Engage in continuing professional development in subjects/disciplines and their pedagogy, incorporating research, scholarship and the evaluation of professional practices

    Core Knowledge

    • K1 The subject material
    • K2 Appropriate methods for teaching, learning and assessing in the subject area and at the level of the academic programme
    • K3 How students learn, both generally and within their subject/disciplinary area(s)
    • K5 Methods for evaluating the effectiveness of teaching

    Professional Values

    • V1 Respect individual learners and diverse learning communities
    • V2 Promote participation in higher education and equality of opportunity for learners
    • V3 Use evidence-informed approaches and the outcomes from research, scholarship and continuing professional development


[1] Bligh, D. A. (2000). What’s the Use of Lectures? San Francisco: Jossey-Bass.

[2] http://www.lifescied.org/content/6/1/9.short

[3] King, A. (1993). From Sage on the Stage to Guide on the Side. College Teaching, Vol. 41, No. 1, pp. 30–35. Taylor & Francis Ltd.

[4] Beatty, I. D., Gerace, W. J., Leonard, W. J. and Dufresne, R. J. (2006). Designing effective questions for classroom response system teaching. American Journal of Physics, Vol. 74, pp. 31–39.

[5] Bloom, B. S. (1956). Taxonomy of Educational Objectives, Handbook I: The Cognitive Domain. New York: David McKay Co Inc.

    Electronic voting and the new desktop

    By Martin Burrow, on 22 November 2013

Over the summer a new desktop service, ‘Desktop@UCL’, was rolled out to all cluster room, lecture theatre and kiosk PCs. As part of this project, the version of the software used for electronic voting was upgraded from version 4.3.2 (also known as TurningPoint 2008) to version 5.2.1.

If you have a personal installation of TurningPoint 2008, we recommend that you upgrade it to version 5.2.1. The download for TurningPoint 5.2.1 can be found on the Software Database.

Unfortunately, presentations created in one version cannot be run in the other. If you attempt to open a presentation created in TurningPoint 2008 in TurningPoint 5.2.1, it will prompt you to convert the file, which is a one-way process; there is no backwards conversion for presentations created in 5.2.1 back to version 4.3.2 (TurningPoint 2008). If you have presentations created in TurningPoint 2008 that you want to be able to use in either version, the best advice is to make two copies of the file. Label one ‘2008’ and use it with TurningPoint 2008; label the other ‘521’ and use it with TurningPoint 5.2.1.
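If you have a whole folder of presentations to split this way, a short script can do the copying for you. The following is a minimal sketch in Python, assuming hypothetical folder and file names (and .pptx files); it simply duplicates each presentation into ‘2008’ and ‘521’ labelled copies:

import shutil
from pathlib import Path

# Hypothetical folder containing the original TurningPoint presentations.
source_folder = Path("my_presentations")

for pptx in source_folder.glob("*.pptx"):
    # Skip files that have already been labelled.
    if pptx.stem.endswith(("_2008", "_521")):
        continue
    # Make one copy per TurningPoint version, e.g. lecture1_2008.pptx
    for label in ("2008", "521"):
        copy = pptx.with_name(f"{pptx.stem}_{label}{pptx.suffix}")
        if not copy.exists():
            shutil.copy2(pptx, copy)
            print(f"Created {copy.name}")

Remember that only the copy you convert in TurningPoint 5.2.1 will open there; the ‘2008’ copy stays with the old software.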

There are updated user guides for creating and delivering presentations with the new software:

    Creating a presentation with TurningPoint

    Delivering a presentation with TurningPoint

Support pages for electronic voting as a whole are here:

    https://www.ucl.ac.uk/isd/staff/e-learning/core-tools/e-voting

E-Learning Environments is happy to provide 1:1 and small-group support. In particular, we can usually offer to support staff the first time they use e-voting in action, which can provide much reassurance and confidence. We are also happy to advise on ways in which EVS can be used within teaching and on the design of effective voting questions.

If you have any questions about the use of electronic voting, please contact E-Learning Environments.

     

    Interactive lectures in Management Science & Innovation: A pilot evaluation study of LectureTools

    By Vicki Dale, on 12 August 2013

Jane Britton and Matt Whyndham recently piloted LectureTools with a small group of 17 students on a short course in project management. LectureTools is a cloud-based electronic voting system which students and their teachers can access via their laptops or mobile devices. The system works by importing an existing PowerPoint presentation and then adding interactivity to it through varied question formats. LectureTools allows students to take notes on their devices alongside each slide; they can also flag when they are confused about a particular slide, or submit questions, which are displayed on the tutor ‘dashboard’ (see the screenshots below).

     

[Screenshot: LectureTools presenter interface (the ‘dashboard’), showing an activity slide at the top, student responses and comprehension on the right, and a panel displaying student questions on the middle left. A preview of the adjacent slides is shown at the bottom of the screen.]

     

[Screenshot: LectureTools student interface, showing the PowerPoint slides on the left with the interactive options above and the note-taking area on the right.]

     

As E-Learning Evaluation Specialist within ELE, I carried out an evaluation, gathering data from a range of sources. These included an observation of one of Jane’s interactive lectures and a student questionnaire, followed by a focus group discussion with a sample of participants. Both educators were also interviewed at the end of the course. Students rated the system positively overall for stimulating their engagement in the subject, allowing them to measure their understanding, fostering discussion in the classroom and facilitating easy note-taking. In addition, they perceived that it helped them prepare for their forthcoming examination. Student comments included:

    “I liked the LectureTools a lot.  I’m really impressed by it. It’s so easy to use and so helpful and most of us nowadays work on computers anyway during the lecture so it just makes it easier not to write everything in Word, copy the slides, we have everything on one screen.”

    “We haven’t really asked a question to a lecturer but I think that’s great, that you can write a question and then the lecturer looks there and then they can answer it.”

     

    Both Jane and Matt felt it was helpful to know what students were thinking and to be able to provide timely feedback, although having a class of students all staring at their laptops at various points was initially disconcerting:

    “I think I notice that you get a lot of heads down working all of a sudden and it looks very disconcerting at first … you need to just be aware that they are working and they’re thinking about your stuff but they’re not looking at your face.”

     

    One potential issue that came out of the observation and the survey of students was the opportunity for distraction; this generally happened when students had typed in their responses to open questions and were waiting for other students to ‘catch up’:

    “I do think that the multiple choice questions, or putting the order questions, those are very good ones because all of us answered relatively quickly … so we had no time for distractions but the written ones … when you don’t have anything to do you start to do other things.”

     

Learning activities need to be carefully structured to give students enough time and opportunity to think about the topic, but not so much that they use the laptop to access resources unrelated to their studies. For this reason, the students, as well as Jane and Matt, considered that closed questions, such as multiple choice, might be better suited to large lectures than open questions.

    A working paper of this study will shortly be uploaded to UCL Discovery.

E-Learning Environments is working with staff in various departments around the university to explore the potential of LectureTools to facilitate interactive lectures. If you would like more information, or would like to pilot LectureTools or a comparable electronic voting system, please contact me or Janina Dewitz, our Innovations Officer.

    Voting with PollEverywhere

    By Jessica Gramp, on 18 February 2013

If you are interested in polling your students but don’t have access to Electronic Voting Handsets (either installed in a lecture theatre or lent to students), you could use PollEverywhere instead.

PollEverywhere is an online tool that lets you set up polls that students can answer either by sending a text message or by using the Internet on their laptop or smart device. It is free for up to 40 responses per poll, so for classes larger than this the free option may not be suitable. There are Higher Education plans available for those who need them. Try it out here: www.polleverywhere.com

     

    Online-only tools (with no text messaging capabilities) exist, but the ones I have looked at have several issues that would prevent me from using them myself or suggesting them to others.

    [edit: list of systems to avoid removed]

    If you are a UCL staff member you can contact E-Learning Environments (ELE) for further information about electronic voting.

    Clickers, clickers, everywhere

    By Matt Jenner, on 2 October 2012

This summer E-Learning Environments have installed clickers in three teaching spaces at UCL: the Harrie Massey Lecture Theatre in the Physics Building, the Cruciform LT1 and the Christopher Ingold Auditorium. Each room has every seat kitted out with a voting handset, and the front teaching PC has a USB receiver and the software installed. Read on for some images and educational musings to chew on…


    Association of Learning Technology Conference (ALT-C) Day 1

    By Jessica Gramp, on 11 September 2012

In the first plenary session of this year’s ALT-C conference, Eric Mazur from Harvard University spoke about how students’ brain activity slows during lectures. The highlighted area to the immediate left of the circled lecture periods in the graph below shows that students’ brains are more active during sleep than during traditional lectures. Eric argues that analysing classroom data is essential to improving teaching.

[Photo: Eric Mazur presenting a graph showing the brain activity of students during lectures (circled)]

So how do students actually learn?

Information transfer is the easy part. The hard part, where students need to understand the concepts, is often left to students to do on their own. Eric Mazur realised that most of his own “ah-hah” moments of understanding came outside the classroom. He now uses voting handsets to involve students in his lectures. After voting, he asks students to find someone who disagrees with their answer and then try to convince their neighbour why their own answer is correct. His collaborative approach to teaching ensures students stay engaged during lectures.

    Women in particular thrive in a collaborative environment as opposed to a competitive one, so they perform better when he involves them in his lectures.  He also encourages students to work together to complete their homework.

    Lecture demonstrations are not as effective as students doing the activity themselves because students may make incorrect assumptions about what the demonstrator has done to achieve the results. Asking students to predict the outcome of the demonstration, record their observation of the demo and then discuss whether they correctly predicted the outcome of the demonstration with their peers leads to a better understanding of the core concepts.

The reason for this is that “the brain stores models not facts.” You need to give students time to readjust their models in the lecture; otherwise they are more likely to continue to believe in their incorrect models. This effect is known in psychology as cognitive dissonance. Predicting, explaining and discussing the concepts makes a significant difference to students’ ability to absorb the correct models. The graph below shows a significant improvement in understanding by those students who had predicted the demonstration results, and an even higher improvement by those who also discussed their predictions after the demonstration.

[Photo: Eric Mazur showing the improvement in results as students are asked to predict and discuss the results of a demonstration]

    It’s difficult to teach students who have the wrong model, because teachers who understand the correct model find it difficult to understand where these students are coming from. Asking students to show their working out helps teachers to understand their misconceptions. Instead of just marking incorrect answers as wrong and leaving it at that, Eric Mazur argues that teachers should concentrate on understanding the thinking behind the incorrect answers. That way they can help students to re-adjust their thinking to incorporate the correct models.

    Read more: Classroom Demonstrations: Learning Tools or Entertainment?

Eric Mazur also asks students to tell him what they find difficult or confusing in their readings before the lecture. Students have to provide him with at least two concepts they found confusing, together with some feedback on why they found them confusing; if they found nothing difficult, they have to provide two examples of what they found interesting and why. He then adapts his lecture to address the areas students found most difficult to comprehend. This method is known as just-in-time teaching. You can find out more about it in the book Just in Time Teaching: Blending Active Learning with Web Technology (Novak et al., Prentice Hall, 1999).

[Photo: Eric Mazur’s research shows that confused students are around twice as likely to understand a concept as those who claim they understand it]

In Eric’s study, those students who mentioned they were confused by a concept were roughly twice as likely to demonstrate understanding as those who said they understood it, so “confusion doesn’t correlate with misunderstanding.” He concluded that students who claim to understand are likely to have passively read the material instead of properly comprehending it. It’s important to ask students to reflect on what they have read; one way to do this is to ask students to write their own analogy for difficult concepts. Eric Mazur says that “confusion is an essential part of the learning process…and should be elicited.”

    Read more: Understanding Confusion

    More information about Eric Mazur’s research is available from his website: http://mazur.harvard.edu