E-Learning Environments team blog
  • ELE Group
    We support Staff and Students using technology to enhance teaching & learning.

    Here you'll find updates on developments at UCL, links & events as well as case studies and personal experiences. Let us know if you have any ideas you want to share!


  • Archive for the 'Electronic voting systems' Category

    Helping us to help you

    By Domi C Sinclair, on 16 December 2014

    When you have a problem or question, E-Learning Environments (ELE) are always more than happy to hear from you, and we will do all we can to help you as quickly as possible. However, this process can be slowed down if we don't have all the information we need to investigate your problem or answer your question. So here are some top tips for what to include in an email or ticket to ELE, so you can help us to help you.

    1. Course name (and link)

    UCL is a large university with hundreds of courses, and even more modules. It is therefore very difficult for us to investigate a problem without knowing the name of the course or module, so that we can look at it and try to replicate the problem. A lot of problem solving is reverse engineering: we replicate the problem for ourselves and then work out what is wrong, using our familiarity with the components of the technology. It is also helpful to include a link to the course or module in question, as these are sometimes not obvious when searching in Moodle or Lecturecast. Asking for the course name is always our first step, so by including it in your original email you will save time and help us resolve the problem faster.

    2. Activity/ resource name (and link)

    As well as there being a lot of courses at UCL, an individual course may have more than one of a particular activity, such as a Turnitin assignment or forum. It takes ELE extra time to search through all of them to find the problem, and it also means we cannot always be sure we have found the right one. By including the name and location of the activity in your original email, ELE can go straight to it and get to work determining the problem.

    3. Screenshots

    When we look at a course, it might not always be possible for ELE to replicate a problem. This might be because the issue is related to the particular browser you are using, or to permissions on your account. As these conditions might not apply to ELE, we may not be able to see the problem, which makes it much harder for us to help. If you can take a screenshot (using the PrtScn key), paste it into a document and send it as an attachment, it will help us see the problem and any error messages you are receiving. It can even mean we can answer the question or give a solution straight away upon seeing the screenshot.
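
    For staff who are comfortable with a little scripting, the capture-and-save step can also be automated. The sketch below is only an optional alternative to the PrtScn route described above; it assumes Python with the Pillow library installed, and the output file name is just an example.

        # A minimal sketch, assuming Python with the Pillow library (pip install Pillow):
        # capture the full screen and save it as a PNG ready to attach to an email or ticket.
        # Note: ImageGrab works on Windows and macOS; Linux may need extra setup.
        from PIL import ImageGrab

        screenshot = ImageGrab.grab()            # capture the entire screen
        screenshot.save("error_message.png")     # example file name; attach this to your ticket
        print("Saved screenshot to error_message.png")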

    4. Error messages

    Screenshots of error messages are good, but if you can't take one, then including the text of the error message will help ELE to diagnose and resolve the problem. It also helps us if we have to deal with any third-party suppliers (such as Turnitin).

    5. Specifics

    A summary of the problem is best, as ELE might not have a lot of time to read a long email, and it may be possible to determine and resolve an issue with only the few key details listed above. However, it can also help to be specific. If you are reporting a problem, list the steps you are taking that cause the problem: which buttons are you clicking, and in what order? Details are also helpful if you are asking about a new activity you'd like to start but are not sure which tool to use. If you include specific details about what you want to do, then ELE can suggest the tool that best fits your needs.

    By following these tips you will have an easier and quicker experience with ELE, and we will be able to get through more problems or questions in less time.

    Please feel free to send your queries to ELE via our email address, ele@ucl.ac.uk.

    HEA Senior Fellowship Case Study Series: 4 – Researching learner interaction and engagement with in-class response systems

    By Matt Jenner, on 15 August 2014

    In this four-part series I am openly publishing the case studies I previously submitted for my Senior Fellowship of the Higher Education Academy. I submitted my application in February 2014. If you're interested in this professional recognition programme, please visit their webpages and look through the Professional Standards Framework (PSF). UCL runs an institutional model for fellowships called ARENA; your institution may run one too – speak to people!

    Case Study 4 – Researching learner interaction and engagement with in-class response systems

    In 2012 I conducted research, in parallel with my job at UCL, focusing on increasing student interaction and staff engagement with an in-class question and response system colloquially known as ‘clickers’. Evidence suggests clickers provide interaction opportunities that stimulate and engage learners[1] and have a benign or positive effect on student performance[2]. Clickers are popular across many disciplines, in particular the physical sciences, but uptake is particularly low in the medical sciences.

    I wanted to address this shortcoming directly, so I enlisted two academics in the UCL Medical School. I assimilated the current method of teaching and the materials used (K1). From here we adapted a learning activity to align with the new tool being applied (A1). I underpinned the use of the technology with existing literature and the evidence for realigning the ‘sage on the stage’ to the ‘guide on the side’[3] (K2), an approach which evidence suggests is effective for learning and teaching (K3, V3). I provided pre-lecture technical support to reduce technical barriers and was on hand in the lecture to help as and when needed (A2). Questions were designed into the lectures and the clickers provided immediate feedback (A3). Staff reacted to the clicker data with an approach called ‘contingent teaching’[4], dynamically responding to the answers and feedback provided (A3).

    I designed evaluation questions for each lecture based on Bloom's Taxonomy[5] for a learner-based evaluation of the teaching approach and learning outcomes (A4). Questions were derived by categorising Bloom into three sub-categories: remember or understand, apply or analyse the topic, and evaluate or create new knowledge (K5). When questioned, 74% of students agreed or strongly agreed that the clickers and the related teaching approach encouraged interaction and helped to achieve metacognitive learning (K5). I integrated these data with post-lecture interviews with the lecturers. Using this analysis, we designed next steps for future use and identified gaps and areas for improvement (A5).
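
    To make the tallying step concrete, here is a minimal sketch of how such agreement percentages could be computed from Likert-scale responses grouped by the three Bloom sub-categories. The response data below are invented purely for illustration; they are not the study data.

        # A minimal sketch, assuming Python: tally hypothetical Likert-scale responses
        # into a percentage of "agree"/"strongly agree" answers per Bloom sub-category.
        from collections import Counter

        AGREE = {"agree", "strongly agree"}

        # Invented example responses, keyed by Bloom sub-category.
        responses = {
            "remember/understand": ["agree", "neutral", "strongly agree", "agree"],
            "apply/analyse":       ["strongly agree", "agree", "disagree", "agree"],
            "evaluate/create":     ["agree", "neutral", "agree", "strongly agree"],
        }

        for category, answers in responses.items():
            counts = Counter(answer.lower() for answer in answers)
            agreed = sum(counts[a] for a in AGREE)
            percentage = 100 * agreed / len(answers)
            print(f"{category}: {percentage:.0f}% agreed or strongly agreed")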

    I conducted evidence-based research and followed best practice around clickers to ensure their inclusion was academically merited (V3). Measuring (and increasing) engagement within the traditional lecture aimed to promote participation for learners (V2). It was understood that clickers do not directly enhance learning but can lead to higher-order learning. I used my understanding of the wider field of evidence to define their most appropriate use within the lectures (V1, V3).

    By implementing a technology that was new to staff and guiding them with appropriate techniques known to increase interaction and engagement, I provided an evidence-informed approach that can be used to transform didactic content delivery into something more engaging. My research adds to a disproportionately small body of knowledge about clickers in medical education, and the study overall was positive. The staff involved still use the clickers, and the impact I measured, together with the evidence collected, can be used to promote clickers within UCL, the Medical School and beyond. It earned me a Distinction in my MSc Learning Technologies and furthered my ambition to make a lasting, positive difference to higher education.

    (493 words)

    HEA Professional Standards Framework links referenced in this case study:

    Areas of Activity

    • A1 Design and plan learning activities and/or programmes of study
    • A2 Teach and/or support learning
    • A3 Assess and give feedback to learners
    • A4 Develop effective learning environments and approaches to student support and guidance
    • A5 Engage in continuing professional development in subjects/disciplines and their pedagogy, incorporating research, scholarship and the evaluation of professional practices

    Core Knowledge

    • K1 The subject material
    • K2 Appropriate methods for teaching, learning and assessing in the subject area and at the level of the academic programme
    • K3 How students learn, both generally and within their subject/disciplinary area(s)
    • K5 Methods for evaluating the effectiveness of teaching

    Professional Values

    • V1 Respect individual learners and diverse learning communities
    • V2 Promote participation in higher education and equality of opportunity for learners
    • V3 Use evidence-informed approaches and the outcomes from research, scholarship and continuing professional development


    [1] Bligh, D.A. (2000). What's the Use of Lectures? London/San Francisco: Jossey-Bass.

    [2] http://w.lifescied.org/content/6/1/9.short

    [3] King, A. (1993). From Sage on the Stage to Guide on the Side. College Teaching, Vol. 41, No. 1, pp. 30-35. Taylor & Francis Ltd.

    [4] Beatty, I.D., Gerace, W.J., Leonard, W.J. and Dufresne, R.J. (2006). Designing effective questions for classroom response teaching. American Journal of Physics, Vol. 74, pp. 31-39.

    [5] Bloom, B.S. (1956). Taxonomy of Educational Objectives, Handbook I: The Cognitive Domain. New York: David McKay Co Inc.

    Electronic voting and the new desktop

    By Martin Burrow, on 22 November 2013

    Over the summer a new desktop service, ‘Desktop@UCL’, was rolled out to all Cluster room, Lecture Theatre and Kiosk PCs. As part of this project, the version of the software used for electronic voting was upgraded from version 4.3.2 (also known as TurningPoint 2008) to version 5.2.1.

    If you have a personal installation of TurningPoint 2008, we recommend that you upgrade it to version 5.2.1. The download for TurningPoint 5.2.1 can be found on the Software Database.

     

    Unfortunately, presentations created in one version cannot be run in the other. If you attempt to open a presentation created in TurningPoint 2008 in TurningPoint 5.2.1, it will prompt you to convert the file, which is a one-way process. There is no backwards conversion for presentations created in 5.2.1 to version 4.3.2 (TurningPoint 2008). If you have presentations created in TurningPoint 2008 that you want to be able to use in either version, the best advice is to make two copies of the file: label one ‘2008’ and use it with TurningPoint 2008, and label the other ‘521’ and use it with TurningPoint 5.2.1.
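
    If you have several presentations to duplicate, a short script can do the labelling for you. The sketch below is only an illustration of the two-copies advice above, assuming Python is available; the file name is hypothetical, so substitute your own presentation.

        # A minimal sketch, assuming Python: create two labelled copies of a
        # TurningPoint presentation, one to keep for TurningPoint 2008 and one
        # for version 5.2.1. The file name below is an example only.
        from pathlib import Path
        import shutil

        original = Path("lecture_questions.pptx")       # your existing presentation
        for label in ("2008", "521"):
            copy = original.with_name(f"{original.stem}_{label}{original.suffix}")
            shutil.copy2(original, copy)                # e.g. lecture_questions_2008.pptx
            print(f"Created {copy}")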

    There are updated user guides for creating and delivering presentations with the new software:

    Creating a presentation with TurningPoint

    Delivering a presentation with TurningPoint

    Support pages for Electronic Voting as a whole are here:

    https://www.ucl.ac.uk/isd/staff/e-learning/core-tools/e-voting

    E-Learning Environments is happy to provide one-to-one and small-group support. In particular, we can usually offer to support staff the first time they use electronic voting in action, which can provide much reassurance and confidence. We are also happy to advise on ways in which EVS can be used within teaching, and on the design of effective voting questions.

    If you have any questions about the use of Electronic Voting then please contact E-Learning Environments.

     

    Interactive lectures in Management Science & Innovation: A pilot evaluation study of LectureTools

    By Vicki Dale, on 12 August 2013

    Jane Britton and Matt Whyndham recently piloted LectureTools with a small group of 17 students on a short course in project management. LectureTools is a cloud-based electronic voting system which students and their teachers can access via their laptops or mobile devices. The system works by importing an existing PowerPoint presentation and then adding interactivity to it through varied question formats. LectureTools allows students to take notes on their devices alongside each slide; they can also flag when they are confused about a particular slide, or submit questions, which are displayed on the tutor ‘dashboard’ (see the screenshots below).

     

    [Image: LectureTools presenter screenshot]

    LectureTools presenter interface  (the ‘dashboard’), showing an activity slide at the top, student responses and comprehension on the right, and a panel displaying student questions on the middle left. A preview of the adjacent slides is shown at the bottom of the screen.

     

    [Image: LectureTools student screenshot]

    LectureTools student interface, showing the PowerPoint slides on the left with the interactive options above and the note-taking area on the right.

     

    As E-Learning Evaluation Specialist within ELE, I carried out an evaluation, gathering data from a range of sources. These included an observation of one of Jane’s interactive lectures and a student questionnaire followed by a focus group discussion with a sample of participants. Both educators were also interviewed at the end of the course. Students rated the system positively overall for stimulating their engagement in the subject, allowing them to measure their understanding, fostering discussion in the classroom and facilitating easy note-taking. In addition, they perceived that it helped them prepare for their forthcoming examination.  Student comments included:

    “I liked the LectureTools a lot.  I’m really impressed by it. It’s so easy to use and so helpful and most of us nowadays work on computers anyway during the lecture so it just makes it easier not to write everything in Word, copy the slides, we have everything on one screen.”

    “We haven’t really asked a question to a lecturer but I think that’s great, that you can write a question and then the lecturer looks there and then they can answer it.”

     

    Both Jane and Matt felt it was helpful to know what students were thinking and to be able to provide timely feedback, although having a class of students all staring at their laptops at various points was initially disconcerting:

    “I think I notice that you get a lot of heads down working all of a sudden and it looks very disconcerting at first … you need to just be aware that they are working and they’re thinking about your stuff but they’re not looking at your face.”

     

    One potential issue that came out of the observation and the survey of students was the opportunity for distraction; this generally happened when students had typed in their responses to open questions and were waiting for other students to ‘catch up’:

    “I do think that the multiple choice questions, or putting the order questions, those are very good ones because all of us answered relatively quickly … so we had no time for distractions but the written ones … when you don’t have anything to do you start to do other things.”

     

    Learning activities need to be carefully structured to give students enough time and opportunity to think about the topic, but not so much that they use the laptop to access resources unrelated to their studies. For this reason, the students, as well as Jane and Matt, considered that closed questions such as multiple choice might be better suited to large lectures than open questions.

    A working paper of this study will shortly be uploaded to UCL Discovery.

    E-Learning Environments is working with staff in various departments around the university to explore the potential of LectureTools to facilitate interactive lectures. If you would like more information, or would like to pilot LectureTools or a comparable electronic voting system, please contact me or Janina Dewitz, our Innovations Officer.

    Voting with PollEverywhere

    By Jessica Gramp, on 18 February 2013

    If you are interested in polling your students but don't have access to Electronic Voting Handsets (either installed in a lecture theatre or lent to students), you could use PollEverywhere instead.

    PollEverywhere is an online tool that lets you set up polls that students can answer, either by sending a text message or by using the Internet on their laptop or smart device. It is free for up to 40 responses per poll, so for classes larger than this the free option may not be suitable. There are Higher Education plans available for those who need them. Try it out here: www.polleverywhere.com

     

    Online-only tools (with no text messaging capabilities) exist, but the ones I have looked at have several issues that would prevent me from using them myself or suggesting them to others.

    [edit: list of systems to avoid removed]

    If you are a UCL staff member you can contact E-Learning Environments (ELE) for further information about electronic voting.

    Clickers, clickers, everywhere

    By Matt Jenner, on 2 October 2012

    This summer E-Learning Environments have installed clickers in three teaching spaces at UCL: the Harrie Massey in the Physics Building, the Cruciform LT1 and the Christopher Ingold Auditorium. Each room has every seat kitted out with a voting handset, and the front teaching PC has a USB receiver and the software installed. Read on for some images and educational musings to chew on…

    (more…)