Digital Education team blog
  • We support staff and students using technology to enhance education at UCL.

    Here you'll find updates on institutional developments, projects we're involved in, updates on educational technology, events, case studies and personal experiences (or views!).

    Subscribe to our elearning newsletters.


    MoodleMoot 2017: Jo’s reflections

    By Joanna Stroud, on 8 May 2017

    My first two days as Digital Education’s new Distance Learning Facilitator (hi!) were spent at the UK and Ireland edition of MoodleMoot 2017 taking place in London. Presentations ranged from the more technical aspects of Moodle implementation to reports into its more pedagogically-driven uses and impacts. My note-taking over the course of a packed conference schedule was frenzied and now, upon writing this post, occasionally unintelligible, so rather than provide a full overview I’ll reflect upon two presentations in greater detail.

    A Head Start for Online Study: Reflections on a MOOC for New Learners. Presented by Prof. Mark Brown (Dublin City University)
    This project was described by Mark as a means of supporting flexible or distance learners’ transitions into higher education. Despite an established distance learning provision, DCU’s programmes had, like those at many institutions, experienced higher levels of attrition than more traditional face-to-face courses. Mark reported that this is largely attributable to the diverse motivations of flexible learners and a lack of support at key stages of the study life cycle. DCU thus applied for and gained funding to produce resources that would attempt to bridge these gaps and improve outcomes for flexible learners.

    DCU’s subsequent Student Success Toolbox, containing eight ‘digital readiness’ tools, and the Head Start Online course, piloted on the new Moodle MOOC platform Academy, aim to help potential flexible learners ascertain whether online higher education is right for them, how much time they have and need for study, their sources of support, and the skills they will need to be a successful online learner.

    Mark focused on the outcomes of the Head Start Online pilot course. Of the 151 users registered as part of the pilot, 37 were active after the first week and a total of 24 completed the entire course. However, Mark was keen to stress that learners were not expected to progress through the course in any strict or linear fashion, and completion/non-completion can thus be an unhelpful binary. Feedback from learners proved very positive, with the vast majority believing that they were more ready to become flexible learners, better equipped to manage their time, and more aware of the skills needed for online study after taking the course.

    More information:
    Head Start Online via Moodle Academy
    Student Success Toolbox
    Mark’s presentation from MoodleMoot

    Towards a Community of Inquiry through Moodle Discussion Forums. Presented by Sanna Parikka (University of Helsinki)
    Sanna’s presentation described her use of Moodle discussion forums to facilitate meaningful and constructive online conversations that adhere to the principles of the Community of Inquiry (CoI) framework. The CoI framework defines three vital elements of any educational experience:

    • Social presence: the ability of learners to communicate and engage in social interactions within the learning environment
    • Cognitive presence: the means by which learners can build meaning through reflection and discourse
    • Teaching presence: how we design, facilitate, and guide learners through experiences to achieve the desired learning outcomes.

    Sanna reported upon a range of approaches designed around the CoI framework, suggesting that it is possible to build social presence and give learners the chance to project their personalities online through simple ice breaker activities. Cognitive presence, meanwhile, can be developed through jigsaw learning activities. Cohorts are split into smaller groups of students who discuss and specialise in one specific topic before being redistributed evenly to new forums with specialists from each area and tasked with teaching their new group about their specialism. Teaching presence is built and threaded through each task by providing direct instruction, scaffolding understanding, facilitating discourse, and sharing personal interpretations of meaning.
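
    For the programmatically inclined, the regrouping logic behind a jigsaw activity can be sketched in a few lines of Python. The student names and topics below are invented for illustration; in practice Moodle’s own Groups tool would normally handle the allocation.

    ```python
    # A minimal sketch of jigsaw regrouping: phase 1 forms one
    # 'specialist' group per topic; phase 2 mixes them so every new
    # forum contains one specialist from each topic.

    def jigsaw_groups(students, topics):
        """Return (specialist groups by topic, mixed forums)."""
        n = len(topics)
        # Phase 1: deal students round-robin into one group per topic.
        specialist = {t: students[i::n] for i, t in enumerate(topics)}
        # Phase 2: the k-th member of each specialist group joins
        # mixed forum k, so each forum can be taught about every topic.
        size = min(len(g) for g in specialist.values())
        mixed = [[specialist[t][k] for t in topics] for k in range(size)]
        return specialist, mixed

    students = ["Ana", "Ben", "Caz", "Dev", "Eli", "Fay"]
    specialist, mixed = jigsaw_groups(students, ["forums", "wikis", "quizzes"])
    print(mixed)  # [['Ana', 'Ben', 'Caz'], ['Dev', 'Eli', 'Fay']]
    ```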

    Discussion forums are often unfairly criticised, most frequently for lack of student engagement. However, Sanna’s position was that basic interaction is not enough to develop engagement and create new meaning. Her framing and examples of practice underscored the forum as a versatile, flexible means of delivering not just discussion-based tasks but collaborative exercises too.

    More information:
    The Community of Inquiry (Athabasca University)
    M08 Add new learning forums

    Chronogogy – Time-led learning design examples

    By Matt Jenner, on 15 November 2013

    I recently blogged about a concept called chronogogy; a time-led principle of learning design the importance of which I’m trying to fathom. My approach is to keep blogging about it & wait until someone picks me up on it. Worst case, I put some ideas out in the public domain with associated keywords etc. Please forgive me.

    An example of chronogogically misinformed learning design

    A blended learning programme makes good use of f2f seminars. Knowing the seminar takes at least an hour to get really interesting, the teacher prefers to use online discussion forums to seed the initial discussions and weed out any quick misgivings. Using a set reading list, the teacher intends students to read before the session, be provoked to think about the topics raised, and address preliminary points in an online discussion. The f2f seminars are on Tuesdays and students have a week to go online and contribute. This schedule is repeated a few times during the twelve-week module.

    The problem is, only a handful of students ever post online and others complain that there’s “not enough time” to do the task each week. The teacher has considered making them fortnightly, but this isn’t really ideal either, as some may slip behind, especially when this exercise is repeated during the module.

    The argument in my previous post was that if the planning of the activity doesn’t correlate well with the activity patterns of its users, then it may increase the chance of disengagement.

    Example learner 1

    Tues – task set
    Wed–Sun – reading (finished by Sunday)
    Mon – contributes to forum
    Tues – attends seminar

    If a reading is set on Tuesday and completed by Sunday, the learner may only start considering their discussion points on Sunday or Monday night. This completes the task before Tuesday’s session, but does it make good use of the task?

    Example learner 2

    Tues – task set
    Wed–Fri – reading (finished by Friday)
    Sat – contributes to forum
    Sun – visits forum
    Mon – contributes to forum
    Tues – attends seminar

    The reading is set on Tuesday and completed by Friday, and the learner even posts to the forum on Saturday. By Sunday, when they come back to the forum, there’s not much there. They come back on Monday and can respond to Learner 1’s points, but it could be too late to really fire up a discussion. The seminar is the next day, Tuesday, which could increase the chance of discussion points being saved for that instead, as the online discussion may not seem worth adding to.

    These are two simplistic examples, but they prompt further questions:

    • Q: Can these two students ever have a valuable forum discussion?
    • Q: If this were scaled up, would the night before the seminar provide enough time for a useful online discussion?
    • Q: If Learner 3 had read the material immediately and posted on the Wednesday, what would the outcome have been?

    Any students posting earlier in the seven-day period may be faced with the silence of others still reading. Postings that come in late may be hampered by the fact that fewer visitors log on during the weekend. Therefore, unless people are active immediately after the seminar (i.e. read and post in the first day or two), any online discussion takes place on Monday – the day before the seminar.

    In this example a lot of assumptions are made, obviously, but it could happen.

    Development/expansion

    If this example were true, and it helps if you can believe it is for a moment, then what steps could be taken to encourage the discussion to start earlier?

    One thought could be to move the seminar to later in the week, say Thursday or Friday. Observing learners’ behaviour ‘out of class’ (offline and online) could give insight into the planning of sessions and activities. In the classic school sense, students are given a piece of homework and they fit it in whenever suits them. However, if that work is collaborative, i.e. working in a group or contributing towards a shared discussion, then the timing of their activity needs to align with the group, and with the timings known to be most effective.

    Time-informed planning

    Muffins got tired waiting for fellow students to reply to his post.

    Knowing study habits and preferences for offline and online study could make a difference here. If the teacher had given the students a different timeframe over the week, it might have altered contributions to the task. Data in the previous post indicates that learners access educational environments more during the week than at the weekend. An activity given on Friday and expected for Monday seems unfair on two levels: a weekend is an important time for a break, and weekends are not as busy as weekdays for online activity.

    If weekday access shows a clear pattern, then online tasks could be designed around that understanding of when learners are active, and doing so may affect the outcome.
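
    As a rough illustration of what ‘understanding access patterns’ could mean in practice, here’s a small Python sketch that tallies VLE accesses per weekday. The timestamps are invented for illustration; a real analysis would read them from Moodle’s access logs.

    ```python
    from collections import Counter
    from datetime import datetime

    # Count access events per weekday to reveal the weekly peaks and
    # troughs that time-informed planning could be built around.
    def accesses_by_weekday(timestamps):
        """Tally access events per weekday from ISO-8601 timestamps."""
        return Counter(
            datetime.fromisoformat(ts).strftime("%A") for ts in timestamps
        )

    sample_log = [
        "2013-11-04T19:30:00",  # Monday evening
        "2013-11-04T21:10:00",  # Monday evening
        "2013-11-05T18:05:00",  # Tuesday
        "2013-11-09T11:00:00",  # Saturday
    ]
    print(accesses_by_weekday(sample_log))
    ```

    An online activity could then be scheduled to open just before the busiest days rather than across a quiet weekend.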

    Does time-informed learning design make a difference?

    There’s only one way to know, really, and that’s to perform an experiment around a hypothesis. The examples above were based on a group/cohort discussion and made a lot of assumptions, but they provide a basis on which I want to conduct some further research.

    Time-based instruction and learning. Is activity design overlooked?

    In the examples, the teacher is making an assumption that their students will ‘find the time’. This is perfectly acceptable, but students may perform ‘time-finding’ better when they are also wrapped into a strong schedule, or structure, for their studies. Traditionally this is bound to the restrictions of timetabling/room access, teachers’ duties and the learners’ schedules (plus any other factors). But with online (or blended) learning the timetabling or time-planning duty is displaced into a new environment. This online space is marketed as open, personalised, in-your-own-time – all of which is very positive. However, it also comes with the negative aspect of self-organisation and could, possibly, be a little too loosely defined. Perhaps especially so when it’s no longer personal, but group or cohort based.

    There’s no intention here of mandating when learners should be online – that’s certainly not the point. It’s about being aware of when they might be online, and planning better around that. In the first instance, the intention is simply to see if this is even ‘a thing to factor in’.

    Chronology is the study of time. Time online is a stranger concept than time face-to-face. Face to face, the timing of a session is roughly an hour, or two. Online it could be the same, but not in one chunk. Fragmentation, openness and flexibility are all key components – learners can come and go whenever they like, and our logs showing how many UK connections are made to UCL Moodle at 3–5am demonstrate this quite clearly.

    Chronogogy is just a little branding for the foundation of the idea that instructional design, i.e. the planning and building of activities for online learning, may need to factor time into the design process. This isn’t to say ‘time is important’, but that understanding more about access patterns for users, especially (but not only) in online educational environments, could influence the timing and design of online activities. This could directly affect the student and teacher experiences. It naturally feeds back into f2f sessions too, where chronogogy has been considered to ensure that the blended components properly support the rest of the course.

    Time-led instructional design, or chronogogically informed learning design, could become ever more important for fully online courses that rely heavily on user-to-user interaction as a foundation of the student experience – for example at the Open University, which relies heavily on discussion forums, or in MOOCs, where learner-to-learner interaction is often the only viable form.

    Most online courses would state that student interaction is on the critical path to success. From credit-bearing courses to MOOCs, considering chronogogy within the course structure can inform design decisions early in the development process. This would be important when considering:

    • Planned discussions
    • Release of new materials
    • Synchronous activities
    • Engagement prompts

    In another example, MOOCs (in 2013) seem to attract a range of learners. Some are fully engaged: they participate in all the activities, review all the resources and earn completion certificates. Others do less than this – lurking in the shadows, as some may say – but still have a perfectly satisfactory experience. Research is being carried out into these engagement patterns, and much talk of increasing retention has been sparked within educational and political circles around MOOC and distance learning engagement and attrition.

    One factor to consider here is how you encourage activity in a large and disparate group. The fourth point above, engagement prompts, is a way of enticing learners back to the online environment. Something needs to bring them back, and this may be as simple as an email from the course lead. Data may suggest that sending this on a Saturday could have a very different result than sending it on a Tuesday.

    Engagement prompts as the carrot, or stick?

    Among the many areas still to explore: if learners were less active over the weekends, for example, would prompting them to action via an engagement prompt provide a positive or negative return? This could be addressed via an experiment.
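
    A sketch of how such an experiment might be analysed, with all learner names, counts and response rates invented for illustration – a minimal randomised comparison of prompt days, not a definitive design:

    ```python
    import math
    import random

    # Randomly assign learners to receive an engagement prompt on a
    # Saturday or a Tuesday, then compare the share who return to the
    # forum afterwards using a simple two-proportion z-statistic.

    def two_proportion_z(x1, n1, x2, n2):
        """z-statistic comparing two response proportions."""
        p1, p2 = x1 / n1, x2 / n2
        pooled = (x1 + x2) / (n1 + n2)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        return (p2 - p1) / se

    random.seed(1)
    learners = [f"learner{i}" for i in range(200)]
    random.shuffle(learners)
    saturday_group, tuesday_group = learners[:100], learners[100:]

    # Pretend the VLE logs told us who posted within 48 hours of the
    # prompt; these counts are made up for the sketch.
    sat_returned = 18
    tue_returned = 33

    z = two_proportion_z(sat_returned, 100, tue_returned, 100)
    print(f"Saturday {sat_returned}/100 vs Tuesday {tue_returned}/100, z = {z:.2f}")
    ```

    A large positive z here would suggest the Tuesday prompt genuinely outperforms the Saturday one, rather than the difference being noise.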

    Concluding thoughts

    I seem interested in this field, but I wonder about its true value. I’d be keen to hear your thoughts. Some things for me to consider are:

    • If there are peaks and troughs in access, what would happen if these could be levelled out?
    • How could further research be conducted (live or archived data sets)?
    • Have I missed something in the theory of learning design that is based on time-led instruction?
    • What would learners think of this? I should canvass their opinions.
    • Could I make a small modification to Moodle to record data to show engagement/forum posting to create a more focused data set?
    • Am I mad?


    E-Learning Development Grant (ELDG) scheme – 4 years of successful bids!

    By , on 14 April 2011

    2010_ELDG Berlingieri report

    For the past four years we’ve been given the support of the Office of the Vice Provost to ask UCL staff:

    “Would you like to develop the use of e-learning in your teaching?

    Do you have innovative ideas but need support putting them into practice?”

    It’s been those who’ve responded to this call with creativity, vision, and sometimes strong pragmatism that we’ve then worked with as part of UCL’s E-Learning Development Grant (ELDG) scheme.

    The scheme provides funds to further knowledge and experience of e-learning within UCL. It has previously been used to:

    • support the development of resources
    • carry out evaluations and technology reviews
    • promote innovative teaching methods
    • visit external institutions for inspiration and to compare practice

    A strong part of the ELDG process is sharing and learning from these experiences, so each year we ask successful bidders to report back so that other members of the community can build on their work. These reports now span four years in total and will definitely be of interest to applicants for 2011-12 funding or staff looking for inspiration to draw on. Though many projects are still ongoing, it’s been great to review the reports for completed projects so far, now available on the ELDG Reports page: Successful bids from previous years.

    Though there are more to come, reports from the 2010-11 session have been more detailed than in previous years. It has also been the first year to encourage video reports/presentations, which have been particularly engaging and informative and will now probably be a continuing feature of the scheme.

    This year the Office of the Vice Provost (Academic and International) has made available £40,000 to fund ELDG projects, more than ever before. However, this coming academic year is also the culmination of UCL’s efforts to have all taught modules on Moodle to a ‘baseline’ standard. Of course, many modules already on Moodle have been there for some time and have gone well beyond baseline use. Recognising this and encouraging an enhanced use of Moodle is therefore a strong strand in this year’s grants, and proposals including innovative uses of Moodle or combinations with other UCL-integrated technologies are eagerly anticipated. (See the ELDG Themes and inspiration page.)

    For those thinking of applying for an ELDG grant, the deadline is fast approaching (April 28th!), so we suggest looking over previous years’ successful bids and the themes and ideas page to get some ideas, reading the criteria for application and then applying.

    Last year we received over 40 applications so we look forward to seeing what this new year brings in terms of new practice, ideas and innovation!

    If you have any questions do feel free to contact us.

    Video and pedagogy – what questions should we be asking now?

    By Clive Young, on 10 March 2011

    The third ViTAL webinar on video in education took place on 9 March 2011, attracting 42 attendees and generating a lively discussion.

    It was presented by Clive Young (LTSS) and chaired by John Conway (Imperial).

    The slides are here:

    The Adobe Connect recording can be found at the following link: Video and pedagogy – what questions should we be asking now?


    A different way to connect.

    By Rod Digges, on 15 November 2010

    Over breakfast at a recent conference on the use of the Echo360 (Lecturecast) system, I found myself talking to an LTA (Learning Technology Advisor) from a small US community college. He had recently been working with teachers from the Math School at the college, helping them transform their existing paper-based courses for online delivery.
    One of the last, and most reluctant, members of staff to go online was a senior member of the school’s teaching staff, who met with the LTA regularly to discuss ideas for the new course. As the course’s live date approached, the LTA suggested that an online discussion forum be included; a place where students could share ideas, or give feedback about the course – the LTA also advised that it was good practice to prime a forum with one or more initial posts to ‘get the ball rolling’. The Maths teacher doubted the value of ‘this kind of thing’ but said that he’d think about it.
    The new term began, the course was made live but it was a couple of weeks before the LTA and the Maths teacher had a chance to meet and review how things were going. When they did finally meet the LTA was pleased to hear that the course had been well received and asked his colleague what he had found most useful.
    The Maths teacher said that he had taken up the suggestion of including a discussion forum and, to get the ball rolling, had posted a question for all students: ‘What does Maths mean to you in your life?’. This was a question that, over his years of teaching, he had always asked every group of students at their first lecture – observing sadly that he rarely got much of a response.
    The teacher said that asking the same question in an online forum had made a big difference. The LTA told me that there were tears in his colleague’s eyes as he talked about the many messages in the forum, and how a number of students had written about the beauty and elegance of mathematics, describing a passion for the subject that matched his own – he said the replies had inspired him, and that his teaching with this group had an energy and enthusiasm he hadn’t felt for years.

    The Lecturecast conference covered many interesting uses of this very impressive technology but, a few months later, trying to think of a subject for this blog, it’s the story of the Maths teacher and his students that sticks in my mind – and how the use of a much simpler technology gave them a different way to connect.

    E-assessment 2.0 – making assessment Crisper…

    By Fiona Strawbridge, on 15 September 2010

    CALT organised a stimulating presentation by Prof Geoffrey Crisp of the University of Adelaide about assessment in the Web 2.0 world. There is much information at http://www.transformingassessment.com and a similar presentation is on Slideshare.

    Crisp calls for much more ‘authentic’ learning and assessment – the need to set big questions; for instance in aeronautical engineering we should set students a task to build a rocket in 3 years. This allows them to see reasons for the smaller things. The tendency with conventional assessment is for everything to become very granular – little learning outcomes are assessed with discrete assessment tasks which don’t encourage students to make connections, and which encourage surface and strategic rather than deep approaches to learning.

    Of course moving away from more traditional forms of assessment entails proving that the alternative works – traditional approaches are very deeply engrained in the culture of institutions and are not easily challenged. Crisp acknowledged that even in his own institution there is some way to go.

    Three points to start with:

    1. Assessment tasks should be worth doing – if students can get answers by copying from the web, asking Google, or guessing, then the task is not worth doing. We need to stop setting tasks which are about information, since information is everywhere.

    2. We should separate diagnostic assessment from formative assessment. Diagnostic assessment is essential before teaching and can be an excellent way of starting a relationship with students at the outset. The teacher can then build their teaching on students’ current level of understanding.

    3. Think about assessment tasks which result in divergent rather than convergent responses. In the traditional approach we tend to seek convergent responses, in which all students are expected to come up with the same answer, but divergent responses are more authentic. Peer- and self-review approaches can support this.

    Bearing this in mind, and drawing on the work of Bobby Elliott (see http://www.scribd.com/doc/461041/Assessment-20), we heard that:

    • Assessment 1.0 is traditional assessment – paper-based, classroom-based, synchronous in time and space, formalised and controlled.
    • Assessment 1.5 is basic computer-assisted assessment – using quizzes which tend to replicate the paper-based experience, and portfolios used mainly as storage for students’ work. Tasks tend to be done alone – competition is encouraged and collaboration is cheating. They tend to encourage a focus on passing the test rather than on gaining knowledge, skills and understanding, and don’t lead to deeper levels of learning (indeed, Elliott argues that factual knowledge is valueless in the era of Wikipedia and Google).
    • Assessment 2.0 is tool-assisted assessment, in which students do things using a variety of tools and resources and then simply use the VLE (typically) to submit the results. This kind of assessment is typically authentic, personalised, negotiated, engaging, researched, problem-oriented, collaborative, done anywhere, peer- and self-assessed; it recognises existing skills, assesses deeper levels of learning, and is supported by IT tools, especially the open web.

    Some nice examples of interactive e-assessment 2.0 design included:

    • Examine a QuickTime VR image of a geological formation, then answer questions based on it – drawing on things students wouldn’t be able to see from a static image.
    • Examine a panograph (a scrolling, zoomable image) of the Bayeux Tapestry and answer questions drawing together different parts – students select evidence from different segments of the tapestry.
    • Interactive spreadsheets – Excel with macros. Students can change certain cells and answer questions on the resulting trends in graphs. These can have nested response questions, so that the answer to the second is based on the first (but care is needed with dependencies so that a wrong move early on doesn’t lead to total failure).
    • Chemical structures using the Molinspiration tool. Students can draw molecular structures using the tool and copy and paste the resulting text string into an answer held in the VLE quiz tool.
    • Problem solving using a tool called IMMEX (‘It Makes You Think’) which tracks how students approach problems.  The tutor adds in real, redundant and false information that the students can draw on to solve the problem.  They can use it all but the more failed attempts they make the fewer marks they get. We saw an archaeology example in which students had to date an artefact.
    • Role plays which can be done using regular VLE features such as announcements, discussion forums, wikis.  Students adopt different personas and enter into discussion and debate through those personas.
    • Scenario based learning – this is more prescriptive than role play. The recommended tool is Pblinteractive.com
    • Simulations – the Bized.co.uk site offers a virtual bank and factory. Students can work within bized then answer questions in the VLE.
    • Second Life (virtual world) assessment in which the avatar answers questions which go back into Moodle.

    Examples of these and more are available through the http://www.transformingassessment.com/ site – it’s Moodle-based and anyone with a .ac.uk email address can self-register and try out the various tasks. (They also run a series of webinars.)

    Crisp argues convincingly for much more authentic and immersive assessment, and for assessments in which process as well as outcome is evaluated – for example, approaches to problem solving, efficiency, ethical considerations, and the involvement of others.

    A good closing question was whether teachers will be able to construct future assessments or will this be a specialist activity. Is it all going to get too hard for people? There may be a need for more team based approaches in future.

    Useful resources

    Boud, D., 2009, Assessment 2020 – Seven propositions for assessment reform in higher education, Available at: http://www.iml.uts.edu.au/assessment-futures/Assessment-2020_propositions_final.pdf

    Crisp, G., 2007, The e-Assessment Handbook. Continuum International Publishing Group Ltd

    Crisp, G., 2009, Designing and using e-Assessments. HERDSA Guide, Higher Education Research Society of Australasia

    Elliott, B., 2008, Assessment 2.0 – Modernising assessment in the age of Web 2.0. Available at: http://www.scribd.com/doc/461041/Assessment-20