Digital Education team blog
  • We support Staff and Students using technology to enhance education at UCL.

    Here you'll find updates on institutional developments, projects we're involved in, updates on educational technology, events, case studies and personal experiences (or views!).


    Chronogogy – Time-led learning design examples

    By Matt Jenner, on 15 November 2013

I recently blogged about a concept called chronogogy: a time-led principle of learning design whose importance I’m trying to fathom. My approach is to keep blogging about it and wait until someone picks me up on it. Worst case, I’ve put some ideas out in the public domain with the associated keywords. Please forgive me.

    An example of chronogogically misinformed learning design

A blended learning programme makes good use of f2f seminars. Knowing that a seminar takes at least an hour to get really interesting, the teacher prefers to use online discussion forums to seed the initial discussions and weed out any quick misgivings. Using a set reading list, the intention is that before the seminar students read the material, are provoked to think about the topics raised, and address preliminary points in an online discussion. The f2f seminars are on Tuesdays and students have a week to go online and contribute. This schedule is repeated a few times during the twelve-week module.

    The problem is, only a handful of students ever post online and others complain that there’s “not enough time” to do the task each week. The teacher has considered making them fortnightly, but this isn’t really ideal either, as some may slip behind, especially when this exercise is repeated during the module.

The argument in my previous post was that if the planning of an activity doesn’t correlate well with the activity patterns of website users, it may increase the chance of disengagement.

    Example learner 1


Tues – Task set
Wed – Reading start
Sun – Reading finish
Mon – Contributes to forum
Tues – Attends seminar


If a reading is set on Tuesday and completed by Sunday, the learner may only start considering their discussion points on Sunday or Monday night. This completes the task before Tuesday’s session, but does it make good use of the task?

    Example learner 2


Tues – Task set
Wed – Reading start
Fri – Reading finish
Sat – Contributes to forum
Sun – Visits forum
Mon – Contributes to forum
Tues – Attends seminar

The reading is set on Tuesday and completed by Friday; the learner even posts to the forum on Saturday. When they come back to the forum on Sunday, there’s not much there. They come back on Monday and can respond to Learner 1’s points, but it could be too late to really fire up a discussion. The seminar is the next day, Tuesday, which could increase the chance of discussion points being saved for that instead, as the online discussion may not seem worth adding to.

These are two simplistic examples, but they raise further questions:

• Q: Can these two students ever have a valuable forum discussion?
• Q: If this were scaled up, would the night before the seminar provide enough time for a useful online discussion?
• Q: If a Learner 3 had read the material immediately and posted on the Wednesday, what would the outcome have been?

Any students posting earlier in the seven-day period may be faced with the silence of others still reading. Posts made later risk landing over the weekend, when fewer visitors log on. Therefore, unless people are active immediately after the seminar (i.e. read and post in the first day or two), any online discussion effectively takes place on Monday – the day before the seminar.

    In this example a lot of assumptions are made, obviously, but it could happen.


    If this example were true, and it helps if you can believe it is for a moment, then what steps could be taken to encourage the discussion to start earlier?

One thought could be to move the seminar to later in the week, say Thursday or Friday. Observing learners’ behaviour ‘out of class’ (offline and online) could give insight into the planning of sessions and activities. In the classic school sense, students are given a piece of homework and they fit it in whenever suits them. However, if that work is collaborative, i.e. working in a group or contributing towards a shared discussion, then the timing of their activity needs to align with the group, and with the timings known to be most effective.

    Time-informed planning

    Muffins got tired waiting for fellow students to reply to his post.


Knowing study habits and preferences for offline and online study could make a difference here. If the teacher had given the students a different window during the week, it might have altered contributions to the task. Data in the previous post indicates that learners access educational environments more during the week than at the weekend. An activity set on Friday and expected for Monday seems unfair on two levels: a weekend is an important time for a break, and weekends are quieter than weekdays for online activity.

If weekday access follows a pattern, then an online task could be designed around an understanding of that pattern, and this may affect the outcome.
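As a rough sketch of what ‘designing around access patterns’ might look like in practice, the snippet below counts forum posts by day of the week from a list of timestamps. All of the names and timestamps here are invented for illustration – this is not real UCL Moodle data:

```python
from collections import Counter
from datetime import datetime

# Hypothetical post timestamps exported from a VLE forum (illustrative only).
post_times = [
    "2013-11-04 20:15", "2013-11-05 09:30", "2013-11-05 21:05",
    "2013-11-07 18:45", "2013-11-10 22:10", "2013-11-11 19:55",
]

# Count posts per weekday to reveal when learners actually contribute.
weekday_counts = Counter(
    datetime.strptime(t, "%Y-%m-%d %H:%M").strftime("%A") for t in post_times
)

# A teacher could compare these counts against the activity schedule: if most
# posts land on Monday, the night before a Tuesday seminar, the discussion
# window is effectively one day, not seven.
for day, n in weekday_counts.most_common():
    print(day, n)
```

Even a crude weekday tally like this would show whether the week-long window is really being used, or whether everything bunches up the night before the seminar.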

    Does time-informed learning design make a difference?

There’s only one way to know, really, and that’s to perform an experiment around a hypothesis. The examples above were based on a group/cohort discussion and made a lot of assumptions, but they provide a basis from which I want to conduct some further research.

    Time-based instruction and learning. Is activity design overlooked?

In the examples, the teacher is making an assumption that their students will ‘find the time’. This is perfectly acceptable, but students may perform ‘time-finding’ better when they are also wrapped into a strong schedule, or structure, for their studies. Traditionally this is bound to the restrictions of timetabling/room access, the teacher’s duties and the learners’ schedules (plus any other factors). But with online (or blended) learning, the timetabling or time-planning duty is displaced into a new environment. This online space is marketed as open, personalised, in-your-own-time – all of which is very positive. However, it also comes with the negative aspect of self-organisation and could, possibly, be a little too loosely defined. Perhaps especially so when it’s no longer personal, but group or cohort based.

There’s no intention here of mandating when learners should be online – that’s certainly not the point. It’s about being aware of when they might be online, and planning better around that. In the first instance, the intention is to see whether this is even ‘a thing to factor in’.

Chronology is the study of time. Time online is a stranger concept than time face-to-face. In a f2f session the timing is roughly an hour, or two. Online it could be the same, but not in one chunk. Fragmentation, openness and flexibility are all key components – learners can come and go whenever they like, and our logs showing how many UK connections are made to UCL Moodle at 3–5am show this quite clearly.

Chronogogy is just a little branding for the foundation of the idea that instructional design, i.e. the planning and building of activities for online learning, may need to factor time into the design process. This isn’t to say ‘time is important’, but that understanding more about users’ access patterns, especially (but not only) in online educational environments, could influence the timing of online activities and the design of that timing. This could directly affect the student and teacher experiences. It could naturally feed back into f2f sessions too, where the chronogogy has been considered to ensure that the blended components properly support the rest of the course.

Time-led instructional design, or chronogogically informed learning design, could become ever more important for fully online courses that rely heavily on user-to-user interaction as a foundation of the student experience – for example the Open University, which relies heavily on discussion forums, or MOOCs, where learner-to-learner interaction is the only viable form.

Most online courses would state that student interaction is on the critical path to success. From credit-bearing courses to MOOCs, factoring chronogogy into the course structure can inform design decisions early in the development process. This would be important when considering:

    • Planned discussions
    • Release of new materials
    • Synchronous activities
    • Engagement prompts*

In another example, MOOCs (in 2013) seem to attract a range of learners. Some are fully engaged, participate in all the activities, review all the resources and earn completion certificates. Others do less than this, lurking in the shadows as some may say, but still have a perfectly satisfactory experience. Research is being performed into these engagement patterns, and much talk of increasing retention has sparked within educational and political circles, for both MOOC and distance learning engagement/attrition.

One factor to consider here is how you encourage activity in a large and disparate group. The fourth point above, engagement prompts, is a way of enticing learners back to the online environment. Something needs to bring them back, and this may be something simple like an email from the course lead. Data may suggest that sending this on a Saturday could have a very different result than sending it on a Tuesday.

    Engagement prompts as the carrot, or stick?

Among the many areas still to explore is whether, if learners were less active over the weekends, prompting them to action – i.e. via an engagement prompt – would provide a positive or negative return. This could be addressed via an experiment.

    Concluding thoughts

I seem interested in this field, but I wonder about its true value. I’d be keen to hear your thoughts. Some things for me to consider are:

• If there are peaks and troughs in access, what would happen if these could be levelled out?
• How could further research be conducted (live or archived data sets)?
• Have I missed something in the theory of learning design that is based on time-led instruction?
• I wonder what learners would think of this – canvass their opinions.
• Could I make a small modification to Moodle to record engagement/forum posting data, to create a more focused data set?
• Am I mad?
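On the Moodle question in that last bullet, one option might be to query the existing forum post table rather than modify Moodle itself. The sketch below uses an in-memory SQLite database to stand in for Moodle’s mdl_forum_posts table, whose ‘created’ column holds a unix timestamp (real Moodle installs run on MySQL or Postgres, and all the rows and user ids here are invented):

```python
import sqlite3
from datetime import datetime, timezone

# Stand-in for a slice of Moodle's mdl_forum_posts table; 'created' holds
# a unix timestamp. The rows and user ids below are invented.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE mdl_forum_posts (id INTEGER, userid INTEGER, created INTEGER)"
)
sample = [
    (1, 10, int(datetime(2013, 11, 11, 21, 0, tzinfo=timezone.utc).timestamp())),   # Monday
    (2, 11, int(datetime(2013, 11, 11, 22, 30, tzinfo=timezone.utc).timestamp())),  # Monday
    (3, 12, int(datetime(2013, 11, 9, 14, 0, tzinfo=timezone.utc).timestamp())),    # Saturday
]
conn.executemany("INSERT INTO mdl_forum_posts VALUES (?, ?, ?)", sample)

# Count posts per weekday; strftime('%w') gives 0 = Sunday … 6 = Saturday.
by_day = dict(conn.execute(
    "SELECT strftime('%w', created, 'unixepoch') AS dow, COUNT(*) "
    "FROM mdl_forum_posts GROUP BY dow"
).fetchall())
```

Grouping by weekday (and by hour, with strftime('%H')) would give exactly the kind of focused engagement data set the bullet asks about, without any modification to Moodle’s code.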



    E-Learning Development Grant (ELDG) scheme – 4 years of successful bids!

    By , on 14 April 2011

For the past four years we’ve been given the support of the Office of the Vice Provost to ask UCL staff:

    “Would you like to develop the use of e-learning in your teaching?

    Do you have innovative ideas but need support putting them into practice?”

    It’s been those who’ve responded to this call with creativity, vision, and sometimes strong pragmatism that we’ve then worked with as part of UCL’s E-Learning Development Grant (ELDG) scheme.

    This gives funds to further knowledge and experience of e-learning within UCL. It’s previously been used to:

• support the development of resources
• fund evaluation and technology reviews
• promote innovative teaching methods
• fund visits to external institutions for inspiration and to compare practice

A strong part of the ELDG process is to share and learn from these experiences, so each year we ask successful bidders to report back so that other members of the community can build on their work. These reports now span four years in total and will definitely be of interest to applicants for 2011-12 funding, or to staff looking for inspiration to draw on. Though many projects are still ongoing, it’s been great to review the reports for completed projects so far, now up on the ELDG Reports page under successful bids from previous years.

Though there are more to come, reports from the 2010-11 session have been more detailed than previously. It was also the first year to encourage video reports/presentations, which have been particularly engaging and informative and will now probably be a continuing feature of the scheme.

This year the Office of the Vice Provost (Academic and International) has made available £40,000 to fund ELDG projects, more than ever before. However, this coming academic year is also the culmination of UCL’s efforts to have all taught modules on Moodle to a ‘baseline’ standard. Of course, many modules already on Moodle have been there for some time and have gone well beyond baseline use. Recognising this and encouraging an enhanced use of Moodle is therefore a strong strand in this year’s grants, and proposals including innovative uses of Moodle, or combinations with other UCL-integrated technologies, are eagerly anticipated. (See the ELDG Themes and inspiration page.)

For those thinking of applying for an ELDG grant the deadline is fast approaching (April 28th!), so we suggest looking over previous years’ successful bids and the themes and ideas page to get some ideas, reading the criteria for application, and then applying.

    Last year we received over 40 applications so we look forward to seeing what this new year brings in terms of new practice, ideas and innovation!

    If you have any questions do feel free to contact us.

    Video and pedagogy – what questions should we be asking now ?

    By Clive Young, on 10 March 2011

    The third ViTAL webinar on video in education took place on 9 March 2011, attracting 42 attendees and generating a lively discussion.

    It was presented by Clive Young, LTSS and chaired by John Conway, Imperial.

    The slides are here:

    The Adobe Connect recording can be found at the following link

    Video and pedagogy – what questions should we be asking now ?

    A different way to connect.

    By Rod Digges, on 15 November 2010

    Over breakfast at a recent conference on the use of the Echo360 (Lecturecast) system, I found myself talking to an LTA (Learning Technology Advisor) from a small US community college. He had recently been working with teachers from the Math School at the college, helping them transform their existing paper-based courses for online delivery.
    One of the last, and most reluctant, members of staff to go online was a senior member of the school’s teaching staff, who met with the LTA regularly to discuss ideas for the new course. As the course’s live date approached, the LTA suggested that an online discussion forum be included; a place where students could share ideas, or give feedback about the course – the LTA also advised that it was good practice to prime a forum with one or more initial posts to ‘get the ball rolling’. The Maths teacher doubted the value of ‘this kind of thing’ but said that he’d think about it.
    The new term began, the course was made live but it was a couple of weeks before the LTA and the Maths teacher had a chance to meet and review how things were going. When they did finally meet the LTA was pleased to hear that the course had been well received and asked his colleague what he had found most useful.
The Maths teacher said that he had taken up the suggestion of including a discussion forum and, to get the ball rolling, had posted a question for all students: ‘What does Maths mean to you in your life?’. This was a question that, over his years of teaching, he had always asked every group of students at their first lecture – observing sadly that he rarely got much of a response.
The teacher said that asking the same question in an online forum had made a big difference. The LTA told me that there were tears in his colleague’s eyes as he talked about the many messages in the forum, and how a number of students had talked about the beauty and elegance of mathematics, describing a passion for the subject that matched his own – he said the replies had inspired him, and that his teaching with this group had an energy and enthusiasm he hadn’t felt for years.

    The Lecturecast conference covered many interesting uses of this very impressive technology, but a few months later, trying to think of subject for this blog, it’s the story of the Maths teacher and his students that sticks in my mind and how the use of a much simpler technology gave them a different way to connect.

    E-assessment 2.0 – making assessment Crisper…

    By Fiona Strawbridge, on 15 September 2010

    CALT organised a stimulating presentation by Prof Geoffrey Crisp of the University of Adelaide about assessment in the Web 2.0 world. Much information at and a similar presentation is on slideshare.

    Crisp calls for much more ‘authentic’ learning and assessment – the need to set big questions; for instance in aeronautical engineering we should set students a task to build a rocket in 3 years. This allows them to see reasons for the smaller things. The tendency with conventional assessment is for everything to become very granular – little learning outcomes are assessed with discrete assessment tasks which don’t encourage students to make connections, and which encourage surface and strategic rather than deep approaches to learning.

    Of course moving away from more traditional forms of assessment entails proving that the alternative works – traditional approaches are very deeply engrained in the culture of institutions and are not easily challenged. Crisp acknowledged that even in his own institution there is some way to go.

    Three points to start with:

1.    Assessment tasks should be worth doing – if students can get answers by copying from the web, asking Google, or guessing, then the task is not worth doing. We need to stop setting tasks which are about information, since information is everywhere.

2.    We should separate diagnostic assessment from formative assessment. Diagnostic assessment is essential before teaching and can be an excellent way of starting a relationship with students at the outset. The teacher can then build their teaching on students’ current level of understanding.

3.    Think about assessment tasks which result in divergent rather than convergent responses. In the traditional approach we tend to seek convergent responses, in which all students are expected to come up with the same answer, but divergent responses are more authentic. Peer- and self-review can support this approach.

    Bearing this in mind, and drawing on the work by Bobby Elliot (see, we heard that:

    • Assessment 1.0 is traditional assessment – paper-based, classroom-based, synchronous in time and space, formalised and controlled.
• Assessment 1.5 is basic computer-assisted assessment – using quizzes which tend to replicate the paper-based experience, and portfolios used mainly as storage for students’ work. Tasks tend to be done alone – competition is encouraged and collaboration is cheating. They tend to encourage a focus on passing the test rather than on gaining knowledge, skills and understanding, and don’t lead to deeper levels of learning (indeed Elliott argues that factual knowledge is valueless in the era of Wikipedia and Google).
• Assessment 2.0 is tool-assisted assessment in which students do things using a variety of tools and resources and then simply use the VLE (typically) to submit the results. This kind of assessment is typically authentic, personalised, negotiated, engaging, problem-oriented, collaborative, done anywhere, peer- and self-assessed, recognises existing skills, assesses deeper levels of learning, and is supported by IT tools, especially the open web.

    Some nice examples of interactive e-assessment 2.0 design included:

• Examine a QuickTime VR image of a geological formation, then answer questions based on it – drawing on things one wouldn’t be able to see in a static image.
    • Examine panograph (scrolling and zoomable image) of Bayeux Tapestry and answer questions drawing together different parts – students selecting evidence from different segments of the tapestry.
• Interactive spreadsheets – Excel with macros. Students can change certain cells and answer questions on the resulting trends in graphs. These can have nested response questions, so that the answer to the second is based on the first (but care is needed with dependencies, so that a wrong move early on doesn’t lead to total failure).
    • Chemical structures using the Molinspiration tool. Students can draw molecular structures using the tool and copy and paste the resulting text string into answer which is held in the VLE quiz tool.
    • Problem solving using a tool called IMMEX (‘It Makes You Think’) which tracks how students approach problems.  The tutor adds in real, redundant and false information that the students can draw on to solve the problem.  They can use it all but the more failed attempts they make the fewer marks they get. We saw an archaeology example in which students had to date an artefact.
    • Role plays which can be done using regular VLE features such as announcements, discussion forums, wikis.  Students adopt different personas and enter into discussion and debate through those personas.
    • Scenario based learning – this is more prescriptive than role play. The recommended tool is
• Simulations – the bized site offers a virtual bank and factory. Students can work within bized, then answer questions in the VLE.
    • Second Life (virtual world) assessment in which the avatar answers questions which go back into Moodle.

Examples of these and more are available through the site – it’s Moodle-based and anyone with an email address can self-register and try out the various tasks. (They also run a series of webinars.)

Crisp argues convincingly for much more authentic and immersive assessment, and for assessments in which process as well as outcome is evaluated – for example approaches to problem solving; efficiency; ethical considerations; involvement of others.

    A good closing question was whether teachers will be able to construct future assessments or will this be a specialist activity. Is it all going to get too hard for people? There may be a need for more team based approaches in future.

    Useful resources

    Boud, D., 2009, Assessment 2020 – Seven propositions for assessment reform in higher education, Available at:

    Crisp, G., 2007, The e-Assessment Handbook. Continuum International Publishing Group Ltd

    Crisp, G., 2009, Designing and using e-Assessments. HERDSA Guide, Higher Education Research Society of Australasia

    Elliott, B., 2008. Assessment 2.0 – Modernising assessment in the age of Web 2.0. Available at:

    'Don't lecture me' – the case for abolishing the lecture

    By Fiona Strawbridge, on 14 September 2010

    This year’s ALT-C (Association for Learning Technology) conference opened with a bang – echoes of which continued to resound for the rest of the week. The source was an entertaining and provocative talk by Donald Clark about why the lecture is not fit for purpose (prefaced by an apology for delivering a lecture about why we should not deliver lectures). Clark noted that the word lecture originally meant ‘sermon’ and argued that although some teachers claim interactivity in their lectures often this is an illusion with questions tending to be rhetorical. There is a pretence that critical thinking is being taught; the lecture was never intended to promote critical thinking – it evolved from preaching where the audience was not expected to question material. Clark argues that if you’re going to lecture then make sure you do it well and think about going ‘stadium style’ to huge numbers – using technology to broadcast it if you want.

Many of Clark’s examples come from physics – a subject he is fascinated by but was unable to engage with as a student. It is difficult to understand and to teach, and so was a good basis for his talk. He recounted how he had embarked on a degree in physics but gave up after a year as he had failed to learn from the lectures he was given – typically a lecturer would walk into the auditorium and write long series of equations on panels of blackboards without any real attempt at explanation or engagement. Often all would be done without even facing the students. Clark told how even Isaac Newton was a dreadful teacher – no one would turn up; in contrast Richard Feynman was a superb teacher who cared deeply about teaching; Feynman even published his lectures, but was critical of the lecture as a method.

Eric Mazur noticed that lecturers revert to anecdote when talking about teaching (great quote – ‘data is not the plural of anecdote’). Academics can be disparaging about undergraduates; they are typically so far ahead of the undergraduate level – especially in physics – that they can’t easily get back to that level. When thinking about and discussing teaching they typically do not use scientific method and persist in using ineffective approaches. Mazur developed a revised approach involving pre-reading, getting students to take notes, and seating weaker students at the front and stronger ones at the four corners – this was demonstrated to raise overall standards.

The Institute of Theoretical Physics in Trieste recognised the issue of introverted lecturers and adopted a policy of recording all lectures to ensure that students had a second chance to watch. The recording initially consisted of taking photos every few minutes and recording audio throughout. This approach was judged to be a big improvement.

Clark quoted MIT’s Walter Lewin, who has produced some fantastic videos for teaching: ‘it’s better to see a first class lecture on video than a mediocre one in the flesh’ – for any teacher, a drop-off in attendance over the term should ring alarm bells. He also quoted from Donald Bligh’s book ‘What’s the use of lectures?’, which cites ten pedagogical problems with the lecture:

1. They tend to be one hour long.
2. There is the tyranny of time (i.e. a specific time and place) – why not use YouTube, OCW etc.?
3. And the tyranny of location.
4. It can be difficult to maintain psychological attention – boredom sets in…
5. Cognitive overload – too much info too quickly – teachers should simplify materials and pare back radically.
6. Episodic and semantic memory – lectures shove semantic stuff at you which is unlikely to stick unless something in there is really striking or memorable; teachers need to learn how to use media.
7. Learning by doing – doesn’t happen in a lecture.
8. Spaced practice – Ebbinghaus demonstrated that we need repeated practice to learn – how does this relate to lecture-based learning?
9. Not collaborative – people don’t want to sit next to each other – as evidenced by the typical practice of leaving an empty space between you and your neighbour – let alone talk to each other in this kind of environment.
10. Personality problems – we should not assume every teacher has to be a researcher and vice versa; some academics are simply not cut out for teaching.

Clark’s final advice is to aim for transformation of your approach – redesign the whole course, don’t just bolt on technology. Oh, and abolish ‘lecturer’ as a job title.
    Twitter – @donaldclark


The talk was entertaining, lively and thought-provoking, and indeed provoked a number of the audience quite effectively – the Twitter backchannel was buzzing with protests that Clark was being condescending to academics, and that whilst some of his criticisms were valid he failed to offer any clear solutions. Some of the tweeting was pretty abrasive, and a couple of people managed to put together fairly substantial blog posts outlining their criticisms within hours of the talk. It did seem – as someone put it – that the wisdom of the crowd had turned into the baying of the twittering mob… Clark himself has posted a blog entry about being tweckled in this way (and see the comments, and his responses…).