E-Learning Environments team blog
  • ELE Group
    We support Staff and Students using technology to enhance teaching & learning.

    Here you'll find updates on developments at UCL, links & events as well as case studies and personal experiences. Let us know if you have any ideas you want to share!


    Archive for the 'Learning designs' Category

    ABC (Arena Blended Connected) curriculum design

    By Natasa Perovic, on 9 April 2015

The ABC curriculum design method is a ninety-minute hands-on workshop for module (and programme) teams. This rapid-design method starts with your normal module (or programme) documentation and helps you create a visual ‘storyboard’. A storyboard lays out the type and sequence of learning activities required to meet the module’s learning outcomes, and how these will be assessed. ABC is particularly useful for new programmes or those changing to an online or more blended format.

The method uses an effective and engaging paper-card approach based on research from JISC* and UCL IoE**. Six common types of learning activity are represented by six cards: acquisition, inquiry, practice, production, discussion and collaboration.

[Image: the six learning type cards]

The team starts by writing a very short ‘catalogue’ description of the module to highlight its unique aspects. The rough proportion of each learning type is agreed (e.g. how much practice, or collaboration), along with the envisaged blend of face-to-face and online activity.

[Image: curriculum design cards]

Next the team plans the distribution of each learning type by arranging the postcard-sized cards along the timeline of the module. With this outline agreed, participants turn over the cards. Each card lists online and conventional activities associated with that learning type, and the team can pick from this list and add their own.

[Image: workshop team selecting activities]

    The type and range of learner activities soon becomes clear and the cards often suggest new approaches. The aim of this process is not to advocate any ‘ideal’ mix but to stimulate a structured conversation among the team.

    Participants then look for opportunities for formative and summative assessment linked to the activities, and ensure these are aligned to the module’s learning outcomes.

[Image: assessment]

     

The final stage is a review to see whether the balance of activities and the blend have changed, and to agree and photograph the new storyboard.

The storyboard can then be used to develop detailed student documentation or to outline a Moodle course (a module in Moodle).

     

[Image: the final curriculum storyboard]

The ABC team is developing a programme-level version based on the Connected Curriculum principles.

Participants’ thoughts about the ABC curriculum design workshop:

References:

    *Viewpoints project JISC

    **UCL IoE: Laurillard, D. (2012). Teaching as a Design Science: Building Pedagogical Patterns for Learning and Technology. New York and London: Routledge.

    Etymology of the e- in e-learning? Get out.

    By Matt Jenner, on 12 January 2015

Based on a Christmas conversation about the etymology of emotion (e- = out, motion = move) my mum blurted out, “ah yes, like e-learning?” I wish! The idea of an externalised expression of one’s own learning, a variant on ‘visible learning’ as a colleague would put it, sounds like a no-brainer. I fear, however, that I must never have clearly explained what e-learning means to my own mother, and perhaps I’ve never really thought about it that much myself.

    Electronic-learning

I presumed the e- in e-learning meant the same as in e-mail, but evidence suggests it might not. Electronic learning, mail, commerce and cigarettes are not necessarily using the same e- prefix. Wikipedia didn’t have the origin or etymology of e-learning, so in true journalistic style, I added the following:

    “The origin or etymology of e-learning is contested, with the e- part not necessarily meaning electronic as per e-mail or e-commerce. Coined between 1997 and 1999, e-learning became first attached to either a distance learning service or it was used for the first time at the CBT systems seminar. Since then the term has been used extensively to describe the use of online, personalised, interactive or virtual education.”

Others in the educational technology space have suggested more expressive terms for the mysterious e-. These include “exciting, energetic, enthusiastic, emotional, extended, excellent, educational” by Bernard Luskin, or “everything, everyone, engaging, easy” by Eric Parks. If there’s no correct answer, we should enjoy that for as long as it lasts. There are roots in historical computing and educational theory, but the term e-learning doesn’t even seem that old, which is surprising.

    Externalising learning

In my experience, too much ‘e-learning’ is still long, scrolling pages of PDFs ad infinitum: raw materials made available via online tools and networks. If it’s supporting traditional face-to-face teaching, I can live with it. But it’s not learning, not without well-constructed, meaningful learning outcomes and activities. Learning outcomes are critical: they link these resources into genuine learning activities that ‘make visible’ or, indeed, put an ‘out’ type of e- into e-learning.

In an online learning environment, how do you, or a learner, know anyone has learnt, or done, anything? Externalised learning is surely the key. The idea of ‘making visible’ is critically important; learners should probably not work in isolation for too long. Personal study can still be highly interactive, with ample opportunities to externalise thoughts, developments, questions and ideas. This is all done via the ‘out’: the externalised, visible learning.

    Getting there – the importance of learning outcomes

I’ve seen far too many course descriptions where the learning outcome is ‘To be able to understand concept X’. Below is an example of how learning outcomes can vary, while all trying to achieve the same goal.

    Example

    By the end of this program, successful students will:

    Option 1 (not an outcome): “Be given opportunities to learn effective communication skills”. Analysis: describes program content, not the attributes of successful students.
    Option 2 (vague): “Have a deeper appreciation for good communication practices”. Analysis: does not start with an action verb or define the level of learning; the subject of learning has no context and is not specific.
    Option 3 (less vague): “Understand principles of effective communication”. Analysis: starts with an action verb, but does not define the level of learning; the subject of learning is still too vague for assessment.
    Option 4 (specific): “Communicate effectively in a professional environment through technical reports and presentations”. Analysis: starts with an action verb that defines the level of learning; provides context to ensure the outcome is specific and measurable.

    Source – Examples of Learning Outcomes: Good and Bad

I’m always so happy when I see one that even includes a challenging verb like analyse, classify, interpret, define, create or evaluate.

    Writing good outcomes – the foundations of learning

Writing good learning outcomes still seems like a continuous struggle, but it will be cracked. That will in turn result in improved online learning environments, structured learning, planned activities and a more visible ‘out’ for the e- in e-learning. Or, well, that’s the plan.

    It’s in your Job Description

Hopefully by next Christmas I’ll be able to explain to my mother what I do for a living; for now she still thinks I work in IT. Which reminds me, I don’t think I finished updating her virus definitions either :-(

     

    Image credit:

    [1] – Out of my mind 2 – Creative Commons openclipart / Creator: mondspeer

    UCL Engineering’s learning technologist initiative – one year on

    By Jessica Gramp, on 9 October 2014

UCL Engineering’s Learning Technologists have been supporting rapid changes within the faculty, including the development of several new programmes and the uptake of technology to improve the turnaround of feedback.

In late 2013, the UCL Engineering faculty invested in a Learning Technologist post to support the Integrated Engineering Programme (IEP), as well as the other programmes within Engineering departments. Since then two Engineering departments, Science, Technology, Engineering and Public Policy (STEaPP) and Management Science and Innovation (MS&I), have both employed Learning Technologists to help develop their e-learning provision. These posts have had a significant impact on e-learning activities. To evaluate the impact on the student learning experience, we are collecting information and feedback from students throughout the academic year.

These three roles complement the UCL-wide support provided by the E-Learning Environments (ELE) team, and the Learning Technologists work closely with the central ELE team. This relationship is facilitated by Jess Gramp, the E-Learning Facilitator for BEAMS (Built Environment, Engineering, Maths and Physical Sciences), who co-manages these roles with a manager from each faculty or department. This arrangement enables formal input from ELE into departmental activities and plans, and allows the learning technologists to receive central mentoring and assistance. Without this structure in place it would be difficult to keep these roles aligned with the many central e-learning initiatives, and for the learning technologists to liaise with the technical teams within ISD.

    The initiatives developed by these staff include: designing and implementing Moodle course templates; ensuring adherence to the UCL Moodle Baseline; running training needs analysis and developing staff training plans; delivering e-learning workshops; working with staff to redesign courses, as well as developing them from the ground up, to incorporate blended learning principles; delivering one-to-one support; and working with academics on e-learning projects.

    Moodle Templates
    Engineering now have a Moodle template that provides a consistent experience for students using UCL Moodle to support their learning. This template is now being used on all IEP, MS&I and STEaPP courses and all new Engineering Moodle courses from 2014/15 onwards will also use this template. In some cases the template has been modified to meet departmental requirements.

    [Image: Engineering Faculty Moodle template]

    See how MS&I have modified this template and described each feature in their MS&I Moodle Annotated Template document.

    Moodle Baseline course audit
In MS&I all Moodle courses have been audited against the UCL Moodle Baseline. This has enabled the department’s Learning Technologist to modify courses to ensure every course in the department now meets the Baseline. The template document that was used to audit the courses has been shared on the UCL E-Learning Wiki, so other departments may use it if they wish to do the same. You can also download it here: Baseline Matrix MSI-template.

    Training Needs Analysis
    In STEaPP a Training Needs Analysis was conducted using both a survey and interviews with academics to develop individual training plans for academics and run training workshops specific to the department’s needs. The survey used for this has been shared with colleagues on the UCL E-Learning Wiki.

    Staff e-learning training and support
    In STEaPP a Moodle Staff Hub course has been developed to support staff in their development of courses, including links to e-learning support materials; curriculum development advice; and links to professional development resources. This course has now been duplicated and modified to assist staff across Engineering and within MS&I. If any other UCL faculties or departments would like a similar course set up they can request this be duplicated for them, so they may tailor it to their own requirements. This and other courses are being used to induct new staff to departments and are supported by face to face and online training programmes. The training is delivered using a combination of central ELE training courses and bespoke workshops delivered by Engineering Learning Technologists.

    E-assessment tools to improve the speed of feedback to students
In MS&I the average turnaround for feedback to students is now just one week, significantly shorter than the four-week target set by UCL. In order to support this initiative, the department has adopted a fully online assessment approach. This has been achieved predominantly using Turnitin, a plagiarism prevention tool that also provides the ability to re-use comments; use weighted grading criteria to provide consistent feedback to students (in the form of rubrics and grading forms); and mark offline using the iPad app. The use of this tool has helped staff to reach the one-week feedback target and to streamline the administrative processes that were slowing the feedback process. The Learning Technologist in MS&I has recently arranged workshops with the majority of MS&I staff (including those who are new to UCL) to demonstrate how Turnitin can be used to deliver feedback quickly to students. Several modules within the IEP are also using Moodle’s Workshop tool to enable peer assessment to be managed automatically online. The use of this and other e-assessment tools is saving academics and support staff significant time that used to be spent manually managing the submission, allocation and marking of assessments.

    Technical e-learning support
While the ELE Services team continues to be the main point of contact for technical e-learning support within Engineering, the local Learning Technologists are able to provide just-in-time support for staff working on local projects. The Learning Technologists are also able to provide assistance beyond what is supported by the central E-Learning team. This includes development work, such as setting up specific tools within Moodle courses (like the Workshop tool for peer assessment) and setting up groups in MyPortfolio; work like this falls outside the remit of the central E-Learning Environments team. Also, because the Engineering Learning Technologists are based within the faculty, they gain a better knowledge of local working practices, and are therefore better equipped to understand and support department-specific requirements than the central team is able to.

    Project support and funding
    The local Learning Technologists have worked with academics within Engineering to develop bids for Engineering Summer Studentships and other projects, including the E-Learning Development Grants that are distributed yearly by ELE. The successful project proposals have been supported by the local Learning Technologists, which has meant a greater level of support has been provided to the grant winners than has been possible in previous years.

    Using technology to support scenario-based learning
    The Learning Technologist for STEaPP had a unique opportunity to work with staff during the development of their curriculum to ensure that technology was considered at the very outset of the programme’s development. In MS&I the local Learning Technologist has helped to develop a scenario-based, blended-learning course that is now being used as an exemplar of how other academics may redesign their own courses to empower students in their own learning (both electronically and face to face) and provide authentic learning experiences. Many Engineering programmes are already using project-based work to provide students with authentic learning experiences and assessments and this is something the Learning Technologists can work with academics to develop and enhance further.

Trialling new technologies
    Several e-learning systems have been trialled within Engineering with significant input from the Engineering Learning Technologists, including a mobile e-voting system (TurningPoint ResponseWare) for up to 1000 students, and peer assessment of upwards of 700 student videos within the IEP. The successful implementation of such large-scale trials would have been difficult without the support of the Learning Technologists.

    E-Learning equipment loans
    One of the common problems with technology uptake is ensuring staff have access to it. Engineering have invested in a number of devices to enable (amongst other things) offline marking; video capture and editing; and presentation of hand drawn figures during lectures. Equipment is available for loan across Engineering and also within STEaPP and MS&I. These include laptops, video recording and editing kit (such as cameras, tripods, microphones and editing software) and iPads. The maintenance and loaning of these are managed by the local Learning Technologists. They are also able to provide advice and assistance with the use of these devices, especially in terms of multimedia creation, including sound recording and filming, and editing of videos to enhance learning resources.

    Working closely with E-Learning Environments and each other
One important aspect of these roles is that they have close ties to the ELE team, allowing for important two-way communication to occur. The Engineering Learning Technologists are able to keep abreast of changes to centrally supported processes and systems and can obtain support from the central E-Learning Environments Services team when required, including receiving train-the-trainer support in order to run workshops locally within Engineering departments. Similarly, ELE benefit from an improved understanding of the activities occurring within faculties and departments, and from access to the materials that are developed and shared by the Learning Technologists.

    Each week the Engineering Learning Technologists share any developments, issues, and updates with each other and the E-Learning Facilitator for BEAMS. The result is a strong network of support for helping to problem solve and resolve issues. It also enables resources, such as the staff hub Moodle course and Moodle auditing matrix, to be shared across the Faculty and more widely across UCL, enabling the re-use of materials and avoiding duplication of effort. The importance of the strong working relationship between the Engineering Learning Technologists became apparent during UCL Engineering’s How to change the world series. During an important final-day session all three Learning Technologists were involved in resolving technical issues to ensure the voting system operated correctly in a venue with incompatible wireless provision.

    Conclusion
UCL staff and students today operate within a rapidly changing educational environment. Both staff and students are expected to understand how to use technology in order to operate within an increasingly digital society. There is a huge number of self-directed online learning resources available (such as MOOCs and YouTube videos) and increasingly flexible work and study arrangements are being supported by enhanced technology use. As more staff see the benefits that technology can bring to the classroom, and true blended learning becomes the norm in many areas, it is going to be more important to implement appropriate support structures so staff have the resources to understand and work with these emerging technologies. It is equally important that students are supported in their use of these tools.

    The Learning Technologists within Engineering are in a unique position to understand the opportunities and issues arising in the classroom, and react to these quickly and effectively. We have already seen numerous outputs from these roles. These include a video editing guide to help academics produce professional looking videos for their students; the use of tools within Moodle and MyPortfolio on a scale not seen before with large cohorts of over 700 IEP students; and an exemplar of how scenario-based learning can be supported by technology in MS&I. While these outputs have been developed in reaction to local needs, they have been shared back for others to use and reference, and therefore they benefit the wider UCL community.

    As we see more of these roles implemented across UCL, we will begin to see more dramatic change than has been achievable in the past. One of the plans for the future involves running student focus groups and interviews to better understand how Moodle and other e-learning systems are helping students with their studies and how provision can be improved. The Engineering Learning Technologists will continue their work with local staff to help their departments to use technology more effectively and improve the student experience.

    How can e-learning help with student feedback?

    By Clive Young, on 20 June 2014

I attended a teaching and learning meeting in one of our academic departments recently, where someone asked if they could use technology to improve their feedback to students. Four possibilities sprang to mind.

Online marking – As an associate lecturer at the OU I have to use online feedback: standard forms and document mark-up (i.e. comments in Word) are obligatory. After several years I now have a ‘bank’ (personal collection) of comments I can draw on to quickly provide rich, personalised feedback. Moreover the OU uses ‘rubrics’ (marking schemes) to structure feedback and make sure it is aligned to the learning outcomes. This improves the efficiency and effectiveness of marking and, honestly, I’m not sure I could manage without this approach now. Here at UCL many colleagues use Moodle Assignments to allow markers to bulk upload annotations on files of student work, and to set up rubrics and other structured marksheets. GradeMark, part of Turnitin, has a particularly convenient marking environment. Its unique selling point is customisable sets of frequently-made comments, available at a click and a drag alongside general online comments; these comment ‘banks’ can be shared across programmes and can themselves be linked to rubric structures.

Self-assessment – Perennial favourites in UCL student surveys are diagnostic and self-assessment quizzes, usually developed in Moodle’s Quiz, which despite its frivolous name is actually a very sophisticated assessment and feedback tool. Moodle quizzes offer different question types, including multiple choice, gap fill and drag-and-drop, all of which can automate giving feedback, based on the settings tutors choose. For example, feedback could be given immediately a question is answered, or it can be deferred until after the quiz is completed. Students appreciate the chance to check progress and get focused feedback based on the answer they chose. While writing good questions and feedback takes thought and care, technically quizzes are comparatively easy to set up in Moodle. The trick is to provide good, differentiated feedback, linking to remedial or additional materials that the student can look at straight away. In Moodle these links could be to documents, items on the electronic reading lists, Lecturecast recordings, YouTube videos and so on, as well as simple texts and images. Questions can be imported from Word using a template, allowing rapid quiz authoring without an internet connection, and even the Matlab GUI has been used to automatically generate mathematical question banks for later import. As an alternative, UCL has also had some success using PeerWise, which enables students to design their own multiple-choice questions (MCQs).
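To illustrate automated question-bank authoring, Moodle supports a plain-text question-import format called GIFT. The sketch below is a hypothetical Python equivalent of the kind of generation described above (it is not the Matlab tool mentioned, and the question wording and helper names are invented); the output can be pasted into Moodle’s GIFT importer:

```python
import random

def make_gift_mcq(qnum, a, b):
    """Build one multiple-choice question in Moodle's GIFT import format.

    In GIFT, answers sit in braces: the correct answer is prefixed
    with '=' and each distractor with '~'.
    """
    correct = a + b
    distractors = {correct - 1, correct + 1, correct + 2}
    options = [f"={correct}"] + [f"~{d}" for d in sorted(distractors)]
    return f"::Q{qnum}:: What is {a} + {b}? {{{' '.join(options)}}}"

def generate_bank(n, seed=0):
    """Generate n questions, separated by blank lines as GIFT expects."""
    rng = random.Random(seed)
    return "\n\n".join(
        make_gift_mcq(i + 1, rng.randint(1, 20), rng.randint(1, 20))
        for i in range(n)
    )

print(generate_bank(3))
```

Each run with the same seed gives the same bank, so a whole set of practice questions can be regenerated and re-imported in seconds.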

Audio and video feedback – One interesting feature of GradeMark is the facility to provide students with audio feedback. Staff at UCL have been experimenting with audio feedback for several years, adding live audio comments to text documents or forum posts, for example. The rationale is that the feedback is richer, more personal and more expressive of different emphasis, and there is more of it for the amount of time spent. Since it also tends to be less concerned with particularities of grammar, spelling and so on, some markers may want to combine it with word-processed annotations. An extension of this is to make a single recording giving general feedback to an entire cohort on a given piece of work. A further extension of the idea is to create simple narrated screencasts using Lecturecast personal capture (desktop recording) to record worked examples for generic assessment and exam feedback. This approach has been tried in at least one department with positive results.

    Peer assessment and MyPortfolio – Technology can of course provide whole new ways to enable group assessment and the development of rich personal portfolios, for example using the increasingly popular UCL portfolio and presentation environment MyPortfolio.  For an excellent introduction, have a look at the recent UCL case study Making history with iPads, peer assessment and MyPortfolio.

    Further reading

    Image “Got Feedback?” by Alan Levine https://flic.kr/p/nKPbtE

    Digital Literacies special interest group (SIG) meeting – November 2013

    By Jessica Gramp, on 28 November 2013

Fifteen academic and support staff from across UCL met for the first UCL Digital Literacies special interest group (SIG) on Wednesday 27th November. Jessica Gramp, from E-Learning Environments, delivered a presentation prepared in collaboration with Hana Mori, giving the Jisc definition of digital literacies.

We’re not sure about the term; some find it demeaning. A better term than Digital Literacies is clearly needed, one that doesn’t offend or imply a deficit. There’s also a need to differentiate between kinds of digital literacy. Some areas that have been used at other institutions include: digital identity; managing studies; working in teams; using other people’s content responsibly; and digitally enhancing job prospects. There was a general consensus that digital literacies need to be embedded, not tagged on as a separate thing to do.


    Chronogogy – Time-led learning design examples

    By Matt Jenner, on 15 November 2013

I recently blogged about a concept called chronogogy: a time-led principle of learning design whose importance I’m trying to fathom. My approach is to keep blogging about it and wait until someone picks me up on it. Worst case, I put some ideas out in the public domain with associated keywords. Please forgive me.

    An example of chronogogically misinformed learning design

A blended learning programme makes good use of face-to-face (f2f) seminars. Knowing that a seminar takes at least an hour to get really interesting, the teacher prefers to use online discussion forums to seed the initial discussions and weed out any quick misgivings. Using a set reading list, the intention is that before the seminar students read, are provoked to think about the topics raised, and address preliminary points in an online discussion. The f2f seminars are on Tuesdays and students have a week to go online and contribute. This schedule is repeated a few times during the twelve-week module.

    The problem is, only a handful of students ever post online and others complain that there’s “not enough time” to do the task each week. The teacher has considered making them fortnightly, but this isn’t really ideal either, as some may slip behind, especially when this exercise is repeated during the module.

The argument in my previous post was that if the planning of an activity doesn’t correlate well with the activity patterns of website users, it may increase the chance of disengagement.

    Example learner 1

     

    • Tuesday: task set
    • Midweek: reading started
    • Sunday: reading finished
    • Monday: contributes to forum
    • Tuesday (the following week): attends seminar

     

If the reading is set on Tuesday and completed by Sunday, the learner may only start considering their discussion points on Sunday or Monday night. This completes the task before Tuesday’s session, but does it make good use of the task?

    Example learner 2

     

    • Tuesday: task set
    • Wednesday: reading started
    • Friday: reading finished
    • Saturday: contributes to forum
    • Sunday: visits forum
    • Monday: contributes to forum again
    • Tuesday (the following week): attends seminar

The reading is set on Tuesday and completed by Friday; the learner even posts to the forum on Saturday. When they come back on Sunday, there’s not much there. They return on Monday and can respond to Learner 1’s points, but it could be too late to really fire up a discussion. The seminar is the next day, Tuesday, which could increase the chance of discussion points being saved for that instead, as the online discussion may not seem worth adding to.

These are two simplistic examples, but they prompt further questions:

    • Q: Can these two students ever have a valuable forum discussion?
    • Q: If this were scaled up, would the night before the seminar provide enough time for a useful online discussion?
    • Q: If a Learner 3 had read the material immediately and posted on the Wednesday, what would have been the outcome?

Any students posting earlier in the seven-day period may be faced with the silence of others still reading, while postings made late in the week may suffer because fewer visitors log on during the weekend. Therefore, unless people are active immediately after the seminar (i.e. read and post in the first day or two), any online discussion takes place on Monday: the day before the seminar.

    In this example a lot of assumptions are made, obviously, but it could happen.
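The two timelines above can also be expressed as a toy model (the day labels and posting days are the hypothetical ones from the examples, not real student records), which makes it easy to check on which days a genuine exchange could occur:

```python
# Days of the module week, from the Tuesday the task is set
# to the Tuesday of the seminar.
WEEK = ["Tue", "Wed", "Thu", "Fri", "Sat", "Sun", "Mon", "Tue2"]

# Forum-posting days for each learner, as in the timelines above.
learner1_posts = {"Mon"}         # reads until Sunday, posts on Monday
learner2_posts = {"Sat", "Mon"}  # posts Saturday, contributes again Monday

def discussion_days(*post_sets):
    """Days on which more than one learner is active in the forum --
    the only days an exchange, rather than a monologue, can happen."""
    return {day for day in WEEK
            if sum(day in posts for posts in post_sets) > 1}

print(discussion_days(learner1_posts, learner2_posts))  # only Monday overlaps
```

Even in this tiny model, the only day both learners are active is the Monday before the seminar, which is the problem the examples describe.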

    Development/expansion

    If this example were true, and it helps if you can believe it is for a moment, then what steps could be taken to encourage the discussion to start earlier?

One thought could be to move the seminar to later in the week, say Thursday or Friday. Observing learners’ behaviour ‘out of class’ (offline and online) could give insight into the planning of sessions and activities. In the classic school sense, students are given a piece of homework and fit it in whenever suits them. However, if that work is collaborative, i.e. working in a group or contributing towards a shared discussion, then the timing of their activity needs to align with the group, and with the timings that are known to be most effective.

    Time-informed planning

Muffins got tired waiting for fellow students to reply to his post.

Knowing study habits and preferences for offline and online study could make a difference here. If the teacher had given the students a different time window over the week, it might have altered contributions to the task. Data in the previous post indicates that learners access educational environments more during the week than at the weekend. An activity given out on Friday and expected by Monday seems unfair on two levels: a weekend is an important time for a break, and weekends are not as busy as weekdays for online activity.

If weekday access follows a recognisable pattern, online tasks could be designed around that understanding, and this may well affect the outcome.
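As a minimal sketch of what ‘planning around access patterns’ could mean in practice (the timestamps below are invented, not real UCL Moodle logs), one could count log-ins per weekday and anchor the task window on the busiest days:

```python
from collections import Counter
from datetime import datetime

DAYS = ["Monday", "Tuesday", "Wednesday", "Thursday",
        "Friday", "Saturday", "Sunday"]

def access_by_weekday(timestamps):
    """Count log-ins per weekday name, to reveal a weekday/weekend split."""
    return Counter(DAYS[datetime.fromisoformat(ts).weekday()]
                   for ts in timestamps)

# Hypothetical log extract -- invented timestamps, not real Moodle data.
logs = ["2013-11-11T09:30", "2013-11-13T14:00",
        "2013-11-13T20:15", "2013-11-16T11:00"]
counts = access_by_weekday(logs)
busiest = max(counts, key=counts.get)  # the day to anchor a task window on
```

With real logs, the same count would show whether discussion activity genuinely drops at weekends before any scheduling decision is made.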

    Does time-informed learning design make a difference?

There’s only one way to know, really, and that’s to perform an experiment around a hypothesis. The examples above were based on a group/cohort discussion and made a lot of assumptions, but they provide a basis on which I want to conduct some further research.

    Time-based instruction and learning. Is activity design overlooked?

    In the examples, the teacher is assuming that their students will ‘find the time’. This is perfectly acceptable, but students may be better at ‘time-finding’ when they are also wrapped into a strong schedule or structure for their studies. Traditionally this is bound to the restrictions of timetabling and room access, the teacher’s duties and the learners’ schedules (plus any other factors). But with online (or blended) learning, the timetabling or time-planning duty is displaced into a new environment. This online space is marketed as open, personalised and in-your-own-time, all of which is very positive. However, it also comes with the negative aspect of self-organisation and could, possibly, be a little too loosely defined – perhaps especially so when it is no longer personal, but group or cohort based.

    There’s no intention here of mandating when learners should be online – that’s certainly not the point. In the first instance it’s about being aware of when they might be online, planning better around that, and seeing whether this is even ‘a thing to factor in’.

    Chronology is the study of time, and time online is a stranger concept than time face-to-face. A face-to-face session runs for roughly an hour or two. Online it could amount to the same, but not in one chunk: fragmentation, openness and flexibility are all key components. Learners can come and go whenever they like, and our logs of how many UK connections are made to UCL Moodle at 3–5am show this quite clearly.

    Chronogogy is just a little branding for the idea that instructional design, i.e. the planning and building of activities for online learning, may need to factor time into the design process. This isn’t simply to say ‘time is important’, but that understanding more about users’ access patterns, especially (but not only) in online educational environments, could influence the timing of online activities and how that timing is designed. That could directly affect both the student and teacher experience. It could naturally feed back into f2f sessions too, where chronogogy has been considered to ensure the blended components properly support the rest of the course.

    Time-led instructional design, or chronogogically informed learning design, could become ever more important for fully online courses that rely heavily on user-to-user interaction as a foundation of the student experience – for example the Open University, which relies heavily on discussion forums, or MOOCs, where learner-to-learner interaction is the only viable form.

    Most online courses would state that student interaction is on the critical path to success. From credit-bearing courses to MOOCs, building chronogogy into the course structure could inform design decisions early in the development process. This would be important when considering:

    • Planned discussions
    • Release of new materials
    • Synchronous activities
    • Engagement prompts*
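    As a trivial illustration of what ‘time-informed’ might mean for the second point, release of new materials: a scheduling rule that keeps releases away from the quieter weekend. The rule itself (avoid weekends) is only an example drawn from the access patterns discussed above, not a recommendation.

```python
# Sketch of a time-informed scheduling rule: if a planned release or
# discussion start falls at the weekend, shift it to the following Monday.
from datetime import date, timedelta

def next_weekday_slot(planned):
    """Return `planned`, or the following Monday if it falls on Sat/Sun."""
    shift = {5: 2, 6: 1}.get(planned.weekday(), 0)  # Sat -> +2, Sun -> +1
    return planned + timedelta(days=shift)

print(next_weekday_slot(date(2013, 10, 12)))  # a Saturday
```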

    In another example, MOOCs (in 2013) seem to attract a range of learners. Some are fully engaged: they participate in all the activities, review all the resources and earn completion certificates. Others do less, lurking in the shadows as some may say, but still have a perfectly satisfactory experience. Research is being carried out into these engagement patterns, and much talk of increasing retention has been sparked within educational and political circles, for both MOOC and Distance Learning engagement/attrition.

    One factor to consider here is how you encourage activity in a large and disparate group. The fourth point above, engagement prompts, is a way of enticing learners back to the online environment. Something needs to bring them back, and this may be something as simple as an email from the course lead. Data may suggest that sending this on a Saturday could have a very different result than sending it on a Tuesday.

    Engagement prompts as the carrot, or stick?

    Among the many areas still to explore: if learners were less active over the weekends, for example, then would prompting them to action – i.e. via an engagement prompt – provide a positive or negative return? This could be addressed via an experiment.
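    That experiment could be as simple as splitting a cohort, sending the same prompt on different days, and comparing return rates. A minimal sketch of the analysis, using a standard two-proportion z-test; all the numbers below are invented for illustration, and a real study would need actual cohort data:

```python
# Sketch: compare the return rate after an engagement prompt sent on a
# Tuesday vs a Saturday. Figures are hypothetical.
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z statistic for the difference between two proportions (pooled SE)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Invented results: of 200 prompted on Tuesday, 90 returned within 48 hours;
# of 200 prompted on Saturday, 62 returned.
z = two_proportion_z(90, 200, 62, 200)
print(round(z, 2))  # |z| > 1.96 would suggest the day of sending matters
```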

    Concluding thoughts

    I seem interested in this field, but I wonder about its true value. I’d be keen to hear your thoughts. Some things for me to consider are:

    • If there are peaks and troughs in access, what would happen if these could be levelled out?
    • How could further research be conducted (live or archived data sets)?
    • Have I missed something in the theory of learning design that is based on time-led instruction?
    • What would learners think of this? Canvass for their opinions.
    • Could I make a small modification to Moodle to record engagement/forum-posting data and create a more focused data set?
    • Am I mad?