Digital Education team blog

Ideas and reflections from UCL's Digital Education team

Archive for the 'Learning designs' Category

UCL Engineering’s learning technologist initiative – one year on

By Jessica Gramp, on 9 October 2014

UCL Engineering's Learning Technologists have been supporting rapid changes within the faculty. These include the development of several new programmes and support for the uptake of technology to speed up the turnaround of feedback to students.

In late 2013, the UCL Engineering faculty invested in a Learning Technologist post to support the Integrated Engineering Programme (IEP), as well as the other programmes within Engineering departments. Since then two Engineering departments, Science, Technology, Engineering and Public Policy (STEaPP) and Management Science and Innovation (MS&I), have both employed Learning Technologists to help develop their e-learning provision. These posts have had a significant impact on e-learning activity across the faculty. To evaluate the impact on the student learning experience, we are collecting information and feedback from students throughout the academic year.

These three roles complement the UCL-wide support provided by the E-Learning Environments (ELE) team, and the Learning Technologists work closely with that central team. The relationship is facilitated by Jess Gramp, the E-Learning Facilitator for BEAMS (Built Environment, Engineering, Maths and Physical Sciences), who co-manages these roles with a manager from each faculty/department. This arrangement enables formal input from ELE into departmental activities and plans, and gives the Learning Technologists central mentoring and assistance. Without this structure in place it would be difficult to keep these roles aligned with the many central e-learning initiatives, or for the Learning Technologists to liaise with the technical teams within ISD.

The initiatives developed by these staff include: designing and implementing Moodle course templates; ensuring adherence to the UCL Moodle Baseline; running training needs analysis and developing staff training plans; delivering e-learning workshops; working with staff to redesign courses, as well as developing them from the ground up, to incorporate blended learning principles; delivering one-to-one support; and working with academics on e-learning projects.

Moodle Templates
Engineering now have a Moodle template that provides a consistent experience for students using UCL Moodle to support their learning. The template is used on all IEP, MS&I and STEaPP courses, and all new Engineering Moodle courses from 2014/15 onwards will also use it. In some cases the template has been modified to meet departmental requirements.

Engineering Faculty Moodle template

See how MS&I have modified this template and described each feature in their MS&I Moodle Annotated Template document.

Moodle Baseline course audit
In MS&I all Moodle courses have been audited against the UCL Moodle Baseline. This has enabled the department's Learning Technologist to modify courses so that every course in the department now meets the Baseline. The template document used to audit the courses has been shared on the UCL E-Learning Wiki, so other departments may use it if they wish to do something similar. You can also download it here: Baseline Matrix MSI-template.

Training Needs Analysis
In STEaPP a Training Needs Analysis was conducted using both a survey and interviews with academics to develop individual training plans for academics and run training workshops specific to the department’s needs. The survey used for this has been shared with colleagues on the UCL E-Learning Wiki.

Staff e-learning training and support
In STEaPP a Moodle Staff Hub course has been developed to support staff in their development of courses, including links to e-learning support materials; curriculum development advice; and links to professional development resources. This course has now been duplicated and modified to assist staff across Engineering and within MS&I. If any other UCL faculties or departments would like a similar course set up they can request this be duplicated for them, so they may tailor it to their own requirements. This and other courses are being used to induct new staff to departments and are supported by face to face and online training programmes. The training is delivered using a combination of central ELE training courses and bespoke workshops delivered by Engineering Learning Technologists.

E-assessment tools to improve the speed of feedback to students
In MS&I the average turnaround for feedback to students is now just one week, significantly shorter than the four-week target set by UCL. To support this initiative, the department has adopted a fully online assessment approach. This has been achieved predominantly using Turnitin, a plagiarism prevention tool that also provides the ability to re-use comments; use weighted grading criteria to provide consistent feedback to students (in the form of rubrics and grading forms); and mark offline using the iPad app. The use of this tool has helped staff to reach the one-week feedback target and to streamline the administrative processes that were slowing the feedback process. The Learning Technologist in MS&I has recently arranged workshops with the majority of MS&I staff (including those who are new to UCL) to demonstrate how Turnitin can be used to deliver feedback quickly to students. Several modules within the IEP are also using Moodle's Workshop tool to enable peer assessment to be managed automatically online. The use of this and other e-assessment tools is saving academics and support staff significant time that used to be spent manually managing the submission, allocation and marking of assessments.
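As a rough illustration of what "managed automatically" means here, the sketch below shows the kind of balanced, no-self-review allocation that a peer-assessment tool handles for large cohorts. It is a hypothetical Python example, not Moodle Workshop's actual allocation code, and names such as allocate_reviews are invented for illustration.

import random

def allocate_reviews(students, reviews_per_student, seed=None):
    """Assign each student a fixed number of submissions to peer-review.

    Every submission receives the same number of reviews and nobody reviews
    their own work. Returns a dict mapping reviewer -> list of authors.
    """
    students = list(students)
    n = len(students)
    if not 0 < reviews_per_student < n:
        raise ValueError("reviews_per_student must be between 1 and n-1")
    if seed is not None:
        random.seed(seed)
    random.shuffle(students)  # randomise the review 'ring' each time
    allocation = {s: [] for s in students}
    for offset in range(1, reviews_per_student + 1):
        for i, reviewer in enumerate(students):
            allocation[reviewer].append(students[(i + offset) % n])
    return allocation

if __name__ == "__main__":
    cohort = [f"student_{i:03d}" for i in range(1, 11)]
    for reviewer, authors in allocate_reviews(cohort, 3, seed=42).items():
        print(reviewer, "reviews", ", ".join(authors))

Doing this by hand for a 700-student cohort is exactly the kind of administrative work the tools above remove.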

Technical e-learning support
While the ELE Services team continues to be the main point of contact for technical e-learning support within Engineering, the local Learning Technologists are able to provide just-in-time support for staff working on local projects. The Learning Technologists can also provide assistance beyond what is supported by the central e-learning team. This includes development work, such as setting up specific tools within Moodle courses (like the Workshop tool for peer assessment) and setting up groups in MyPortfolio; activities like these fall outside the remit of the central E-Learning Environments team. Also, because the Engineering Learning Technologists are based within the faculty, they gain a better knowledge of local working practices and are therefore better equipped to understand and support department-specific requirements than the central team can be.

Project support and funding
The local Learning Technologists have worked with academics within Engineering to develop bids for Engineering Summer Studentships and other projects, including the E-Learning Development Grants that are distributed yearly by ELE. The successful project proposals have been supported by the local Learning Technologists, which has meant a greater level of support has been provided to the grant winners than has been possible in previous years.

Using technology to support scenario-based learning
The Learning Technologist for STEaPP had a unique opportunity to work with staff during the development of their curriculum to ensure that technology was considered at the very outset of the programme’s development. In MS&I the local Learning Technologist has helped to develop a scenario-based, blended-learning course that is now being used as an exemplar of how other academics may redesign their own courses to empower students in their own learning (both electronically and face to face) and provide authentic learning experiences. Many Engineering programmes are already using project-based work to provide students with authentic learning experiences and assessments and this is something the Learning Technologists can work with academics to develop and enhance further.

Trialling new technologies
Several e-learning systems have been trialled within Engineering with significant input from the Engineering Learning Technologists, including the mobile e-voting system (TurningPoint ResponseWare) for up to 1000 students, and peer assessment of upwards of 700 student videos within the IEP. The successful implementation of such large-scale trials would have been difficult without the support of the Learning Technologists.

E-Learning equipment loans
One of the common problems with technology uptake is ensuring staff have access to it. Engineering have invested in a number of devices to enable (amongst other things) offline marking; video capture and editing; and presentation of hand drawn figures during lectures. Equipment is available for loan across Engineering and also within STEaPP and MS&I. These include laptops, video recording and editing kit (such as cameras, tripods, microphones and editing software) and iPads. The maintenance and loaning of these are managed by the local Learning Technologists. They are also able to provide advice and assistance with the use of these devices, especially in terms of multimedia creation, including sound recording and filming, and editing of videos to enhance learning resources.

Working closely with E-Learning Environments and each other
One important aspect of these roles is that they have close ties to the ELE team, allowing important two-way communication to occur. The Engineering Learning Technologists are able to keep abreast of changes to centrally supported processes and systems and can obtain support from the central E-Learning Environments Services team when required, including receiving train-the-trainer support in order to run workshops locally within Engineering departments. Similarly, ELE benefits from an improved understanding of the activities occurring within faculties and departments, and from access to the materials developed and shared by the Learning Technologists.

Each week the Engineering Learning Technologists share developments, issues and updates with each other and with the E-Learning Facilitator for BEAMS. The result is a strong support network for problem solving and resolving issues. It also enables resources, such as the staff hub Moodle course and the Moodle auditing matrix, to be shared across the faculty and more widely across UCL, enabling the re-use of materials and avoiding duplication of effort. The importance of the strong working relationship between the Engineering Learning Technologists became apparent during UCL Engineering's How to change the world series, when, during an important final-day session, all three Learning Technologists were involved in resolving technical issues to ensure the voting system operated correctly in a venue with incompatible wireless provision.

Conclusion
UCL staff and students today operate within a rapidly changing educational environment. Both staff and students are expected to understand how to use technology in order to operate within an increasingly digital society. There is a huge number of self-directed online learning resources available (such as MOOCs and YouTube videos), and increasingly flexible work and study arrangements are being supported by enhanced technology use. As more staff see the benefits that technology can bring to the classroom, and true blended learning becomes the norm in many areas, it will become more important to implement appropriate support structures so staff have the resources to understand and work with these emerging technologies. It is equally important that students are supported in their use of these tools.

The Learning Technologists within Engineering are in a unique position to understand the opportunities and issues arising in the classroom, and react to these quickly and effectively. We have already seen numerous outputs from these roles. These include a video editing guide to help academics produce professional looking videos for their students; the use of tools within Moodle and MyPortfolio on a scale not seen before with large cohorts of over 700 IEP students; and an exemplar of how scenario-based learning can be supported by technology in MS&I. While these outputs have been developed in reaction to local needs, they have been shared back for others to use and reference, and therefore they benefit the wider UCL community.

As we see more of these roles implemented across UCL, we will begin to see more dramatic change than has been achievable in the past. One of the plans for the future involves running student focus groups and interviews to better understand how Moodle and other e-learning systems are helping students with their studies and how provision can be improved. The Engineering Learning Technologists will continue their work with local staff to help their departments to use technology more effectively and improve the student experience.

How can e-learning help with student feedback?

By Clive Young, on 20 June 2014

I attended a teaching and learning meeting in one of our academic departments recently, where someone asked if they could use technology to improve their feedback to students. Four possibilities sprang to mind.

Online marking – As an associate lecturer at the OU I am required to give feedback online via standard forms and document mark-up (i.e. comments in Word). After several years I now have a 'bank' (personal collection) of comments I can draw on to quickly provide rich, personalised feedback. Moreover, the OU uses 'rubrics' (marking schemes) to structure feedback and make sure it is aligned to the learning outcomes. This improves the efficiency and effectiveness of marking, and honestly I'm not sure I could manage without this approach now. Here at UCL many colleagues use Moodle Assignments to let markers bulk upload annotations on files of student work, and to set up rubrics and other structured marksheets. GradeMark, part of Turnitin, has a particularly convenient marking environment. Its unique selling point is customisable sets of frequently made comments that can be dragged and dropped onto the work, alongside general online comments. Comment 'banks' are available at a click and a drag, can be shared across programmes, and can themselves be linked to rubric structures.

Self-assessment – Perennial favourites in UCL student surveys are diagnostic and self-assessment quizzes, usually developed in Moodle's Quiz, which despite its frivolous name is actually a very sophisticated assessment and feedback tool. Moodle quizzes offer different question types, including multiple choice, gap fill and drag-and-drop, all of which can automate feedback based on the settings tutors choose. For example, feedback can be given immediately after a question is answered, or deferred until after the quiz is completed. Students appreciate the chance to check progress and get focused feedback based on the answer they chose. While writing good questions and feedback takes thought and care, technically quizzes are comparatively easy to set up in Moodle. The trick is to provide good, differentiated feedback, linking to remedial or additional materials that the student can look at straight away. In Moodle these links could be to documents, items on the electronic reading lists, Lecturecast recordings, YouTube videos and so on, as well as simple texts and images. Questions can be imported from Word using a template, allowing rapid quiz authoring without an internet connection, and even the Matlab GUI has been used to automatically generate mathematical question banks for later import. As an alternative, UCL has also had some success using PeerWise, which enables students to design their own multiple-choice questions (MCQs).
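To give a concrete flavour of what automated question-bank generation can look like, here is a minimal sketch in Python (purely illustrative; it is not the Matlab workflow mentioned above, and the file and function names are invented) that writes simple arithmetic multiple-choice questions, with per-answer feedback, in Moodle's GIFT import format.

import random

def gift_mcq(title, stem, correct, distractors, feedback):
    """Format one multiple-choice question in Moodle's GIFT import syntax."""
    options = [f"={correct} #Correct. {feedback}"]
    options += [f"~{d} #Not quite. {feedback}" for d in distractors]
    return f"::{title}:: {stem} {{\n" + "\n".join(options) + "\n}\n\n"

def make_addition_bank(n=10, seed=1):
    """Generate n simple addition questions as a GIFT-formatted string."""
    random.seed(seed)
    questions = []
    for i in range(1, n + 1):
        a, b = random.randint(10, 99), random.randint(10, 99)
        answer = a + b
        questions.append(gift_mcq(
            title=f"Addition {i}",
            stem=f"What is {a} + {b}?",
            correct=answer,
            distractors=[answer - 10, answer + 1, answer + 10],
            feedback=f"Adding the tens and then the units gives {answer}.",
        ))
    return "".join(questions)

if __name__ == "__main__":
    # Save the bank, then import it via Question bank > Import > GIFT format.
    with open("addition_bank.gift", "w") as f:
        f.write(make_addition_bank())

The resulting file can be imported into the Moodle question bank using the GIFT format option, and the same pattern extends to other automatically generated question sets.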

Audio and video feedback – One interesting feature of GradeMark is the facility to provide students with audio feedback. Staff at UCL have been experimenting with audio feedback for several years, adding live audio comments to text documents or forum posts, for example. The rationale is that the feedback is richer, more personal, more expressive of different emphasis, and there is more of it for the amount of time spent. Since it also tends to be less concerned with particularities of grammar, spelling and so on, some markers may want to combine it with word-processed annotations. An extension of this is to make a single recording giving general feedback to an entire cohort on a given piece of work. A further extension is to create simple narrated screencasts using Lecturecast personal capture (desktop recording) to record worked examples for generic assessment and exam feedback. This approach has been tried in at least one department with positive results.

Peer assessment and MyPortfolio – Technology can of course provide whole new ways to enable group assessment and the development of rich personal portfolios, for example using the increasingly popular UCL portfolio and presentation environment MyPortfolio.  For an excellent introduction, have a look at the recent UCL case study Making history with iPads, peer assessment and MyPortfolio.

Further reading

Image “Got Feedback?” by Alan Levine https://flic.kr/p/nKPbtE

Digital Literacies special interest group (SIG) meeting – November 2013

By Jessica Gramp, on 28 November 2013

Fifteen academic and support staff from across UCL met for the first UCL Digital Literacies special interest group (SIG) on Wednesday 27th November. Jessica Gramp, from E-Learning Environments, delivered a presentation prepared in collaboration with Hana Mori, giving the Jisc definition of digital literacies.

We're not sure about the term – some find it demeaning. A better term than Digital Literacies is clearly needed, so that it doesn't offend or imply a deficit. There's also a need to differentiate between kinds of digital literacy. Some areas that have been used at other institutions include: digital identity; managing studies; working in teams; using other people's content responsibly; and digitally enhancing job prospects. There was a general consensus that digital literacies need to be embedded, not tacked on as a separate thing to do.

(more…)

Chronogogy – Time-led learning design examples

By Matt Jenner, on 15 November 2013

I recently blogged about a concept called chronogogy: a time-led principle of learning design whose importance I'm trying to fathom. My approach is to keep blogging about it and wait until someone picks me up on it. Worst case, I put some ideas out in the public domain with associated keywords etc. Please forgive me.

An example of chronogogically misinformed learning design

A blended learning programme makes good use of f2f seminars. Knowing the seminar takes at least an hour to get really interesting, the teacher prefers to use online discussion forums to seed the initial discussions and weed out any quick misgivings. Using a set reading list, the intention is that before the seminar students read the material, are provoked to think about the topics raised, and address preliminary points in an online discussion. The f2f seminars are on Tuesdays and students have a week to go online and contribute. This schedule is repeated a few times during the twelve-week module.

The problem is, only a handful of students ever post online and others complain that there’s “not enough time” to do the task each week. The teacher has considered making them fortnightly, but this isn’t really ideal either, as some may slip behind, especially when this exercise is repeated during the module.

The argument in my previous post was that if the planning of an activity doesn't correlate well with when users are actually active on the site, it may increase the chance of disengagement.

Example learner 1

 

Tues: Task set
Sat: Reading start
Sun: Reading finish
Mon: Contributes to forum
Tues (following week): Attends seminar

If the reading is set on Tuesday and completed by Sunday, the learner may only start considering their discussion points on Sunday or Monday night. This completes the task before Tuesday's session, but does it make good use of the task?

Example learner 2

 

Tues: Task set
Wed: Reading start
Fri: Reading finish
Sat: Contributes to forum
Sun: Visits forum
Mon: Contributes to forum
Tues (following week): Attends seminar

The reading is set on Tuesday and completed by Friday, and the learner even posts to the forum on Saturday. By Sunday they come back to the forum, but there's not much there. They come back on Monday and can respond to Learner 1's points, but it could be too late to really fire up a discussion. The seminar is the next day, Tuesday, which could increase the chance of discussion points being saved for that instead, as the online discussion may not seem worth adding to.

These are two simplistic examples, but they prompt further questions:

  • Q: Can these two students ever have a valuable forum discussion?
  • Q: If this were scaled up, would the night before the seminar provide enough time for a useful online discussion?
  • Q: If a Learner 3 had read the material immediately and posted on the Wednesday, what would have been the outcome?

Any students posting earlier in the seven-day period may be faced with the silence of others still reading, while postings made later may suffer because fewer visitors log on during the weekend. Therefore, unless people are active immediately after the seminar (i.e. read and post in the first day or two), any online discussion takes place on Monday – the day before the seminar.

In this example a lot of assumptions are made, obviously, but it could happen.

Development/expansion

If this example were true, and it helps if you can believe it is for a moment, then what steps could be taken to encourage the discussion to start earlier?

One thought could be to move the seminar to later in the week, say Thursday or Friday. Observing learners' behaviour 'out of class' (offline and online) could give insight into the planning of sessions and activities. In the classic school sense, students are given a piece of homework and they fit it in whenever it suits them. However, if that work is collaborative, i.e. working in a group or contributing towards a shared discussion, then the timing of their activity needs to align with the group, and with the timings that are known to be most effective.

Time-informed planning

Muffins got tired waiting for fellow students to reply to his post.

Knowing study habits and preferences for offline and online study could make a difference here. If the teacher had given the students a different time window over the week, it might have altered contributions to the task. Data in the previous post indicate that learners access educational environments more during the week than at the weekend. An activity given on Friday and expected for Monday seems unfair on two levels: a weekend is an important time for a break, and weekends are not as busy as weekdays for online activity.

If the week shows a clear pattern of online access, then an online task could be designed around that pattern, and this may affect the outcome.
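As a rough illustration of the kind of evidence that could inform this, the sketch below (a hypothetical helper in Python, not a feature of Moodle or any other system) buckets exported activity timestamps by day of the week, so quiet and busy days become visible before a discussion task is scheduled.

from collections import Counter
from datetime import datetime

def activity_by_weekday(timestamps):
    """Count learner actions per weekday.

    `timestamps` is a list of Unix timestamps, e.g. exported from a VLE
    activity log; returns a Counter keyed by weekday name.
    """
    return Counter(datetime.fromtimestamp(ts).strftime("%A") for ts in timestamps)

if __name__ == "__main__":
    # Tiny made-up sample of Unix timestamps, purely for illustration.
    sample = [1384160400, 1384246800, 1384333200, 1384596000]
    for day, count in activity_by_weekday(sample).most_common():
        print(f"{day}: {count}")

If most of the activity falls on Monday to Thursday, that is an argument for opening the discussion straight after the seminar rather than letting it drift into the weekend.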

Does time-informed learning design make a difference?

There's only one way to know, really, and that's to perform an experiment around a hypothesis. The examples above were based on a group/cohort discussion and made a lot of assumptions, but they provide a basis on which I want to conduct some further research.

Time-based instruction and learning. Is activity design overlooked?

In the examples, the teacher is making an assumption that their students will 'find the time'. This is perfectly acceptable, but students may perform 'time-finding' better when they are also wrapped into a strong schedule, or structure, for their studies. Traditionally this is bound to the restrictions of timetabling and room access, the teacher's duties and the learners' schedules (plus any other factors). But with online learning (or blended learning) the timetabling or time-planning duty is displaced into a new environment. This online space is marketed as open, personalised and in-your-own-time – all of which is very positive. However, it also comes with the negative aspect of self-organisation and could, possibly, be a little too loosely defined – perhaps especially so when it's no longer personal, but group or cohort based.

There's no intention here of mandating when learners should be online – that's certainly not the point. In the first instance it's about being aware of when they might be online, planning better around that, and seeing whether this is even 'a thing to factor in'.

Chronology is the study of time. Time online is a stranger concept than time face to face. For face-to-face teaching the timing of a session is roughly an hour or two. Online it could be the same, but not in one chunk. Fragmentation, openness and flexibility are all key components – learners can come and go whenever they like, and our logs, showing how many UK connections are made to UCL Moodle at 3-5am, demonstrate this quite clearly.

Chronogogy is just a little branding for the idea that instructional design, i.e. the planning and building of activities for online learning, may need to factor time into the design process. This isn't to say simply 'time is important', but that understanding more about users' access patterns, especially (but not necessarily only) in online educational environments, could influence the timing and design of online activities. This could directly affect the student and teacher experience. It could naturally feed back into f2f sessions too, where chronogogy has been considered to ensure that the blended components properly support the rest of the course.

Time-led instructional design, or chronogogically informed learning design, could become ever more important for fully online courses that rely heavily on user-to-user interaction as a foundation of the student experience – for example the Open University, which relies heavily on discussion forums, or MOOCs, where learner-to-learner interaction is the only viable form.

Most online courses would state that student interaction is on the critical path to success. From credit-bearing courses to MOOCs, factoring chronogogy into the course structure could inform design decisions early in the development process. This would be important when considering:

  • Planned discussions
  • Release of new materials
  • Synchronous activities
  • Engagement prompts*

In another example, MOOCs (in 2013) seem to attract a range of learners. Some are fully engaged, participate in all the activities, review all the resources and earn completion certificates. Others do less than this, lurking in the shadows as some might say, but still have a perfectly satisfactory experience. Research is being performed into these engagement patterns, and much talk of increasing retention has been sparked within educational and political circles around MOOC and distance-learning engagement and attrition.

One factor to consider here is how you encourage activity in a large and disparate group. The fourth point above, engagement prompts, is a way of enticing learners back to the online environment. Something needs to bring them back and this may be something simple like an email from the course lead.  Data may suggest that sending this on a Saturday could have a very different result than on a Tuesday.

Engagement prompts as the carrot, or stick?

Among the many areas still to explore is whether, if learners were less active over the weekends, for example, prompting them to action – i.e. via an engagement prompt – would provide a positive or negative return. This could be addressed via an experiment.
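If such an experiment were run, a very simple analysis might compare the response rates of prompts sent on two different days. The sketch below is one hypothetical way to do that in Python, using a standard pooled two-proportion z-test and entirely made-up numbers; it is not tied to any particular platform.

from math import erf, sqrt

def two_proportion_ztest(responses_a, sent_a, responses_b, sent_b):
    """Compare two response rates; returns (rate difference, two-sided p-value)."""
    p_a, p_b = responses_a / sent_a, responses_b / sent_b
    pooled = (responses_a + responses_b) / (sent_a + sent_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
    return p_a - p_b, p_value

if __name__ == "__main__":
    # Hypothetical figures: 600 prompts sent on a Saturday vs 600 on a Tuesday.
    diff, p = two_proportion_ztest(72, 600, 105, 600)
    print(f"Rate difference: {diff:+.3f}, p = {p:.3f}")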

Concluding thoughts

I seem interested in this field, but I wonder about its true value. I'd be keen to hear your thoughts. Some things for me to consider are:

  • If there are peaks and troughs in access, what would happen if these could be levelled out?
  • How could further research be conducted (live or archived data sets)?
  • Have I missed something in the theory of learning design that is based on time-led instruction?
  • I wonder what learners would think of this – I should canvass their opinions.
  • Could I make a small modification to Moodle to record data showing engagement and forum posting, to create a more focused data set? (A rough sketch of one starting point follows this list.)
  • Am I mad?
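On that penultimate point, one option that needs no modification to Moodle at all is simply to query the timestamps Moodle already records. The sketch below is speculative: the table and column names (mdl_forum_posts.created, a Unix timestamp) are those of a default Moodle install and should be checked against your own schema, and the connection object is whatever DB-API driver your Moodle database uses (e.g. mysql.connector or psycopg2).

from collections import Counter
from datetime import datetime

POSTS_SQL = "SELECT created FROM mdl_forum_posts WHERE created BETWEEN %s AND %s"

def forum_posts_by_weekday(conn, start_ts, end_ts):
    """Count forum posts per weekday between two Unix timestamps."""
    cur = conn.cursor()
    cur.execute(POSTS_SQL, (start_ts, end_ts))
    return Counter(datetime.fromtimestamp(row[0]).strftime("%A")
                   for row in cur.fetchall())

The resulting counts could feed straight into the day-of-week comparison sketched earlier in this post.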

 

 

New UCL Moodle baseline

By Jessica Gramp, on 12 November 2013

The UCL Moodle Baseline, approved by Academic Committee in June 2009, has now been updated after wide consultation on current best UCL practice. The aim of the Baseline is to provide guidelines for staff to follow when developing Moodle courses, so that UCL students have a consistently good e-learning experience. The guidelines are intended to be advisory rather than prescriptive or restrictive. These recommendations may be covered within a combination of module, programme and departmental courses.

Changes include the addition of a course usage statement explaining how students are expected to use their Moodle course. A communications statement is also now a requirement, in order to explain to students how they are expected to communicate with staff, and how often they can expect staff to respond. It is now a recommendation for staff to add (and encourage their students to add) a profile photograph or unique image, to make it easier to identify contributors in forums and other learning activities.

New guidelines for including assessment detail and Turnitin guidance have been added for those who use these technologies.

See the new UCL Moodle Baseline v2

Find out more about this and other e-learning news in the monthly UCL E-Learning Champions’ Newsletter.

Engagement! A tale of two MOOCs

By Clive Young, on 13 October 2013

What is the real educational experience of MOOC students? Some people seem to take strong positions on MOOCs without actually having completed one, after just 'dipping in'. I felt this was not quite enough to judge what MOOC learning is about, so back in August I signed up for two MOOCs running almost concurrently. Both were on the Coursera platform and both – coincidentally – from Wesleyan University. Modernism and Postmodernism is 14 weeks long and is still running; Social Psychology, at a sprightly – and more normal – six weeks, finished recently. I had actually completed a Coursera MOOC at the beginning of the year, but as it was on a familiar subject I considered that taking subjects I knew little about would give me a more 'authentic' learner experience.

I thought it was important to avoid 'dip-in-ism', so I committed to completing both, even paying $40 to go on the Signature Track for the first one. This means Coursera verifies my identity when I submit assignments, both by typing pattern and by face recognition. To set up face recognition I initially held my passport in front of the laptop camera and it scanned my photo. For typing recognition a short phrase is tapped out; Coursera now knows what a dismal typist I am.

Both courses were based around an hour or so of weekly video lectures but despite being out of the same stable, they turned out to be very different in design.

Modernism and Postmodernism was/is perhaps the most 'conventional'. Each week there were four to six short video lectures and a couple of original texts as assigned readings. That was it. The videos featured Wesleyan president and star lecturer Prof Michael Roth. Most were professionally shot, though sometimes interspersed with lecture-capture-style clips from some of his classes. What was unexpected here was that the quality of the video – although nice – was largely immaterial. The power and engagement lay simply in Prof Roth's remarkable narrative, essentially the story of modern Western thought since the Enlightenment as expressed in the works of Kant, Rousseau, Marx, Darwin, Flaubert, Baudelaire, Woolf and so on – not really a 'grand narrative' but a compelling intellectual bricolage. I was genuinely gripped by the story Roth was telling, and sometimes just read the transcripts (much quicker) when I was too busy to watch the video. The eight assessments, 800-word essays, were peer-marked, and the twenty or so assignments I have looked at so far in the course are of quite a high academic standard. The peer marking approach is astonishingly valuable, by the way, as the other students usually present the material in a very different way, challenging and reviewing my understanding of that part of the course.

Social Psychology used video differently. The video of the lecturer was slightly more 'amateurish' but the editing was far more sophisticated. Great effort had been taken to get permission to show and edit in some remarkable clips of experiments (including the infamous Stanford Prison Experiment), TED talks, interviews with psychologists and some public broadcasting documentaries. This was supported by chapter-length PDF extracts from major textbooks and reprints of papers. Together this was an astonishingly rich learning resource, the best I have seen on any online course, including many paid-for ones. As on the other course, the tutor voice of Prof Scott Plous was very clear and engaging, but his written assignments were more diverse: reactions to an online survey, analysis of a website and the 'day of compassion'. The assignments – also peer-marked – were not as strong as those on the Modernism course, but improved as the 'drop-ins' dropped out. The final assignments I read on compassion, from students in India, the Philippines and so on, were genuinely moving. The idea that MOOCs encourage a superficial form of learning is misplaced, at least in this case. Participants had evidently reflected, sometimes quite deeply, on the sometimes challenging material.

Engagement and interaction – In neither course did I especially follow the discussion threads; they were too fragmented. Social Psychology, for example, had 200,000 enrolments, 7,000 forum posts in the first week and about 8,000 students still active at the end. How can you have a 'conversation' in that environment? It made me wonder if 'interaction', the much-vaunted goal of many online courses, is slightly overrated. Much more motivating to me as a student was the strength of the narrative, the storyline – a bit like reading a good book, in fact. Video proved an excellent way of getting that narrative across, and the assignments in both made sure I assimilated at least some of the content and provided an important time frame to ensure I 'kept up'. This 'interaction light' approach seemed to be in contrast with the Open University courses I have done, and indeed tutor on. These are deliberately designed around a series of regular interactions with fellow students and tutors and, being written by a teaching team, have a far less imposing narrative personality. Maybe in the MOOC environment, where 'classical' online interaction is necessarily weaker, design may need to focus not simply on interaction but on engagement, and a strong personal narrative may often be a key element. Just 'dipping into' a MOOC may completely miss this most important aspect.