Digital Education team blog

Ideas and reflections from UCL's Digital Education team

Archive for the 'e-Learning Publications' Category

2014 Horizon Report finds six key trends in E-Learning

By Clive Young, on 7 February 2014

Every year the NMC Horizon Report examines emerging technologies for their potential impact on and use in teaching, learning, and ‘creative inquiry’ within the environment of higher education. The report, downloadable as a PDF, is compiled by an international body of experts and provides a useful checklist of trends, challenges and technologies in the field.

The key trends identified in the short term are

  • Growing ubiquity of social media
  • Integration of online, hybrid, and collaborative learning

Longer-term trends are: data-driven learning and assessment, the shift from students as consumers to students as creators, agile approaches to change and the evolution of online learning.

Key short-term challenges are

  • Low digital fluency of faculty
  • Relative lack of rewards for teaching

More difficult challenges are: competition from new models of education, scaling teaching innovations, expanding access and keeping education relevant.

The important developments in educational technology they identify in the short term are

  • Flipped classroom
  • Learning analytics

Longer-term innovations are: 3D printing, games and gamification, the ‘quantified self’ and virtual assistants.

There are useful commentaries and links throughout. It is encouraging that many of these ideas are already being implemented, trialled and discussed here at UCL.

Chronogogy – time-led learning design for online education

By Matt Jenner, on 13 November 2013

If analytic data suggests there is a ‘heartbeat’ of online activity should this inform learning design?

Background

Planning for f2f teaching is largely led by institutional limitations and personal habits. Rooms are booked in one-hour slots and sessions can only be so many hours long; people can’t stand and talk for hours, and few would sit and listen for the same period. There’s also only so much time in the day, especially within ‘core working hours’. Time for education gets murkier when considering flexible learning, say clinicians who must be in practice at certain times, or evening-study students.

Chrono-based design is as old as time itself

In all walks of education, from homework to dissertations, teachers set activities to be completed in students’ own time. Time planning for f2f education is often based on teaching time, set in rooms, schedules, people and slots. Time for learning has to fit into this schedule and is otherwise completed outside these normal hours. Students are expected to complete a substantial amount of personal learning hours – often tied to readings or assessment activities.

Programmes at higher education institutions are increasingly moving into online environments, many of which are still taught in traditional ways. There remains a large focus on face-to-face teaching and learning activities such as lecture-, seminar-, lab- and essay-led teaching. An increasing number are using online learning environments to provide some supportive or supplementary educational value. Some are ‘blended learning’, where elements of the course must be completed online by the learners and contact time is split between face-to-face and online. A smaller number are fully online, where the online environment drives the course, delivering a structured programme of study via resources and activities.

Building for these environments is a process which often involves a significant amount of investment from both teachers and learners.

Teachers

  • Often going it alone, designing what works best for their teaching style and their students, and making best use of their knowledge of the available tools.
  • Sometimes they may ask for support or advice on best practices, examples, tips and tricks and other approaches to improve their original ideas.
  • Some invest additional resource to make larger changes.
  • Courses will always be refreshed over the years; often this comes with a combination of moving more content online and reworking the existing online content to improve it.

Learners

  • Need to adapt to different approaches of teaching. One part of their course may be very traditional, others may be more online.
  • Method of delivery may influence enrolment decisions.
  • Work/life/study balance & looking for flexibility built into courses.
  • Increasingly using online environments in their daily life.
  • There’s a digital divide between some individuals, some generations and their digital literacies.

Through the techtonic [sic] shifts in education, the definition of ‘how people learn’ and ‘good teaching’ remains quite similar to that of 50 or 2000 years ago, and yet is still quite hard to define. Many have tried, such as Bloom, Dewey, Piaget and Vygotsky, and we should embrace their work. However, it remains somewhat marred by the reality that most teachers are significantly influenced by how they were taught, and still reflect this back in their own teaching. (Which is an opportunity for the expansion of innovation, if good teachers influence more good teachers.)

Technology in education has looked at converting the ‘sage on the stage’ to the ‘guide on the side’ for some time. If a good educational experience is about providing agency for individuals to become the best learners they can be, then we need to also reflect that in the design for learning.

Interaction and engagement are often driving factors

The design of online teaching and e-learning is often reserved for academic developers, educational technologists and teachers. Designs often cover what will be taught, intended learning outcomes, the design of activities, overall structure and any resources required. Design is often overlooked, and many go directly past the planning phase in favour of the building/development phase. This is perfectly acceptable, especially for those on a path over a number of years, increasingly using the affordances of e-learning tools to complement their teaching and learning.

Often skipped, or under-resourced, are the steps within the planning phase for a blended or fully online course. This may have more substantial repercussions, as skipping design can lead to greater issues later on which may need to be revisited. Luckily, cyclic design methodologies (whether intentional or not) are no bad thing. It’s a little chicken and egg, and the lack of planning is often due to lack of time across the sector/universe.

Designing a good structure for the course is often one of the first tasks needed. The rest of the course should hang off the back of a good structure:

  • The structure will, particularly with a fully online course, define what needs doing, and when.
  • This is the guide for the students, the stick, the planner, the measure of success and the motivator to stay on track.
  • When thinking of how much time students will spend on tasks, when they will actually do them may have been overlooked.
  • A course overview/week-by-week structure is often where the planning of the chronology of the course, and the rhythm within it, starts and stops.

Learning design to incorporate time as a critical factor?

Not factoring in when a learner will engage with an online environment could increasingly become a bigger issue. This blog post was written in an attempt to identify how important that issue is.

Chrono-what?

  • Chrono – time
  • Gogy – lead
  • Pedagogy – to lead the child
  • Chronogogy / chronogogical – to lead by time, time-led

I felt that this might have significance, and anything of that nature would require promotion within the relevant fields for others to rip it apart, to build endurance into the concept. After looking for time-influenced learning design in conference proceedings, journals and blogs, I found nothing on the subject of time-based instructional/learning design or its impact. I had to put a term down to then build upon. Sorry if you don’t like it.

Using captured analytic data to measure ‘visit’ hits & drawing crazy ideas off the back of it

Learning analytics is an emerging field within education where data is used to inform the success, design and evaluation of online learning. In the simplistic model used here, we have taken Google Analytics visitor data for one month to try to identify whether we can see any trends that correlate with learning design. It’s a crude example, but the whole post is based on answering my ‘is this a thing?’ question (it’s bothered me for around six months).

Visits per day, as percentages, over three educational websites for February 2013
  • Website 1 – Learning Circuits – an interactive educational resource for 8-10 year old children (I made this a decade ago, still going)
  • Website 2 – UCL Moodle – an institutional online teaching and learning environment
  • Website 3 – UCL.ac.uk – the main UCL website, hosting information about the university.

These data show a regular path of activity for the number of visits to websites across the February time period. The websites are all of an educational nature, but differ in their intended target audiences. Y-axis shows the percentage of the monthly number of visits for that day. X-axis shows the day of the month. The chart clearly shows a rhythm in visits, going up and down in a pattern.
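
As a rough illustration, the sketch below shows one way such percentages could be computed: each day’s visits as a share of the month’s total. It is a minimal example, assuming a hypothetical CSV export from Google Analytics with ‘date’ and ‘visits’ columns; the file and column names are placeholders, not the actual exports used for these charts.

# Daily visits as a share of the monthly total (percentages, as plotted above).
import csv
from collections import OrderedDict

def daily_visit_percentages(csv_path):
    visits = OrderedDict()                      # date -> raw visit count, in file order
    with open(csv_path, newline='') as f:
        for row in csv.DictReader(f):
            visits[row['date']] = int(row['visits'])
    total = sum(visits.values())
    # e.g. 5.2 means that day received 5.2% of the month's visits
    return {day: round(100.0 * n / total, 1) for day, n in visits.items()}

# Hypothetical usage: daily_visit_percentages('moodle_feb2013.csv')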

The websites were selected for two reasons

  1. This is an educational observation (but it may be of interest to others if it rang true in other domains)
  2. These websites were the ones the author had access to for analytical information

We can study these data and make several observations across all three domains:
  • There is a distinct shape in visits.

    M-shape of activity

  • There is a regular drop at the weekend, with both days seeing less than half of the weekday visits
  • Saturday is the lowest point every week. Sunday is rarely much higher.
  • There is a slight drop on Wednesdays.
  • This month shows a heartbeat shape to the number of visits.
  • There is a slight shaping of an M over each week, whether for single websites or all three together (shown best in blue):
    • Sunday is the beginning point
    • Monday/Tuesday is the first highest
    • Wednesday shows some drop
    • Thursday marks the second peak
    • Friday is often slightly lower than its Monday or Tuesday counterpart, but still holds up the M-shape
    • Saturday is the lowest point of the week.

Repeating in other months?

February was chosen as it showed steady visits across three educational, but different, sites. Each site has a different busy period, as shown below:

Overview of the number of users for UCL Moodle and activity over the year

Overview of the number of users for Learning Circuits and activity over the year

 (Sorry no raw data for UCL.ac.uk)
Note: the M-shape persists across these charts. 

Sticking with February we looked at the same month for the past five years:

Average of visits, as a percentage, for each day, over five years across three sites

This chart shows the percentage of visits per day of the week, for three websites, over a five-year period. The purpose is to see whether the pattern shown in the first chart, for February 2013, is repeated over a longer period. The chart is organised by day of the week rather than by date, and it runs over fewer days because the first Monday of February falls on a different date each year, shortening the timeframe that can be evaluated. The chart shows Saturday having the lowest number of visits over the week, with Sunday resulting in a similar number. The M-shape is less pronounced, with Wednesday gaining more visits over the longer period. The heartbeat over the week, with peaks around Monday/Tuesday and Thursday/Friday, still shows the highest number of visits, especially when compared to weekends.
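
For the curious, here is one way such per-weekday averages could be reproduced: convert each day’s visits into a share of its own month, then average those shares by weekday across the exported files. A minimal sketch, assuming the same hypothetical CSV exports as above; it is not the exact process used for the chart.

# Average share of monthly visits per weekday, across several monthly exports.
import csv
from datetime import datetime
from collections import defaultdict

def weekday_averages(csv_paths):
    shares = defaultdict(list)                  # weekday name -> list of daily shares (%)
    for path in csv_paths:
        with open(path, newline='') as f:
            rows = [(datetime.strptime(r['date'], '%Y%m%d'), int(r['visits']))
                    for r in csv.DictReader(f)]
        total = sum(v for _, v in rows)
        for day, v in rows:
            shares[day.strftime('%A')].append(100.0 * v / total)
    # e.g. {'Saturday': 6.1, 'Monday': 17.4, ...}
    return {wd: round(sum(s) / len(s), 1) for wd, s in shares.items()}

# Hypothetical usage: weekday_averages(['moodle_feb2009.csv', 'moodle_feb2013.csv'])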

Out of Winter, across the year

Looking across a whole year, in this example 2012, we can see whether the pattern holds across all months and not just February.

One year of daily visits, as percentages, to UCL Moodle

And for Learning Circuits

One year of daily visits, as percentages, to Learning Circuits

These two charts show the average (in red) of the percentage of visits over the week. There is no longer an M-shape, but they do continue to show Saturday and Sunday with the lowest number of visits during the week. Wednesday becomes an increasingly popular day over the year and, for Learning Circuits, becomes the most popular day. (This might not have been helped by using mean values and a handful of disproportionately high points around week 40 of the year.) UCL Moodle has a similar pattern, with one result much higher than all the others – this is the first week of term in September, where the average for the month is very low initially, so in comparison that week is substantially higher. No chart exists for UCL.ac.uk – sorry.

Each of these two charts shows the number of visits to one of the two sites over a one-year period (2012). The intention here is primarily to show that the ‘heartbeat’ of online activity is regular across the year. There are low and high points, but when matched up to the charts above, showing each week’s average, they show that the data analysis – in particular Saturday and Sunday being quiet days – remains true across the year for both domains.

Quantitative vs qualitative

I wonder how long you’ve been thinking ‘he’s not measuring this data very well’. Firstly, I accept all contributions on this. Secondly, this is a desk-based observation, not a research proposal. Any next step would be a longitudinal study with an online course, proper data analysis and a real methodology. This is just an idea-incubation post, I’m afraid.

Discussion point

Much like the National Grid boosts the power network when an advertisement break is coming up in the nation’s favourite soap operas, could the same be said for a course designer planning their online learning? Perhaps not providing a boost, but instead being aware of, and planning for, peaks of online activity?

If, for example, I were planning an asynchronous activity for my learners, would I want to set it for Friday and hope it’s completed by Monday? When would be the best time to schedule this?

Most at the moment just set up a week-based activity and hope learners can manage their time effectively around it. However, if the data above can be read into, more people will be online during the week than at the weekend. Therefore the activity would be best planned over the week, but does this depend on the type of task? What about synchronous activities?

I appreciate this is half-baked but I wanted to share a few simple observations:

  1. Activity online is clearly displayed in analytical review of web access logs
  2. This activity seems to indicate a pattern of peaks and troughs, of a ‘heartbeat’ of online visitor activity (measured in days)
  3. Has time-led instructional design (I like the terms chronogogy of learning design, chronogogical instructional design or chronogogically informed teaching and learning) been undervalued/overlooked in past learning design models for online education?
  4. Does this have a wider impact for online education, including distance learning and MOOCs?

Next steps

I’ve got a few ideas:

  • Talk to fellow educators, technical and less so, and ask them if this really has an impact
  • Review course design, basic principles, feed into them the idea of time-based / chronogogical learning design
  • Expand upon this. We have a ‘Moodle archive’ – find a course with an activity like discussion forums and try to match up stored data with analytics information. Does anything correlate? (A rough sketch of this kind of check follows this list.)
  • Build it into a platform and measure the data points over a period of time, for a selection of courses
  • Fester in a basement for six years completing a part-time research project and slowly lose my mind over a somewhat trivial matter.
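
As a rough sketch of the ‘does anything correlate?’ idea above, the snippet below compares two aligned day-by-day series, such as forum posts from a Moodle archive and visits from analytics, using a Pearson correlation. The data sources and numbers are hypothetical; a real study would need proper data collection, anonymisation and a real methodology.

from math import sqrt

def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length series
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical week of data, aligned day by day (Mon-Sun):
posts_per_day  = [18, 15, 9, 16, 12, 2, 1]
visits_per_day = [1020, 950, 880, 990, 760, 300, 280]
print(pearson(posts_per_day, visits_per_day))   # a value near +1 means the two rise and fall together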

Closing

If analytic data suggests there is a ‘heartbeat’ of online activity, should this inform learning design? I’d like to hear your feedback. I’m going to keep looking into it; I just wanted to share some early thoughts with the internet and its people.

edit: sorry, a grammatically-correct friend provided me some advice on lead vs led. People are reading at least!

Innovating pedagogy – 2013 trends report

By Clive Young, on 26 September 2013

Many of you will be familiar with the annual Horizon reports, which describe emerging technologies likely to have an impact on learning and teaching. The current 2013 Higher Education report, for example, lists flipped classrooms, MOOCs, mobile apps and tablet computing as current areas of interest, with augmented reality, game-based learning, ‘the internet of things’ and learning analytics as areas to watch out for in the near future. Matt Jenner reflected on the history of Horizons’ trend spotting on this blog earlier this year in When does a technology no longer become a technology?

However there is now a new kid on the trend-spotting block. The Open University this month published the 2013 version of Innovating Pedagogy, a series of reports that started last year and which “explores new forms of teaching, learning and assessment for an interactive world, to guide teachers and policy makers in productive innovation“. What makes this particularly interesting is that it has a UK focus and outlines some areas that are being actively discussed in UK universities, UCL included, but which don’t feature (yet) in Horizon, though there is naturally some overlap.

The ten themes the OU pick this year are

  • MOOCs (of course)
  • Badges to accredit learning – an open framework for gaining recognition of skills and achievements
  • Learning analytics i.e. data-driven analysis of learning activities and environments
  • Seamless learning – connecting learning across settings, technologies and activities
  • Crowd learning – harnessing the local knowledge of many people
  • Digital scholarship – scholarly practice through networked technologies
  • Geo-learning – learning in and about locations
  • Learning from gaming – exploiting the power of digital games for learning
  • Maker culture – learning by making
  • Citizen inquiry – fusing inquiry-based learning and citizen activism

A fascinating and sometimes surprising list, and the report gives a quick overview of why the OU thinks these are or may be important and some links to further reading.

The potted Horizon Report

By Fiona Strawbridge, on 18 January 2012

Each year the ‘Horizon Report’ from the New Media Consortium tells us what’s up and coming in terms of technology in education. The preview has been released (you’ll need to register but it’s painless) – the full report is out next month but this is a useful summary. Here’s a potted version:

What’s coming:

  • This year: mobile apps and tablet computing
  • 2-3 years: game-based learning and learning analytics
  • 4-5 years: gesture-based computing & the ‘internet of things’ (small network aware smart physical objects)

Trends:

  • Moving education from providing information to helping students evaluate & make sense of it
  • Shift from F2F to online learning, providing sometimes better learning environments than in physical campuses
  • Need for faster and easier access to academic and social networks, and focus on just-in-time and ‘found’ learning
  • Expectations of cloud-based and device-independent applications and services
  • Move to challenge-based and active learning often using smart devices to connect curriculum with real life problems
  • Move to more collaborative ways of working – collective intelligence wins out over silos.  Use of GoogleDocs, wikis, Skype etc for teamwork and communication with the tool having a role in ‘immortalising’ the process and participants’ perspectives

Challenges:

  • Finding appropriate evaluation metrics, beyond citations etc – things like re-tweeting, tagging, mentions in blogs, reader ratings
  • Developing digital literacy skills – for students, and for staff who may not realise that their students need their help
  • Competition and economic pressures driving creative approaches such as streaming of introductory courses – but there is a need to engage students on a deeper level too
  • Institutional barriers to engaging with technologies – innovation with technology seen as outside scope of academics’ roles
  • New modes of scholarship are challenging institutional libraries and research managers as students and researchers use alternative sources of information and tools

All in all a good read – looking forward to the full version.

Image: http://www.flickr.com/photos/steveharris/3917314476/

NUS “New Technology in Higher Education Charter”

By Clive Young, on 11 October 2011

Through surveys and focus groups it has become clear that students increasingly regard the UCL online environment as part of their learning experience. Indeed much of the drive towards ‘total Moodle’ at UCL comes from perceptions of student demand: “it’s what learners expect nowadays”. Nevertheless it is sometimes difficult to determine exactly what those expectations are. It can be challenging – quite understandably – for students to describe their ‘model’ learning environment.

This may be set to change with the publication of the National Union of Students’ charter on Technology in Higher Education (2011), which has been developed after consultation with the sector and students. The charter, announced at the Future of Technology in Education (FOTE) conference last Friday by Emily-Ann Nash of the NUS Higher Education Committee, aims to set out “best practice for the use of technology in higher education, for teaching & learning and how technology can improve the student experience”.

The idea is that students and their respective student unions will be pushing to make sure technology is high on the institutional agenda so that “graduates are equipped and ready for the 21st century environment“.

The 10 points of the charter cover: clear ICT strategy, staff development, training and support for staff and students, accessibility, online administration, linking technology-enhanced learning and employability, investment in using technology to enhance learning and teaching, research into student demand and finally that technologies should enhance teaching and learning but not be used as a replacement for existing effective practice.

In a sense this is much what learning technologists across the sector have been lobbying for for years, but what is really significant is that the NUS here claims to articulate student expectations in quite a specific (and measurable) way. It will be interesting to see what effect it has on student demands.

The charter builds on the HEFCE commissioned NUS report Student Perspectives on Technology – demand, perceptions and training needs (2010).

What’s on the Horizon? The potted version

By Fiona Strawbridge, on 9 February 2011

The annual Horizon Report, published by the ‘New Media Consortium’, tries to predict which technologies are going to be important for higher education over the next 5 years. It’s actually a very good read. The 2011 edition is just out at: http://www.nmc.org/pdf/2011-Horizon-Report.pdf. Here’s a potted version:

Key trends, as in 2010, are:

  • Abundance of resources and relationships made accessible via the internet is challenging the roles of educators and institutions.
  • People expect to work wherever and whenever they want.
  • Increasingly collaborative world of work is prompting reflection on the nature of students’ projects.
  • Increasing use of cloud-based technologies and decentralised IT support.

Critical challenges:

  • Digital media literacy is a key skill for all, but it is neither well defined nor universally taught. The pace of technological change exacerbates the problem.
  • Difficulty of finding metrics for evaluating new forms of scholarly publishing.
  • Economic pressures and new educational models are challenging traditional models of the university.
  • Staff and students are struggling to keep up with the pace of technological change, and with information overload.

The report describes the top six ‘technologies to watch’. Open content and visual data analysis have disappeared, and new this year are game-based learning and ‘learning analytics’ (which looks intriguing). There are three ‘horizons’, and the report describes the technologies in detail and points to case studies.

Technologies to watch in the near term (12 months)

  • E-books, e-readers with note-taking facilities, some augmentation of functions to allow immersive experiences and social interaction.
  • Mobiles – increasingly users’ first choice for internet access.

Technologies to watch in 2-3 years

  • Augmented reality – layering info on top of representation of the real world: access to place-based information.
  • Game-based learning, from simple individual/small group games to massively multiplayer online games – ability to foster collaboration, problem solving, procedural thinking.

Technologies to watch in 4-5 years

  • Gesture-based computing
  • Learning analytics, along with data gathering and analysis tools to study student engagement, performance and practice in order to inform curriculum & teaching design and enhance the student experience.