Update on Jisc Learning Analytics

By Steve Rowett, on 23 October 2015

On Monday 19th October Steve Rowett attended the 4th Jisc Learning Analytics Network meeting in Bradford. Jisc have been running their learning analytics R&D project for just over a year now and their plans are really starting to take shape. The aim is to provide a learning analytics service for the HE/FE/skills sector that – at least at a basic level – institutions can take off the shelf and start using. There are also likely to be more sophisticated premium offerings from the vendors involved. The components of this solution are becoming well defined and I report on these below.

The architecture of their system looks quite complex, but it separates into distinct elements for storing, processing and displaying data back to various audiences. For many of the components they are working with multiple vendors to offer alternative options.

Already produced are a number of documents including a state of play review, literature review and code of practice – all available from http://analytics.jiscinvolve.org/wp/

Now the technical systems are in development. Michael Webb from Jisc focused on the systems for student and staff use rather than back-end processing. These include:

The staff dashboard or ‘student insight tool’

This is currently 98% ready and due to be launched in November 2015. The tool has a simple purpose: to predict withdrawals and dropouts amongst the student cohort. The system is pre-loaded with three years of student data, which it analyses to identify the metrics that increase the probability of withdrawal. It then applies this model to current trends.

The interface starts at programme level, and allows drill-down to modules and then to individual students, giving a ‘withdrawal risk score’ for each. The types of data that the system can handle are:

  • enrolment and demographic data (essentially things the student can do nothing about): previous exam results, education history, school/college, demographic data
  • engagement (things the student can change): library/campus visit activity, books or journals used, VLE data, attendance data, student society membership, submission time before deadlines
  • academic performance (the outcomes): coursework and exam performance.

Current estimates are that the predictions are about 70% accurate, although this will need to be validated by early adopter institutions, and accuracy may improve as the models are developed further.
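Jisc hasn’t published the internals of the prediction model, but the basic pattern – fit a classifier to the historical cohorts whose outcomes are known, then score the current cohort – is standard. A minimal sketch in Python, with entirely hypothetical column names standing in for the enrolment, engagement and performance measures listed above:

```python
# Illustrative sketch only: Jisc has not published the model internals.
# Assumes a pandas DataFrame with feature columns like those listed above
# and a 'withdrew' outcome flag for the historical cohorts.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical feature set: demographic, engagement and performance measures.
FEATURES = ["prior_attainment", "library_visits", "vle_logins",
            "attendance_rate", "coursework_avg"]

def fit_withdrawal_model(history: pd.DataFrame) -> LogisticRegression:
    """Fit a simple classifier to past cohorts (outcome column: 'withdrew')."""
    X_train, X_test, y_train, y_test = train_test_split(
        history[FEATURES], history["withdrew"], test_size=0.25, random_state=0)
    model = LogisticRegression(max_iter=1000)
    model.fit(X_train, y_train)
    # Held-out accuracy: roughly the role of the ~70% figure quoted above.
    print(f"held-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.2f}")
    return model

def withdrawal_risk_scores(model: LogisticRegression, current: pd.DataFrame) -> pd.Series:
    """Apply the fitted model to current students: probability of withdrawal."""
    return pd.Series(model.predict_proba(current[FEATURES])[:, 1],
                     index=current.index, name="withdrawal_risk")
```

Scores produced this way can then be aggregated upwards, which is essentially what the programme → module → student drill-down in the dashboard presents.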

Alert and Intervention system (the ‘Student Success Planner’)

Currently 70% ready and expected to launch in Q1 2016. This system provides alerts to nominated contacts (typically a student’s personal tutor or welfare officer) based on pre-determined criteria. The alert is reviewed and any appropriate intervention can then be taken. It is expected that around 10 criteria would be sufficient for most cases. At the moment the focus is on intervening to support students in difficulty, but it could also be used to offer reward or praise mechanisms for behaviour likely to lead to successful outcomes.
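The criteria will be configured by each institution, so the rules below are hypothetical, but the mechanics amount to a small rule engine: each rule inspects a student record, and any triggered rules are routed to the nominated contact for review. A rough sketch:

```python
# Illustrative sketch of a rule-based alert check; the actual Student Success
# Planner criteria are set per institution, so these rules are hypothetical.
from dataclasses import dataclass
from typing import Callable

@dataclass
class AlertRule:
    name: str
    triggered: Callable[[dict], bool]  # takes a student record, returns True/False

# Around ten rules of this kind are expected to cover most cases.
RULES = [
    AlertRule("no VLE login for 14 days", lambda s: s["days_since_vle_login"] > 14),
    AlertRule("attendance below 50%", lambda s: s["attendance_rate"] < 0.5),
    AlertRule("missed submission deadline", lambda s: s["missed_deadlines"] > 0),
]

def alerts_for(student: dict) -> list[str]:
    """Return the names of any rules this student record triggers, for review
    by the nominated contact (e.g. personal tutor) before any intervention."""
    return [rule.name for rule in RULES if rule.triggered(student)]
```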

Learning records warehouse (‘Learning locker’)

Nearly complete, with launch imminent. This is the back-end system which stores learner activity in a simple but powerful form: X verb Y at time Z. For example ‘Paul visited library’ or ‘Irrum completed coursework’. This provides the data used to build the models.
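The ‘X verb Y at time Z’ form is the actor–verb–object pattern used by xAPI statements, which is what Learning Locker stores as a learning record store. Roughly, with made-up names and identifiers rather than anything taken from the Jisc service:

```python
# Rough shape of an "X verb Y at time Z" activity statement as stored in an
# xAPI learning record store such as Learning Locker. Names, email and IDs
# are illustrative only.
statement = {
    "actor": {"name": "Paul", "mbox": "mailto:paul@example.ac.uk"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/attended",
             "display": {"en-GB": "visited"}},
    "object": {"id": "https://example.ac.uk/library",
               "definition": {"name": {"en-GB": "Library"}}},
    "timestamp": "2015-10-19T09:30:00Z",
}
```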

Student app 

Currently still being designed. Borrowing concepts from fitness apps, the student app provides feedback (“you are in the top 25%” etc.) and encouragement, allowing students to set targets and monitor progress (“I will complete 10 hours reading this week”). It also allows self-declared data such as demographics or even general happiness to be added.
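Since the app is still being designed, the following is only an illustration of the kind of calculation behind a “you are in the top 25%” message and a self-set reading target, not the app’s actual logic:

```python
# Illustration only: data shapes and thresholds are assumptions, not the app's API.

def cohort_percentile(my_hours: float, cohort_hours: list[float]) -> float:
    """Fraction of the cohort whose logged activity is at or below this student's."""
    return sum(h <= my_hours for h in cohort_hours) / len(cohort_hours)

def target_progress(logged_hours: float, target_hours: float = 10.0) -> str:
    """Progress against a self-set target, e.g. '10 hours reading this week'."""
    return f"{logged_hours:.1f} of {target_hours:.0f} hours completed"

pct = cohort_percentile(8.0, [2.0, 5.0, 6.5, 8.0, 12.0])
print(f"You are ahead of {pct:.0%} of your cohort this week")  # -> 80%
print(target_progress(6.5))                                    # -> 6.5 of 10 hours completed
```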


The presentations from the event will shortly be on the Jisc Learning Analytics blog, and they have also developed some Moodle courses with more information – self-register and find the LA-prefixed courses.

My thoughts were that Jisc is moving quickly here, and these products are really taking shape. However, much of the day focused on the legal, cultural and ethical issues around learning analytics; the technical readiness of our institutions to integrate so many different data sources; and the (perhaps) high levels of variability in the types of models that would best predict outcomes in very different institutions. The technology looks fairly straightforward, but every institution that wishes to adopt these tools will need to choose their own focus and rules of engagement that work best for them and their students.

Update: Notes and presentations from this event are now available, along with a fuller description of other presentations and activities during the day.

3 Responses to “Update on Jisc Learning Analytics”

  1. Matt Jenner wrote on 26 October 2015:

    Am I right in thinking the JISC work/tools are mostly aligned towards campus-based and blended learning undergraduate students? I’d expect it is difficult to build a profile of a postgraduate student who is only with a university for one year (typically). Part time, mixed-mode or highly blended/distance courses might produce data over a longer period of time but I expect the nature of that kind of flexibility only makes it more difficult to interpret?

    Regardless, this is a good start – I’m liking the idea of the learner records warehouse – a simple idea that should produce useful, reusable data.

  2. Steve Rowett wrote on 29 October 2015:

    Hi Matt. I don’t think the tools are aimed at specific student groups, but the nature of the models used and the variables that are significant might well change between different student groups (clearly some measures like library barrier access will not be relevant to distance learners, but VLE engagement metrics might be more so).
    The thing about these profiles is that they are system-generated. You basically throw in a huge amount of data about previous years’ students and their outcomes and it fits the model to those outcomes, identifying which variables matter and which ones don’t. There’s not really a lot of human intervention involved.

  3. From Bricks to Clicks: the potential for learning analytics | UCL Digital Education team blog wrote on 9 February 2016:

    […] blogged previously about the work that Jisc are doing in the field of learning analytics. Whilst there are some good […]
