Digital Education team blog
  • We support Staff and Students using technology to enhance education at UCL.

    Here you'll find updates on institutional developments, projects we're involved in, updates on educational technology, events, case studies and personal experiences (or views!).

    Subscribe to our elearning newsletters.


    Archive for the ‘Rowett’s Ramblings’ Category

    8th Jisc Learning Analytics Network

    By Stephen Rowett, on 7 November 2016

    The Open University was the venue for the 8th Jisc Learning Analytics Network. I’d not been there before. It was slightly eerie to see what was a clearly recognisable university campus but without the exciting if slightly claustrophobic atmosphere that thousands of students provide. I won’t report on everything, but will give some highlights most relevant to me. There’s more from Niall Sclater on the Jisc Learning Analytics blog.

    The day kicked off with Paul Bailey and Michael Webb giving an update on Jisc’s progress. Referring back to their earlier aims they commented that things were going pretty much to plan, but the term ‘learner monitoring’ has thankfully been discarded. Their early work on legal and ethical issues set the tone carefully and has been a solid base.

    Perhaps more clearly than I’ve seen before, Jisc have set their goal as nothing less than sector transformation. By collecting and analysing data across the sector they believe they can gain insights that no one institution could alone. Jisc will provide the central infrastructure including a powerful learning records warehouse, along with some standardised data transformation tools, to provide basic predictive and alerts functionality. They will also manage a procurement framework for institutions who want more sophistication.

    The learning records warehouse is a biggie here – currently with 12 institutions on board and around 200 million lines of activity. Both Moodle and Blackboard have plug-ins to feed live data in, and code for manipulating historic data into the right formats for it.

    Paul and Michael launched a new on-boarding guide for institutions at https://analytics.jiscinvolve.org/wp/on-boarding – a 20-step checklist to getting ready for learning analytics. Step 1 is pretty easy though, so anyone can get started!

    Bart Rienties from the Open University showed again how important learning analytics is to them and how powerfully they can use it. Mapping all of the activities students undertake into seven different categories (assimilative, finding and handling information, communication, productive, experiential, interactive/adaptive, assessment) gives dashboards allowing course designers to visualise their courses. Add in opportunities for workshops and discussion and you have a great way of encouraging thinking about course design.
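    As a rough illustration of that mapping, the sketch below tallies some hypothetical course activities into the seven categories, the way a course designer’s dashboard might. The activity list and hours are invented for this example; this is not the OU’s actual tooling.

```python
# The seven OU Learning Design categories mentioned in the post.
CATEGORIES = [
    "assimilative", "finding and handling information", "communication",
    "productive", "experiential", "interactive/adaptive", "assessment",
]

# Hypothetical activities: (description, category, planned hours).
activities = [
    ("watch lecture recording", "assimilative", 2.0),
    ("literature search", "finding and handling information", 1.5),
    ("discussion forum task", "communication", 1.0),
    ("write lab report", "productive", 3.0),
    ("end-of-module quiz", "assessment", 0.5),
]

def workload_profile(acts):
    """Sum planned hours per category and return each as a % of the whole."""
    totals = {c: 0.0 for c in CATEGORIES}
    for _, category, hours in acts:
        totals[category] += hours
    grand_total = sum(totals.values())
    return {c: round(100 * h / grand_total, 1) for c, h in totals.items()}

profile = workload_profile(activities)
print(profile)  # e.g. 'productive' dominates this invented course
```

    A designer seeing one category dominate (or sit empty) can then rebalance the course, which is exactly the kind of conversation the OU dashboards are meant to prompt.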

    Interestingly, Bart reported that there was no correlation between retention and satisfaction. Happy students fail and unhappy students pass, and vice versa. Which begs the question – do we design courses for maximum retention, or for maximum satisfaction, because we can’t have both!

    Andrew Cormack, Chief Regulatory Advisor at Jisc, gave an update on legal horizons. The new General Data Protection Regulation has already been adopted and applies from 25 May 2018. For a complex issue, his presentation was wonderfully straightforward. I shall try to explain more, but you can read Andrew’s own version at http://www.learning-analytics.info/journals/index.php/JLA/article/view/4554 [I am not a lawyer, so please do your own due diligence].

    Much of the change in this new legislation involves the role of consent, which is downplayed somewhat in favour of accountability. The logic runs roughly as follows:

    • We have a VLE that collects lots of data for its primary purpose – providing staff and students with teaching and learning activities.
    • We have a secondary purpose for this data, which is improving our education design and helping and supporting learners, and we make these explicit upfront. We might also state things that we won’t do, such as selling the data to third parties.
    • We must balance any legitimate interest we have in using the data collected against any risks of using the data that the data subject might face. But note that this risk does not need to be zero in order for us to go ahead.
    • Andrew distinguished between Improvements (that which is general and impersonal, e.g. the way a course is designed or when we schedule classes) and Interventions (which go to an individual student to suggest a change in behaviour). The latter needs informed consent; the former can be based on legitimate interest. He also suggested that consent is better asked later in the day, when you know the precise purpose for the consent.
    • So for example in a learning analytics project, we might only obtain consent at the first point where we intervene with a given student. This might be an email which invites them to discuss their progress with the institution, and the act of doing so gives consent at the same time.

    You can follow Andrew as @Janet_LegReg if you want to keep up with the latest info.

    Thanks to Jisc for another really good event, and apologies to those I haven’t written about – there was a lot to take in!

    A next generation digital learning environment for UCL

    By Stephen Rowett, on 7 November 2016

    At UCL we’ve been pondering for about two years now what a future learning environment might look like. And we are starting to reach some conclusions.

    Our analysis of our VLE – and pretty much all of them out there – is that it suffers from two fundamental limitations.

    Silos – staff and students see the courses they are enrolled for, and generally can’t look over the fence to see something else. In real life, if a student asked to attend lectures for a course they weren’t registered for, we’d welcome their interest, their breadth, their love of learning. In the VLE we tell them that this is impossible. The VLE limits a student’s education to just what they have paid for, just what they deserve, and just what they need to know. All curiosity is lost.

    Control – the teacher sets things up and students do them. No questions asked or even allowed. Forums lie devoid of posts for fear of asking ‘dumb’ questions, or fear of making mistakes. Assignments are submitted out of perfunctory duty, the best a student can hope for being a green pass on Turnitin and some feedback some weeks later, which is ignored anyway as the triumph or the disappointment of the grade awarded is processed. All love of learning is lost.

    So we’re looking for something different.

    And our inspiration came from an interesting place – Brockenhurst College in Hampshire. Now they have a very rural catchment area – some students travel over from the Isle of Wight to attend classes. So of course, when they don’t have classes they don’t travel in, and therefore feel disconnected from the college.

    We realised that part of the challenge at UCL is the same. The distances may be much smaller, but travelling from home or commuting on the Central Line means that the disconnection is just as real.

    So we need an environment that promotes connections. It just so happens we also have the Connected Curriculum initiative which will encourage interdisciplinary research-based education, where students do real, authentic work, not just essays for a teacher to mark. Where group work is the norm, not the exception. Where students are not passive recipients, but actively engaged in enquiry.

    So it’s all coming together. What we want for UCL is an Academic Social Network.

    What do I mean by that? Let’s take each word at a time.

    First, it’s Academic. That means it is designed for education. There are plenty of social networks around – Facebook, LinkedIn and Yammer spring to mind – but they are designed for different things, typically business. Whether it means allowing people to ask questions anonymously, embedding LaTeX in messages so mathematicians can speak in their own language, or structuring data to be able to find final-year projects, the platform needs to speak to teachers and students as being something for them. It’s about work, but also all of the other things that happen at university: social clubs, sports, societies, volunteering. It’s a safe and trusted place to be because the user trusts the university and knows they are not the product to be sold and re-sold to the highest bidder.

    It’s Social. Because learning is social. I don’t just mean group work, but the full gamut of human social interaction. If you talk to students in our learning spaces, they are often working ‘alone, together’; that is they are doing individual tasks but just looking after each other. A student who is tired will be offered a coffee; someone will look after your laptop while you go to the toilet. Students are friends with each other on Facebook, but having staff friends is just ‘weird’. We want a space without complex meanings or difficult relationships but where everyone can connect with each other as part of the university community.

    Finally it’s a Network. Universities are big places, and UCL is bigger than most. Networks are a place where you can meet like-minded folk, but also get exposure and understanding of those who study different things, think in different ways, have different approaches to the same challenge. That network extends beyond current staff and students to pre-entry students, alumni, industry and charity partners – all of those that have a stake in the vibrancy and excitement of what a university can be.

    So what are we going to do?

    We’re going to get one.

    That’s quite a lot of work, as we have to do a lot of procurement activities to get what we want.

    But for now, we have students and teachers on the ground talking to peers, understanding needs, working out what it means to be part of the UCL community.

    We’ve done a lot of thinking, some talking and even more listening. It’s an experiment. We don’t know if it will work. Even if it does, it will probably take many years.

    We characterise what we want as follows:

    [Image: Characteristics of our platform]

    It’s our shot at what a Next Generation Digital Learning Environment will look like.

    Many thanks to Eileen Kennedy for her work in developing and evaluating these ideas within UCL Digital Education.

    Sharing data, sharing experiences

    By Stephen Rowett, on 26 April 2016

    The thing we love most in Digital Education is working with our students, hearing their ideas and seeing what they can achieve.

    The UCLU Technology Society recently approached us about an API for accessing information at UCL – what is available now and what more could be made available in the future. It’s a difficult question, as we don’t own most of the data we process and don’t have the right to just make it available. But with the UK topping the Open Data Barometer, it’s sure to be a question we will have to face.

    The leaders in this field in the UK are Southampton and Oxford. Four members of TechSoc and I recently visited Ash Smith and Chris Gutteridge at Southampton, who have done tons of work in opening up university data – everything from buildings to catering.

    TechSoc have written up the visit as a blog post, so I’ll do no more than link to their report of the visit.

    From Bricks to Clicks: the potential for learning analytics

    By Stephen Rowett, on 9 February 2016

    I’ve blogged previously about the work that Jisc are doing in the field of learning analytics. Whilst there are some good case studies within the sector, informal conversations have indicated that most institutions are really only at the start of their analytics journey, or even simply keeping a watching brief on how the sector as a whole will act. Where institutions do have systems in place, they are often based on quite limited data sources (typically attendance data, VLE usage or library usage) rather than more holistic data sets covering a range of student experiences.

    A comprehensive picture of the current state of play is provided by From Bricks to Clicks: the Potential of Data and Analytics in Higher Education, a Higher Education Commission report which summarises the field and provides recommendations to institutions. A small number of pioneering institutions (Nottingham Trent, Open, Edinburgh) feature heavily as case studies, but the general argument is that universities are generating significant amounts of data about learning but are not yet in a position to use this data to support student success.

    At UCL, early discussions around the use of analytics have started. Our retention rates are generally good, but there is a feeling that students may leave their course due to social or economic factors – perhaps living in poor accommodation, feeling isolated, having financial difficulties or commuting into London. We think we might need quite a large dataset to model these parameters (if they can be modelled at all) although it is possible that attendance would be a good proxy for them. Certainly our journey into learning analytics is only just beginning.

    Are we using technology effectively to support student employability?

    By Stephen Rowett, on 19 January 2016

    Employability is something of the elephant in the room in higher education. We dream of students enthralled at learning new knowledge, making discoveries of their own as they develop their curiosity and strengthening their identities as they work with others.

    For many of course, the reality is that they are undertaking their programme of study to get a good ‘job’ at the end. I use quotation marks because the nature of the ‘job’ may be wide and varied: it might be traditional employed work; self-employment; voluntary work; portfolio working; or a combination of these.

    Jisc Technology for Employability report

    Jisc has been exploring the role that digital technologies, and the digital literacies needed to use them effectively, can play in developing employability. Peter Chatterton and Geoff Rebbeck have recently produced a detailed report on the topic on behalf of Jisc. They argue that technology is often woefully underexploited when it comes to giving students the opportunity to develop their professional skills and that both staff and student skill development will be necessary to close this gap.

    An introduction to the report is available or you can download the full report from the Jisc website. A webinar summarising the report will be held on 25 January 2016, with free registration.

    Update on Jisc Learning Analytics

    By Stephen Rowett, on 23 October 2015

    On Monday 19th October Steve Rowett attended the 4th Jisc Learning Analytics Network meeting in Bradford. Jisc have been running their learning analytics R&D project for just over a year now and their plans are really starting to take shape. The aim is to provide a learning analytics service for the HE/FE/skills sector that – at least at a basic level – institutions can take off the shelf and start using. There are also likely to be more sophisticated premium offerings from the vendors involved. The components of this solution are becoming well defined and I report on these below.

    The architecture of their system looks quite complex, but focuses on different elements for storing, processing and displaying back data to various audiences. For many of the components they are working with multiple vendors to offer alternative options.

    Already produced are a number of documents including a state of play review, literature review and code of practice – all available from http://analytics.jiscinvolve.org/wp/

    Now, the technical systems are in development. Michael Webb from Jisc focused on systems for student and staff use rather than back-end processing, and these include:

    The staff dashboard or ‘student insight tool’

    This is currently 98% ready and due to be launched in November 2015. The tool has a simple purpose: to predict withdrawals and dropouts amongst the student cohort. The system is pre-loaded with 3 years of student data which it analyses to identify the metrics that increase the probability of withdrawal. It then applies this model to current students.

    The interface starts at programme level, and allows drill-down to modules and then to individual students, giving a ‘withdrawal risk score’ for each. The types of data that the system can handle are:

    • enrolment and demographic data (essentially things the student can do nothing about): previous exam results, education history, school/college, demographic data
    • engagement (things the student can change): library/campus visit activity, books or journals used, VLE data, attendance data, student society membership, submission time before deadlines
    • academic performance (the outcomes): coursework and exam performance.

    Current estimates are that the predictions are about 70% accurate, although this will need to be validated by early adopter institutions and may improve as the models are further developed.
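    To make the idea of a ‘withdrawal risk score’ concrete, here is a minimal sketch of the kind of scoring such a tool might apply. The feature names, weights and bias below are entirely invented for illustration; Jisc’s actual model is fitted to three years of historical student data, not hand-written like this.

```python
import math

# Invented weights for illustration only (not Jisc's trained model).
# Negative weight = more of this behaviour lowers the risk score.
WEIGHTS = {
    "vle_logins_per_week": -0.15,
    "library_visits_per_week": -0.10,
    "late_submissions": 0.60,
    "prior_module_fails": 0.80,
}
BIAS = -1.0

def withdrawal_risk(features: dict) -> float:
    """Return a 0-1 withdrawal risk score for one student's features."""
    z = BIAS + sum(w * features.get(name, 0.0) for name, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))  # logistic squash to a probability

engaged = {"vle_logins_per_week": 10, "library_visits_per_week": 4}
disengaged = {"vle_logins_per_week": 1, "late_submissions": 3,
              "prior_module_fails": 2}

print(withdrawal_risk(engaged), withdrawal_risk(disengaged))
```

    The drill-down interface described above would simply sort students by this score within a module or programme; the hard part in practice is choosing and validating the features, which is where the 70% accuracy estimate comes from.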

    Alert and Intervention system (the ‘Student Success Planner’)

    Currently 70% ready, expected to launch in Q1 2016. This system provides alerts to nominated contacts (typically a student’s personal tutor or welfare officer) based on pre-determined criteria. The alert is reviewed and any appropriate intervention can then be taken. It is expected that around 10 criteria would be sufficient for most cases. At the moment the focus is on intervening to support students in difficulty, but it could also be used to offer reward or praise mechanisms for behaviour likely to lead to successful outcomes.

    Learning records warehouse (‘Learning locker’)

    Nearly complete, launch imminent. A back-end system which stores learner activity in a simple but powerful form: X verb Y at time Z. For example ‘Paul visited library’ or ‘Irrum completed coursework’. This provides the data used to build the models.
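    That ‘X verb Y at time Z’ form is essentially an xAPI statement, which is the format Learning Locker stores. A minimal sketch of building one such record in Python follows; the `example.ac.uk` domain and the specific verb URI are made up for illustration, and real deployments would use registered verb identifiers and proper activity IDs.

```python
import json
from datetime import datetime, timezone

def make_statement(actor: str, verb: str, obj: str) -> dict:
    """Build a minimal xAPI-style 'actor verb object at time' record.

    Field names follow the xAPI statement structure; the URIs below are
    invented placeholders, not registered identifiers.
    """
    return {
        "actor": {"name": actor,
                  "mbox": f"mailto:{actor.lower()}@example.ac.uk"},
        "verb": {"id": f"http://example.ac.uk/verbs/{verb}",
                 "display": {"en-GB": verb}},
        "object": {"id": f"http://example.ac.uk/activities/{obj}",
                   "definition": {"name": {"en-GB": obj}}},
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

# 'Paul visited library', as in the post.
stmt = make_statement("Paul", "visited", "library")
print(json.dumps(stmt, indent=2))
```

    Because every record has the same actor/verb/object/timestamp shape, the warehouse can accept activity from Moodle, Blackboard, library gates or attendance systems alike, which is what makes it useful as the single source feeding the predictive models.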

    Student app 

    Currently still being designed. Borrowing concepts from fitness apps, the student app provides feedback (“you are in the top 25%” etc.) and encouragement, allowing students to set targets and monitor progress (“I will complete 10 hours reading this week”). It also allows self-declared data such as demographics or even general happiness to be added.


    The presentations from the event will shortly be on the Jisc Learning Analytics blog, and they have also developed some Moodle courses with more information – self register and find the LA-prefixed courses.

    My thoughts were that Jisc is moving quickly here, and these products are really taking shape. However, much of the day focused on the legal, cultural and ethical issues around learning analytics; the technical readiness of our institutions to integrate so many different data sources; and the (perhaps) high levels of variability in the types of models that would best predict outcomes in very different institutions. The technology looks fairly straightforward, but every institution that wishes to adopt these tools will need to choose their own focus and rules of engagement that work best for them and their students.

    Update: Notes and presentations from this event are now available, along with a fuller description of other presentations and activities during the day.