Digital Education team blog
• We support staff and students using technology to enhance education at UCL.

    Here you'll find updates on institutional developments, projects we're involved in, updates on educational technology, events, case studies and personal experiences (or views!).


    Archive for the 'Our Views' Category

    TechQual+ Survey at UCL

    By Moira Wright, on 13 October 2017

    In early 2016, ISD (Information Services Division) carried out the first Staff and Student IT Survey using TechQual+. Over 1,000 of you completed the survey, and over the past 16 months we have been working hard to improve our services in response to your comments.

    Below are just a few examples of changes that have been made as a result of the feedback received from the TechQual+ survey run in 2016:

Wi-Fi

We have made a substantial investment in replacing and upgrading our Wi-Fi infrastructure.

    Service Desk

    We’ve invested in staffing, tools and training to speed up response times and improve quality.

We’ve partnered with an external organisation and altered shift patterns to provide additional out-of-hours support.

Printing

    We’ve rolled out 170+ additional printers over the past 18 months, targeting the busiest areas. This takes the current total to 660 printers. In areas of high usage, we’ve introduced new high capacity printers.

    Infrastructure

We have invested in storage, and all staff and students can now store 100GB for free.

    Computers

    We are continuing to invest in additional cluster PCs, and loan laptops where there isn’t space for desktops. We added a further 550 desktops and 60 laptops by September 2017.
    We operate one of the largest laptop loan services across UK universities – 266 laptops across 12 locations – and this year a further 60 laptops were added.

    Training

We delivered 221 courses last academic year – nearly 1,000 hours of training, with about 3,000 people attending. We are working hard to publicise the courses we offer.

    Audio Visual

In 2016 ISD invested £2.5m into improving the technology in teaching facilities. Approximately 70 centrally bookable spaces had their facilities updated; this included bringing 43 spaces in 20 Bedford Way up to the standard spec, including installation of Lecturecast in approximately 30 spaces. Lecturecast was also installed at 22 Gordon Street and Canary Wharf (three spaces each), and we refreshed the Lecturecast hardware in 12 rooms.



Based on the findings of focus groups at participating institutions, the TechQual+ project has articulated a set of generalised IT service outcomes that faculty, students and staff within higher education expect of IT organisations. The TechQual+ core survey contains 13 items designed to measure performance against three core commitments: 1) Connectivity and Access, 2) Technology and Collaboration Services, and 3) Support and Training.
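As an aside for the curious: TechQual+ items follow the SERVQUAL-style ‘zone of tolerance’ approach, asking respondents to rate each service three times – minimum expected, desired and perceived level – typically on a nine-point scale. A minimal sketch of how those ratings become the gap scores reported back to IT departments (the field layout below is illustrative, not the official TechQual+ export format):

```python
# Zone-of-tolerance gap scoring, sketched for one survey item.
# Each respondent gives (minimum expected, desired, perceived) on a 1-9 scale.
from statistics import mean

responses = [
    (5, 8, 6),
    (4, 9, 7),
    (6, 8, 5),
]

adequacy_gap = mean(p - mn for mn, _, p in responses)    # perceived - minimum
superiority_gap = mean(p - d for _, d, p in responses)   # perceived - desired

# A positive adequacy gap means the service exceeds minimum expectations;
# a superiority gap near zero means it approaches the desired level.
print(f"adequacy gap: {adequacy_gap:+.2f}, superiority gap: {superiority_gap:+.2f}")
```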

    The TechQual+ survey will be run again at UCL in December 2017 and we’ll be asking for your help to advertise it to your students, encouraging them (and you!) to complete it. All respondents will be entered into a prize draw with a chance to win some great prizes!

    We’ll be providing more information and communications about the survey closer to the opening date.

     

Sneak a peek at the new (more accessible) UCL Moodle theme

    By Jessica Gramp, on 9 October 2017

As part of a wider Accessible Moodle project, a new UCL Moodle theme is being designed to make Moodle more accessible for those with disabilities. The new theme will be rolled out to all staff and students in the next major upgrade of UCL Moodle in summer 2018. However, we plan to pilot the new theme with students and staff beforehand, and once we are confident it works as intended, we will give everyone the option of switching to the new theme in advance of it becoming the default theme for UCL Moodle.

The theme will change the look and feel of all Moodle pages and provide additional navigation aids in the form of menus, blocks that can be hidden, and potentially also docked blocks, which sit to the left of the page for easy access. The Moodle theme is applied per user account, which means that during the pilot period some people will be using the new theme while others use the existing one. In summer 2018 everyone will be switched to the new theme automatically as part of the UCL Moodle summer upgrade. The theme is not to be confused with Moodle course formats, which allow you to change the way a Moodle course is laid out.

I wrote earlier on how the new theme will address accessibility issues. A number of staff across UCL provided feedback on the proposed theme and, after a number of iterations, we have now agreed on a design that first and foremost meets the needs of staff with particular disabilities, as well as being more usable for everyone. As well as working with individuals who participated in the project’s initial focus groups, the E-Learning Champions were given the opportunity to feed in their comments on the proposed theme and forward it to interested colleagues.

The proposed new UCL Moodle theme, showing the collapsed topics format.

We had contemplated a pink theme; however, blue proved to be a better option for a number of staff with particular disabilities. The blue version was also more popular with staff without disabilities. The design below shows how the tabbed course format will look, but with blue instead of pink tabs, menus and links.

The tabbed course format; the pink tabs, menus and links will be blue.

    The UCL Moodle homepage will be simplified and will provide more space for news relating to teaching and learning at UCL. The menus will be blue instead of the pink shown in the design below.

The new, more accessible UCL Moodle homepage; menus will be blue instead of the pink shown.

    The Accessible Moodle project team at UCL worked closely with designer Ralph Bartholomew from St Albans Web Design and developer Pat Lockley from Pgogy Webstuff to implement the new theme.

    If you have any questions or comments about the new theme, or would like to be involved in the pilot, please contact Jessica Gramp.

    Jisc student digital tracker 2017 and BLE consortium – UCL report available

    By Moira Wright, on 11 September 2017

The UCL report on the data collected from the Jisc student digital tracker survey (see my previous post on this) is now available. The survey was jointly conducted by Birkbeck, LSHTM, RVC, SOAS and UCL back in March. Following a workshop in July, and using the Jisc national survey results as a benchmark, we have been able to draw some conclusions and make recommendations regarding the digital experiences of our students, based on the survey responses.

You can read more about the BLE consortium on page 18 of the ‘Jisc insights from institutional pilots 2017’ report: http://repository.jisc.ac.uk/6671/1/Tracker2017insights.pdf

Please note Appendix C is available on request (moira.wright@ucl.ac.uk).

    Download (PDF, 820KB)

    Download (PDF, 98KB)

    Download (PDF, 246KB)

     

    What I saw at ALTC 2017

    By Mira Vogel, on 8 September 2017

I’ve been at ALTC, the Association for Learning Technology Conference 2017. To come: a harder piece to write where I make sense of it all – but for now I’m going to summarise each session I attended, mainly because I really enjoyed hearing from everyone else about what they went to. Incidentally, the keynotes and all of the sessions which took place in the largest room are available to watch on ALT’s YouTube channel (where there will hopefully be a playlist in due course).

    Day 1

Bonnie Stewart, a keynote speaker from a non-traditional background, spoke about the exclusions which ensue from only planning for norms. Among the many insights she shared was Ronald Heifetz’s: actively distinguish between problems which technology can solve and problems which require humans to adapt their behaviour.

Helen Walmsley-Smith introduced eDAT, a tool for analysing the content of online learning activity designs. The data can then be analysed alongside feedback and retention data, allowing a learning design to be evaluated and successful types in different contexts to be identified. eDAT is freely available. There are early signs that interactivity is related to improved retention.
Emma Mayhew and Vicki Holmes from Reading described the shift from paper-based to digital assessment processes, part of a major programme of EMA (electronic management of assessment) funding. With eight academic and student secondees, they aim to improve each part of the cycle, from better awareness at the ‘Setting’ stage to better monitoring of progress at the ‘Reflection’ stage. They found that the idea of ‘consistency’ was problematic and might refer to satisfaction rather than practices. Their review of other institutions found that the most successful outcomes were in institutions which consulted carefully.
Peter Alston (Liverpool) discussed how ‘the academy’ does not mean the same thing when it discusses e-assessment, highlighting the differences between professional services and academic perspectives. He adopted Whitchurch’s (2008) ‘third space’ approach, examining the contestation, reconciliation and reconstruction (Whitchurch 2010) around practices, rules, regulations and language.
Why are the rates of e-submission and feedback at the University of Essex so high? Ben Steeples looked back at a decade of electronic submission and feedback on a platform built in-house, which designed out a number of problems affecting other platforms. Maintaining the in-house system costs £75k a year, but the integrations with, for example, the calendar and student records systems are excellent and the service is very reliable. They expect to develop analytics. I love hearing from in-house developers making large, strategically important institutional systems which work well.
Daniel Roberts and Tunde Varga-Atkins discussed the minimum standards (‘hygiene factors’) for Liverpool’s VLE, and the development of an evaluation model involving students which could be used with other initiatives. Students are a transient presence who can be hard to reach; different approaches to involving them in evaluation included acting as auditors and taking part in focus groups. Between staff and students at Liverpool there was little mutual recognition of the respective effort which goes into using the VLE.
In one of the stand-out sessions for me, Simon Thomson and Lawrie Phipps summarised Jisc’s #Codesign16 consultation on needs for a next-generation digital learning environment. There was a sense that the tools drive the pedagogy, that they exist to control the academy, and that administration processes were de facto more important than education. Jisc found that students were using laptops and phones almost equally (only 40% used a tablet). Students arrive at university networked, but the VLE currently stands alone without interfacing with those networks. At Leeds Beckett, PULSE (Personalised User Learning and Social Environment) set out to address this by letting individuals connect spaces where they had existing relationships, allowing them to post once and selectively release to multiple places (sketched below). The data within PULSE is entirely owned by students: when they leave, they can take it with them. Unsurprisingly, students expressed no strong desire to integrate personal tools with university platforms – as ever, educators need to design use of PULSE into the curriculum. However, the VLE vendor would not give access to the APIs to allow the kind of integration this would require.
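The ‘post once, selectively release’ idea is easy to picture in code. This is a purely illustrative sketch – the class and channel names are invented, and a real PULSE would need to authenticate against each network’s API (access to which, as noted above, the VLE vendor refused):

```python
# Illustrative 'post once, fan out selectively' model, in the spirit of PULSE.
from dataclasses import dataclass, field

@dataclass
class Post:
    text: str
    audiences: set  # channels the student chose to release this post to

@dataclass
class Channel:
    name: str

    def publish(self, post: Post) -> None:
        # Stand-in for a real API call to the external network.
        print(f"[{self.name}] {post.text}")

@dataclass
class Pulse:
    channels: list = field(default_factory=list)

    def share(self, post: Post) -> None:
        # Release the post only to the channels the student selected,
        # keeping ownership of the underlying data with the student.
        for channel in self.channels:
            if channel.name in post.audiences:
                channel.publish(post)

pulse = Pulse([Channel("module_vle"), Channel("twitter"), Channel("blog")])
pulse.share(Post("Draft essay plan up for comments", {"module_vle", "blog"}))
```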
Helen Beetham and Ellen Lessner introduced video accounts of learning digitally from 12 students, not all of whom loved technology. The institutional technologies do not come out well in Jisc’s ‘Student digital experience tracker 2017’, but we have no idea whether that is to do with the task design, the support for new ways of learning, or the technologies themselves. Find resources at bit.ly/ALTC17digijourneys.
Carina Dolch asked whether students are getting used to learning technology. She described the massification and diversification of Germany’s higher education system, and how students’ media usage was changing over time. A survey of 3,666 students confirmed an increase in time spent online since 2012. However – and this is hard to explain – the frequency of text media use has been decreasing, as has the use of both general tools (search engines, Skype, etc.) and e-learning tools and services (MOOCs, lecture recordings, etc.). Non-traditional students tend to use technologies functionally, tied to their institution, whereas traditional students tended to use technologies more recreationally. Students expressed reluctance to be at the forefront of innovations, and there were more active decisions to be offline.

    Day 2

I loved Sian Bayne’s keynote about anonymity. She used the demise of Yik Yak, the anonymous hyperlocal networking app, to talk about campus networks and privacy. Yik Yak’s high point in the download chart was 2014. In 2016 it withdrew anonymity, which was reflected in a plunge in usage at Edinburgh; Yik Yak restored anonymity shortly before closing in 2017, to no particular regret in the media. It had not been able to use personal data to finance itself. Moral panics about anonymous social media served platform capitalism by demanding that everyone be reachable and accountable. Edinburgh students discussed student life (including mental health), sex and dating, with some academic and political issues. Most students found it a kind and supportive network. Anonymity studies note the ‘psychic numbing’ which allows most social media users to join up their accounts in the interests of living an “effective life”, inuring them to the risks of surveillance capitalism. Some users resist surveillance by cloaking their identity – however, this seems over-reliant on other users not cloaking theirs; otherwise the enterprise, relying as it does on personal data, inevitably folds. I can’t see any other way to escape platform capitalism than to organise sustainable resourcing for open platforms such as Mastodon and Diaspora.
Fotios Mispoulos took a University of Liverpool instructor’s perspective on the effectiveness of learner-to-learner interactions. Most of the research into learner-to-learner interactions happened in the 1990s and found improved satisfaction and outcomes, though there are some counter-findings. As usual, the particulars of the task design, year group and so on were glossed over, so we may be trying to compare apples and bananas.
Vicki Holmes and Adam Bailey talked about introducing Blackboard Collaborate Ultra (which we have at UCL) for web meetings at Reading. I thought their approach was very good: to clarify purposes and promote commitment they asked for formal expressions of interest, then ran workshops with selected colleagues to build confidence and technical readiness (headphones, the right web browser). These refined designs for meetings around placement support, sessions between campuses, assessment support tutorials, and pre-session workshops, among other purposes. Participants from Politics, Finance and Careers observed positive outcomes. Recommendations included avoiding simply lecturing (students disengage quickly), designing interactions carefully rather than expecting them to happen, developing distinct presentation techniques, and preparing students (again around technical readiness and role). 87% of students felt it was appropriate to their learning.
Beth Snowden and Bronwen Swinnerton presented on rethinking lectures in three redesigned tiered theatres at the University of Leeds. Each ‘pod’ has a mic, top-lighting, and a wired-in ThinkPad which can be used to send responses and also to present via the data projector. One lecturer observed that students who had chatted to each other were more likely to chat with him and to ask questions; another doubted he could continue referring to the session as a ‘lecture’. Responses to the evaluation survey put the average proportion of time spent listening to the lecturer at 49%, which was assumed to be less than in the other lecture theatres. Just over half of staff felt that the new lecture theatres created extra work, but more felt they were a positive development. Future evaluation will focus on educational uses.
    [See YouTube University of Leeds “upgrade of teaching spaces”]
Catherine Naamani looked at the impact of space design on collaborative approaches at the University of South Wales. The flexible spaces had colour-coded chairs around triangular tables, each with its own screen which students could present to using an app and which the tutor could access. The more confident groups gained more tutor attention, while the least engaged groups tended to be international students, so more group-to-group activity needed to be designed. Staff tended to identify training needs with the technology, but not developmental needs around the educational approach using that technology.
Another stand-out session: as digital education strategists and academics at their respective institutions, Kyriaki Agnostopoulou, Don Passey, Neil Morris and Amber Thomas looked at the evidence bases and business cases for digital education. Amber noted that academic, administrative and technical staff don’t speak to each other until the top of the organisation. How do digital education workers influence their organisations’ strategies? There are four distinct origins of evidence: technology affordances, uses, outcomes and impact. The former kinds of evidence can be provided through qualitative case studies, the latter through quantitative independent control-group studies. Case studies are abundant, but studies which show evidence of impact over time are far rarer. Amber urged us to learn the language of ITIL and PRINCE2 to “understand them as much as you want them to understand you”. Return on investment, laying out true costs (staff time, supply costs, simultaneous users), use cases (and edge cases), capital and recurrent spend, strategic alignment, gains (educational, efficiency and PR), options appraisals, sustainability and scalability, and risk analyses are ways to be ready for management critique of any idea. Neil Morris (Leeds) took the view that using evidence is the most powerful way of making change; making the academic case first gets the idea talked about.
Online submission continues to outstrip e-marking at the University of Nottingham. Helen Whitehead introduced ‘Escape from paper mountain‘, an educational development escape game through which staff learn how to use an online marking environment [see the ALT Winter Conference]. The scenario: an assessor has completed his marking but then disappeared; the mission is to find his marking and get it to the Exam Board in 60 minutes. The puzzles, solved in groups, are all localised, sometimes even at the subject-specific level. There are plenty of materials at yammer.com/escapehe.
Kamakshi Rajagopal from the Open University of the Netherlands ran a workshop on practical measures for breaking out of online echo chambers, aka filter bubbles – networks of people from similar backgrounds and strata of society, in the context of an egocentric, personally and intentionally created learning network. One group came up with the idea of a ‘Challenge me’ or ‘Forget me’ button that lets you serve yourself different feeds; the sketch below shows the gist.
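To make the ‘Challenge me’ idea concrete, here is a deliberately crude toy sketch: it re-ranks a feed so that items least similar to the reader’s recent reading float to the top. The word-overlap similarity measure is invented for illustration only – a real implementation would use proper content representations and engagement data:

```python
# Toy 'Challenge me' button: surface the least familiar items first.
def word_set(text: str) -> set:
    return set(text.lower().split())

def challenge_me(feed: list, reading_history: list) -> list:
    history_words = set().union(*(word_set(item) for item in reading_history))

    def similarity(item: str) -> float:
        words = word_set(item)
        return len(words & history_words) / max(len(words), 1)

    # Least similar items first: the opposite of a personalised feed.
    return sorted(feed, key=similarity)

history = ["learning analytics dashboards", "analytics and retention"]
feed = ["new analytics dashboard released", "poetry of the industrial north"]
print(challenge_me(feed, history)[0])  # the unfamiliar item surfaces first
```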

    Day 3

    (The amount of notes reflects the amount of sleep).
Peter Goodyear’s keynote was very good. He talked about designing physical spaces for digital learning, which he called ‘multidimensional chess’. He introduced these as apprentice spaces where students learn to participate in valued practices. While STEM subjects require a lot of physical infrastructure, arts, humanities and social sciences require cognitive structures for learning to use knowledge and work with others. Designers reduce complexity by concentrating on what learners will do in the spaces. The activities themselves are not designable, but the guides and scaffolds are. Active learning risks cognitive overload due to the mechanics of the tasks – the instructions, navigating the task. His activity-centred analysis and design framework sets out how to mitigate this. Find the slides at petergoodyear.net.
John Traxler described initial thoughts about an Erasmus+ project to empower refugee learners from the Middle East and North Africa through digital literacy. Few MOOCs are oriented to refugees, and those which are depend on the availability of volunteers. Engaging in a MOOC obviously depends on digital access and capabilities. Other challenges include language, expectations and cultural assumptions. Digital literacy can be interpreted as employability skills, or alternatively given a more liberal, individualistic definition to do with self-expression. The group is very hard to reach, so it is hard to carry out a valid needs assessment. The project is called MOONLITE.
Lubna Alharbi talked about using emotion analysis to investigate the lecturer-student relationship in a fully online setting. Emotions which interfere with learning include isolation and loneliness arising from lack of interaction, so to motivate students it is very important for the tutor to interpret and react to emotions. The International Survey on Emotional Antecedents and Reactions (ISEAR) dataset consists of sentences related to different emotions, and the Synesketch tool was also mentioned. A minimal illustration of the general technique follows.
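This is not the presenter’s pipeline – just a hedged sketch of how ISEAR-style labelled sentences could train a simple emotion classifier. The four example sentences and labels are invented, and it assumes scikit-learn is installed:

```python
# Minimal text-emotion classifier over ISEAR-style (sentence, label) data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny invented stand-in for the ISEAR dataset.
sentences = [
    "I have not heard from my tutor in weeks",
    "My tutor replied quickly and encouraged me",
    "Nobody else posts in the forum and I feel alone",
    "We solved the problem together in the webinar",
]
labels = ["isolation", "joy", "isolation", "joy"]

# TF-IDF features feeding a linear classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(sentences, labels)

print(model.predict(["I feel completely cut off from the class"]))
```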
Another stand-out: Khaled Abuhlfaia asked how the usability of learning technologies affects learners. In usability research, usability is conceived as effectiveness, efficiency, learnability, memorability, error handling and satisfaction. The literature review was very well reported; he found that there is far more evidence about the effectiveness, efficiency and satisfaction dimensions (gathered mostly via questionnaires and interviews), while the other dimensions, though important, have been neglected.
    Academic course leaders choose textbooks in a climate of acute student worries about living costs (not to mention the huge debts they graduate with). Viv Rolfe, David Kernohan and Martin Weller compared open textbook use in the UK and the US. In the US open textbook use has been driven by student debt – and in the UK nearly 50% of students graduating in 2015 had debt worries.
Ian McNicoll talked about the learning technologist role as a ‘fleshy interface’ between educators (who view LTs as techies), techies (who view LTs as quasi-academics), students (who view them as helpdesk staff) and the institution (which views them as strategic enablers).
John Tepper and Alaa Bafail discussed ways to calibrate designs for learning activities in STEM subjects. These are currently tied to outcomes statements, where outcomes are constructivist: teachers create a learning environment supportive of learning activities appropriate to the outcomes. Quality was operationalised as student satisfaction, which I thought might be problematic since it does not itself relate to outcomes. I also wondered about the role of context for each activity, e.g. demographic differences and level, which I missed in the talk. The presenters took a systems approach to evaluating quality, through which designs which elicited high student satisfaction were surfaced. Anyone interested in designing educational activities will probably be interested in Learning Designer, which was mentioned in the talk, is really good, and is still being maintained. It’s increasingly rare for software developers to talk at ALTC, so it was good to hear about this. I found this talk fascinating and baffling in equal measure, but thoroughly intriguing.
Sam Ahern discussed learning analytics as a tool for supporting student wellbeing. One fifth of all adults surveyed by the NHS have a long-term common mental health problem, with variation between demographic groups. The number reporting mental health problems on entry has jumped 220% as student numbers have climbed. Poor mental health manifests as behaviour change around attendance, meeting deadlines, self-care and signs of frustration. Certain online behaviours can predict depressive episodes; the toy sketch below illustrates the kind of behaviour-change signal involved.
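Purely to illustrate what ‘behaviour change’ might look like as a signal – the window and threshold below are invented, and any real wellbeing analytics would need careful validation, consent and ethical oversight:

```python
# Flag a sustained drop in weekly VLE activity against a student's own baseline.
from statistics import mean

def flag_disengagement(weekly_logins: list, window: int = 3, drop: float = 0.5) -> bool:
    """True if the recent average falls below `drop` * the earlier baseline."""
    if len(weekly_logins) <= window:
        return False
    baseline = mean(weekly_logins[:-window])
    recent = mean(weekly_logins[-window:])
    return baseline > 0 and recent < drop * baseline

print(flag_disengagement([10, 12, 11, 9, 4, 3, 2]))  # True: activity has halved
```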

    Jisc student digital tracker 2017 and BLE consortium

    By Moira Wright, on 10 August 2017

UCL participated in the 2017 Jisc Digital Student Tracker Survey as part of a consortium with the Bloomsbury Learning Environment (BLE), made up of SOAS, Birkbeck, LSHTM and RVC. 74 UK institutions ran the tracker with their students, collecting 22,593 student responses, while 10 international universities collected an additional 5,000 student responses.

We were the only consortium to participate. We came together because other institutional surveys, such as the National Student Survey, meant the time available to run the tracker was short (a month), and we felt that our individual sample sizes would be too small if we ran it independently. We treated the survey as a pilot and advertised a link to it on each College’s Moodle landing page, with some promotion via social media and the Students’ Unions. The survey generated 330 responses, which given our constraints was much more than we expected.

The survey covered broad areas including digital access, digital support and digital learning. Most questions were quantitatively recorded, but there were four open questions, which produced qualitative data. We were also able to add two questions to the survey: we selected e-assessment, since that was a previous shared enhancement project (see www.bloomsbury.ac.uk/assessment), and Moodle, since all members of the consortium use the platform for their Virtual Learning Environment (VLE).

Once the survey closed and we had access to the benchmarking report, we ran a workshop for representatives from each of the Colleges in July 2017, at which the responses to the survey’s open questions were analysed in institutional groups; this facilitated interesting discussions about commonalities and potential implications.

Sarah Sherman, the BLE Manager, and I have been working to produce a report which examines our collective responses to the survey in comparison with the national survey population, with a recommendation that individual Colleges independently analyse their own results in more detail. For confidentiality, each College will be presented with a version of this document containing the relevant data for their institution only, not the complete BLE data set. A disadvantage of the consortium approach was that we were not able to benchmark individual Colleges against the survey population, as resources would not allow for this. In future, the participating Colleges may wish to run the survey individually rather than as part of a collective, as it was not possible to conduct deep analysis with this data set.


Although the sample size collected by the Bloomsbury Colleges was small and not statistically robust, there is much we can extract and learn from this exercise. For the most part, our collective responses tended to fall within the margins set by the national survey population, which suggests we are all at a similar phase in our students’ digital capability and development.
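For readers wondering what ‘within the margins’ can mean with only 330 responses, here is a back-of-the-envelope sketch using a normal-approximation confidence interval; the agreement rates below are made up for illustration:

```python
# Compare a small-sample proportion with a national benchmark figure.
import math

def prop_ci(p: float, n: int, z: float = 1.96) -> tuple:
    """95% normal-approximation confidence interval for a proportion."""
    se = math.sqrt(p * (1 - p) / n)
    return (p - z * se, p + z * se)

ble_agree, national_agree = 0.68, 0.71   # hypothetical agreement rates
low, high = prop_ci(ble_agree, 330)
print(f"BLE 95% CI: {low:.2f}-{high:.2f}; national benchmark {national_agree}")
print("within margins" if low <= national_agree <= high else "differs")
```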

You will have to wait for the full report for more information on the UCL data collected, but to whet the appetite you can see the key findings from Jisc in this two-page report: Student digital experience tracker at a glance.

Finally, you can see a collection of case studies which features the Bloomsbury Colleges consortium here.

Please get in touch with me if you would like to get involved (moira.wright@ucl.ac.uk).

    Sarah Sherman and Moira Wright

Jisc/NUS student digital experience benchmarking tool

    Jisc guide to enhancing the digital student experience: a strategic approach

     

    Assessment in Higher Education conference, an account

    By Mira Vogel, on 25 July 2017

Assessment in Higher Education is a biennial conference which this year was held in Manchester on June 28th and 29th. It is attended by a mix of educators, researchers and educational developers, along with a small number of people with a specific digital education remit of one kind or another (hello Tim Hunt). Here is a summary, organised around the speakers, so there are some counter-currents. The abstracts are linked from each paragraph, and for more conversation see the Twitter hashtag.

Jill Barber presented on adaptive comparative judgement – assessment by comparing different algorithmically generated pairs of submissions until saturation is reached. This is found to be easier than judging on a scale, allows peer assessment, and its reliability bears up favourably against expert judgement. I can throw in a link to a fairly recent presentation on ACJ by Richard Kimbell (Goldsmiths), including a useful Q&A part which considers matters of extrapolating grades, finding grade boundaries, and giving feedback. The question of whether it helps students understand the criteria is an interesting one. At UCL we could deploy this for formative, but not credit-bearing, assessment – here’s a platform which I think is still free. Jill helpfully made a demonstration of the platform she used available – username: PharmEd19, password: Pharmacy17.
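For anyone new to ACJ, the ranking step is easy to sketch. Production tools typically fit a Rasch or Bradley-Terry model; the Elo-style update below is a simplified stand-in, with invented essay names, showing how pairwise ‘which is better’ judgements become a quality scale:

```python
# Elo-style stand-in for the ACJ ranking step: judges only ever pick the
# better of two submissions, and scores converge into a quality ordering.
import math

def expected(a: float, b: float) -> float:
    """Probability that the submission scored `a` beats the one scored `b`."""
    return 1 / (1 + math.exp(b - a))

def judge(scores: dict, winner: str, loser: str, k: float = 0.4) -> None:
    # Shift scores by how surprising the judgement was.
    gain = k * (1 - expected(scores[winner], scores[loser]))
    scores[winner] += gain
    scores[loser] -= gain

scores = {"essay_a": 0.0, "essay_b": 0.0, "essay_c": 0.0}
for winner, loser in [("essay_a", "essay_b"), ("essay_a", "essay_c"),
                      ("essay_c", "essay_b"), ("essay_a", "essay_b")]:
    judge(scores, winner, loser)

print(sorted(scores, key=scores.get, reverse=True))  # ranked best to worst
```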

Paul Collins presented on assessing a student-group-authored wiki textbook using Moodle Wiki. His assessment design anticipated many pitfalls of wiki work, such as the tendency to fall back on task specialisation, leading to cooperation rather than collaboration (where members influence each other – and he explained at length why collaboration was desirable in his context), and reluctance to edit others’ work (which leads to additions which are not woven in). His evaluation asked many interesting questions, which you can read more about in this paper to last year’s International Conference on Engaging Pedagogy. He learned that delegating induction entirely to a learning technologist led students to approach her with queries – this meant that the responses took on a learning technology perspective rather than a subject-oriented one. She also encouraged students to keep a word-processed copy, which led them to draft in Word and paste into Moodle Wiki, losing a lot of the drafting process which the wiki history could have revealed. He recommends letting students know whether you are more interested in the product, the process, or both.

Jan McArthur began her keynote presentation (for slides see the AHE site) on assessment for social justice by arguing that SMART (specific, measurable, agreed-on, realistic and time-bound) objectives in assessment overlook precisely the kinds of knowledge which are ‘higher’ – that is, reached through inquiry; dynamic, contested or not easily known. She cautioned about over-confidence in rubrics and other procedures. In particular she criticised Turnitin, calling it the “instrumentalisation/industrialisation of a pedagogic relationship”, which could lead students to change something they were happy with because “Turnitin wasn’t happy with it”, and calling its support for academic writing “a mirage”. I don’t like Turnitin, but felt it was mischaracterised here. I wanted to point out that Turnitin has pivoted away from ‘plagiarism detection’ in recent years, to the extent that it is barely mentioned in the promotional material. The problems arise where it is deployed for policing plagiarism – it doesn’t work well for that. Meanwhile its Feedback Studio is often appreciated by students, especially where assessors give feedback specific to their own work, and comments which link to the assessment criteria. In this respect it has developed in parallel with Moodle Assignment.

Paul Orsmond and Stephen Merry summarised the past 40 years of peer assessment research: an ’80s focus on reliability and validity, a ’90s focus on the nature of the learning, and a more recent focus on the inseparability of identity development and learning – a socio-cultural approach. Here they discussed their interview research, excerpting quotations and interpreting them with reference to peer assessment research. There were so many ideas in the presentation that I am currently awaiting their speaker notes.

David Boud presented his and Philip Dawson’s work on developing students’ evaluative judgement. Their premise is that the world is all about evaluative judgement and that understanding ‘good’ is a precondition for producing ‘good’, so it follows that assessment should be oriented to informing students’ judgements rather than “making unilateral decisions about students”. They perceived two aspects of this approach: calibrating quality through exemplars, and using criteria to give feedback. They urged more use of self-assessment, especially for high-stakes work, and urged starting early, cautioning against waiting until “students know more”.

Teresa McConlogue, Clare Goudy and Helen Matthews presented on UCL’s review of assessment in a research-intensive university. Large, collegiate, multidisciplinary institutions tend to have very diverse data corresponding to diverse practices, so reviewing is a dual challenge of finding out what is going on and designing interventions to bring about improvements. Over-assessment is widespread, and often students have to repeatedly undertake the same form of assessment. The principles of the review included focusing on structural factors and groups, rather than individuals, and aiming for flexible, workload-neutral interventions. The work will generate improved digital platforms, raised awareness of the pedagogy of assessment design and feedback, and equitable management of workloads.

David Boud presented his and others’ interim findings from a survey to investigate effective feedback practices at Deakin and Monash. They discovered that by halfway through a semester nearly 90% of students had not had an assessment activity, and 70% received no staff feedback on their work before submitting – more were getting it from friends or peers. They also discovered scepticism about feedback: 17% of staff responded that they could not judge whether feedback improved students’ performance, while students tended to be less positive about feedback the closer they were to completion – this has implications for how feedback is given to more advanced undergraduate students. 80% of students recognised that feedback was effective when it changed them. They perceived differences between individualised and personalised feedback. When this project makes its recommendations, they will be found on its website.

Sally Jordan, Head of the School of Physical Sciences at the OU, explained that for many in the assessment community ‘learning analytics’ is a dirty word: if you go in for analytics, why would you need separate assessment points? Yet analytics and assessment are likely to paint very different pictures – which is right? She suggested that, having taken a view of assessment as ‘of’, ‘for’ and ‘as’ learning, the assessment community might consider the imminent possibility of ‘learning as assessment’. This is already happening as ‘stealth assessment’ when students learn with adaptive games.

Denise Whitelock gave the final keynote (slides on the AHE site), asking whether assessment technology is a sheep in wolf’s clothing. She surveyed a career working at the Open University on meaningful automated feedback which contributes to a growth mindset in students (rather than consolidating a fixed mindset). The LISC project aimed to give language learners feedback on sentence translation – immediacy is particularly important in language learning to avoid fossilisation of errors. Another project, Open Mentor, aimed to imbue automated feedback with emotional support, using Bales’ interaction process categories to code feedback comments. The SAFeSEA project generated Open Essayist, which aims to interpret the structure and content of draft essays: it identifies key words, phrases and sentences, identifies the summary, conclusion and discussion, and presents these to the author. If Open Essayist has misinterpreted the ideas in the essay, the onus is on the author to make amendments. How it would handle some more avant-garde essay forms I am not sure – and this also recalls Sally Jordan’s question about how to resolve inevitable differences between machine and human judgement. The second part of the talk set out and gave examples of the qualities of feedback which contribute to a growth mindset.
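To give a flavour of the ‘key sentences’ idea – and only a flavour, since the real Open Essayist is considerably richer than this – here is a minimal frequency-based extractive sketch with an invented four-sentence essay:

```python
# Crude extractive key-sentence finder: score each sentence by the average
# corpus frequency of its words, then return the top n.
import re
from collections import Counter

def key_sentences(essay: str, n: int = 2) -> list:
    sentences = re.split(r"(?<=[.!?])\s+", essay.strip())
    freq = Counter(re.findall(r"[a-z']+", essay.lower()))

    def score(sentence: str) -> float:
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in tokens) / max(len(tokens), 1)

    return sorted(sentences, key=score, reverse=True)[:n]

essay = ("Feedback supports learning. Good feedback is specific. "
         "Specific feedback links the work to the criteria. "
         "The weather was nice.")
print(key_sentences(essay))  # the off-topic sentence scores lowest
```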

    I presented Elodie Douarin’s and my work on enacting assessment principles with assessment technologies – a project to compare the feedback capabilities of Moodle Assignment and Turnitin Assignment for engaging students with assessment criteria.

More blogging on the conference from Liz Austen and Richard Nelson, and a related webinar on feedback.