Higher education’s X-Factor: everything you always wanted to know about the REF
By Blog Editor, IOE Digital, on 16 December 2014
Chris Husbands
Imagine – if you do not work in a UK university – a cross between the Olympics, the X-Factor and a visit from Father Christmas. That will give you some – some – idea of the REF (the Research Excellence Framework), and its importance in academic life. The results of REF2014 are published this week. Around the country, vice-chancellors, pro-vice-chancellors for research, deans and heads of department will be looking anxiously – not just at their own results, but at those of their competitors. As Gore Vidal famously put it: “It is not enough to succeed. Others must fail.”
Research funding matters enormously to government, and to universities. For government, it is how new knowledge is generated, new science is supported, and innovations that will eventually strengthen national competitiveness are developed. For universities, research is the lifeblood, motivating academics and defining their purpose.
In the UK, the bulk of research funding is offered competitively, through bidding to research councils and charities, but the research infrastructure is funded through a grant – now called ‘QR’ (quality-related) funding. This system was developed in the 1980s; with public spending under pressure, the government needed a mechanism for allocating research funding effectively. Put crudely – how many high-tech science facilities could the country afford, and where should they be located?
Research assessment exercises (RAEs) were run in 1986, 1989, 1992, 1996, 2001 and 2008. At each exercise, ‘units of assessment’ (UoAs – essentially, disciplines such as history, education or clinical medicine) were identified, and universities were required to submit a return for every UoA in which they wished to be assessed. The methodology developed over time, but there have been two constants since 1989: first, for every UoA, universities have had to submit quantitative data on performance (how much funding they attract, how many doctoral completions and so on); and second, academics have had to submit research outputs, such as books and papers in academic journals, produced during the census period, for quality assessment by a panel of senior academics. The results of the RAE did not grade individual academics, but their departments. League tables were produced for individual disciplines and for universities.
Research assessment matters for every institution, department and academic. It generates league tables and has (big) funding consequences: QR funding distributes over £1.5 billion a year. For some disciplines, in fact, the reputational consequences of research assessment are more important than the funding itself.
Research assessment has driven the actions of university leaders for the past 25 years. Departments have been closed on the basis of poor assessment performance; successful departments have seen investment flow. Every institution wants to be able to describe itself as a ‘top-ranked research university’ and, if it cannot do that, to have at least some departments which are ‘research leaders’.
So, before every census date, there is a flurry of academic hiring, and the price of research stars increases. The rewards – in the ability to bid for big research grants and to attract international students and scholars – are huge. And this means that research assessment is gamed: there are research centres of genuine excellence, and then there are places which present themselves as research centres of excellence by being very selective about the research and information they enter.
Research assessment has attracted considerable criticism. Because research is more straightforward to assess than teaching, the assessment exercises have skewed universities’ priorities strongly towards research. Its defenders argue that it has motivated staff to be more productive in research; its critics argue that it has encouraged senior managers to deploy crude target-based performance management, and has not necessarily encouraged worthwhile research. It has been pointed out that the entire structure of research assessment, with its committees and judgments, turns out to produce much the same overall outcome as looking at a spreadsheet of successful research grant applications, which could be compiled in an afternoon. Moreover, research funding in the UK is extremely concentrated: there are about 120 universities, but over 90% of the funding goes to just 48 of them, 65% to just 20, and something like a third to just five. Most universities get very little from research assessment.
The 2008 RAE was widely expected to be the last, the job of selectivity and concentration largely done. The Treasury believed that through clever use of routine performance measures the exercise could be done more efficiently and cheaply. But the universities rebelled: it might be difficult, painful, stressful; it might skew institutional priorities – but research assessment had become well established.
So a classic British compromise was reached. The exercise was renamed the Research Excellence Framework (REF) and slightly redesigned, with a new element – ‘research impact’ on practice, policy or public opinion, assessed through case studies. Throughout 2014, REF panels have read the 191,232 individual outputs submitted by universities on behalf of 52,077 staff, giving each one a grade from 1* to 4*. These grades count for 65% of each department’s REF profile, with 20% awarded for impact and a further 15% for research environment.
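For readers curious about how those weights play out, here is a minimal sketch of how an overall quality profile might be combined from the three elements using the 65/20/15 weighting described above. The sub-profiles in the example are invented numbers purely for illustration – not real REF2014 results – and the calculation is a simplified reading of how a weighted profile would be reported.

```python
# Illustrative only: combining three sub-profiles into an overall REF-style
# quality profile using the 65% / 20% / 15% weights mentioned in the post.
# The sub-profile percentages below are hypothetical, not real REF2014 data.

WEIGHTS = {"outputs": 0.65, "impact": 0.20, "environment": 0.15}

# Hypothetical sub-profiles: share of activity judged 4*, 3*, 2*, 1*, unclassified.
sub_profiles = {
    "outputs":     [30, 45, 20, 5, 0],
    "impact":      [40, 40, 20, 0, 0],
    "environment": [50, 35, 15, 0, 0],
}

star_levels = ["4*", "3*", "2*", "1*", "unclassified"]

# Weighted average of the three sub-profiles at each star level.
overall = [
    sum(WEIGHTS[element] * profile[i] for element, profile in sub_profiles.items())
    for i in range(len(star_levels))
]

for level, share in zip(star_levels, overall):
    print(f"{level}: {share:.1f}%")
```

On these invented figures the overall profile comes out at 35.0% 4*, 42.5% 3*, 19.25% 2* and 3.25% 1* – which is the kind of headline profile that then feeds league tables and QR funding allocations.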
It is now almost impossible to imagine UK university life without research assessment, and in all universities – including this one – planning for REF2020 is well underway; indeed, given normal timetables for academic publication and grant-getting, we are effectively halfway through the next census period before the REF2014 results are even published. It has also proved a good export, with research assessment exercises spreading to other countries, including Norway, Hong Kong, Australia and New Zealand.
If you know any academics, this week – and the Christmas break which follows – will be their most important, and most nerve-racking, in six years. Be kind to them.
3 Responses to “Higher education’s X-Factor: everything you always wanted to know about the REF”
Graham Holley wrote on 17 December 2014:
I am a survivor of the first RAEs, as I was Secretary of the Arts and Social Studies Sub-Committees of the University Grants Committee at the time. The methodology has been refined over time, but it seems to me it has not altered all that much. I wonder how many other Government exercises have lasted as long?