
What works: researching the use of research evidence

By Blog Editor, IOE Digital, on 21 September 2016

Laurenz Langer, Janice Tripney and David Gough.
The use of research evidence to inform decision-making can make policies and practices more effective and relevant. From US federal regulation on blood alcohol limits, to the design and funding of microfinance programmes in low- and middle-income countries, to the establishment of behavioural science units in public administrations (such as the UK’s ‘Nudge Unit’), research evidence has informed and continues to inform decision-making.
In England, evidence on best practice in health is harnessed by the National Institute for Health and Care Excellence (NICE) and used in every hospital and GP surgery, while in education the Education Endowment Foundation provides an evidence-based toolkit used by teachers and school leaders.
At a time of intense public debate and polarised political environments, it is particularly important to raise the profile of the use of research evidence in public life. This makes next week’s What Works Global Summit (WWGS) in Bloomsbury (26-28 September) so timely. Presenters will share experiences (and research evidence) from around the world.
If evidence-informed decision-making is desirable, it seems justified to design and implement active interventions that aim to bring about the use of research evidence by decision-makers. For this reason, we set out to investigate what we know (and don’t know) about the efficacy of strategies attempting to achieve this aim.

‘So what exactly are the best ways of getting research used by decision-makers? Evidence rarely speaks for itself and you may have witnessed some impressive ways for research to get noticed and used. Maybe a high-level policy seminar, mentoring programme, or a journal club used by nurses. But do they really work? Our pet approaches to knowledge exchange may fail to deliver, and we need to evaluate if they really cause impact.’ (Breckon and Dodson 2016)

So, what works?
Based on existing systematic reviews that synthesise the findings of primary studies evaluating the effects of research use interventions, we identify three key groups of interventions that make a difference.

  • The first type increases decision-makers’ skills in accessing and making sense of evidence. Evidence-informed decision-making (EIDM) capacity-building, critical appraisal training and formal university courses were consistently found to increase the use of evidence by diverse groups of decision-makers, such as nurses, senior policymakers and hospital administrators. However, this finding held true only if these interventions improved both the capability to use evidence (e.g. being able to appraise a research study for its reliability) and the motivation to use it.
  • The second effective group of interventions improved the communication of, and access to, evidence. Evidence repositories and dissemination increased evidence use only if communication and access provided the opportunity as well as the motivation to use evidence. For example, an evidence database piloted for health policymakers in Canada was not effective until it was complemented with an SMS service that sent relevant, tailored and targeted messages to policymakers.
  • The third group aimed to embed strategies for the use of evidence, and the mechanisms of change this requires, in decision-makers’ routine working processes. For example, instead of just building decision-makers’ EIDM skills through training programmes, supervision structures were amended to monitor and reward the application of these skills in daily practice.
  • Lastly, beyond these three groups, a few individual interventions characterised by a highly intense and complex design also increased the use of evidence.

Evidence of no effect: where do we need to think more carefully about our intervention approach?
Passive dissemination of, or access to, evidence alone did not increase research use. The same went for interventions that aimed to build EIDM skills but used methods such as one-off seminars or the provision of training manuals rather than an explicit educational approach. Overall, simply bringing researchers and decision-makers together, without an underlying logic model and facilitation of how this interaction leads to evidence use, was not enough to bring about change.
A key message lies behind our findings. The golden thread running through the review is the centrality of the decision-maker: understanding her context, her decision-making processes, her biases and her motivation.
We will be discussing this research at our session at the WWGS. Please also see our other events at the Summit and our associated free public events here.
This project was led by the Alliance for Useful Evidence, with generous funding and support from the Wellcome Trust and the What Works Centre for Wellbeing. The research was undertaken by Laurenz Langer, Janice Tripney and David Gough of the EPPI-Centre, Social Science Research Unit, UCL Institute of Education, University College London. Our Science of Using Science project produced two reports, offering different levels of detail (accessible here and here). Jonathan Breckon and Jane Dodson of the Alliance for Useful Evidence also produced a discussion paper, Using Evidence – What Works, which contextualises the review findings and extends and translates them for a policy audience, introducing our reports on the Science of Using Science.
 
