IOE Blog

Expert opinion from IOE, UCL's Faculty of Education and Society

The sweet smell of success: how can we help educators develop a ‘nose’ for evidence they can use in the classroom?

By Blog Editor, IOE Digital, on 8 February 2018

Mutlu Cukurova and Rose Luckin
A good nose for what constitutes solid evidence: it’s something a scientist is lost without. This finely tuned ‘nose’ is not innate; it is the result of years of practice and learning. That habit of constantly questioning and seeking evidence for our decisions and beliefs is one we academics apply as much to our teaching as to our research. However, recent headlines cast doubt on the belief that other practitioners are able to make good use of research. An article on the TES website argues that “Teacher involvement in research has no impact on pupils’ outcomes”. Can this really be true? If so, what can we do to ensure that the billions of pounds spent on educational research are made accessible to, and used by, our educators?
The realisation that this evidence-informed ‘nose’ is not necessarily shared by many of those involved in education, and in particular those involved in the design and use of technology for education, has also become starkly apparent to us through our development of a training course to help entrepreneurs and educators understand research evidence. This enterprise is part of the EDUCATE project at the UCL Knowledge Lab.
One of our aims is for course participants to know how to interpret the findings reported in published research studies. Our first stab at course design was far too complex for people who were highly intelligent but preoccupied with running a business or teaching a class. We needed to extract the essence of the course and, as Einstein is reported to have said, make it as simple as it could be, but no simpler. The second cohort of participants benefited from this much-improved offering.
This experience has confirmed our belief that our education system does not spend long enough helping people to understand what evidence really is: how to make their own judgements about what they believe, how to act as a result of those beliefs, and how to apply the relevant evidence to their everyday practice. We are particularly concerned with the ways in which educators can and cannot access research about teaching and learning and successfully apply it to their everyday practice.
The UK Government’s investment of £125 million to found the Education Endowment Foundation (EEF) in 2011, with the aim of improving evidence-based practice in education, and the US Every Student Succeeds Act of 2015, which encourages the use of educational practices that meet evidence standards from experimental or quasi-experimental evaluation studies, both show heartening interest in research-based practice. However, how readily can practising educators access and use evidence in their teaching?
The EEF recently funded two large research trials involving 13,323 English primary schools to test the effectiveness of some of the commonly used ways of disseminating research evidence (see IOE blog).
Sadly, the results show no evidence that any of the interventions had a positive impact on students’ Key Stage 2 English scores. It comes as no surprise that the passive approach of essentially sending schools information was not successful, as there are at least two decades of research demonstrating that such passive approaches in the public services are not likely to be effective (Nutley, Walter, & Davies, 2007).
It is legitimate to question why valuable funding was spent on testing such an approach, when it was already known to be ineffective, but a more serious concern is raised by the lack of success achieved through the ‘active’ interventions, such as providing support to engage with evidence.
Previous meta-reviews of existing research clearly indicate that the most frequently reported barriers to evidence uptake by practitioners are:

  • Poor access to good quality relevant research, and
  • Lack of timely research output.

The most frequently reported facilitators of research uptake are:

  • Collaboration between researchers and policymakers (and practitioners), and
  • Improved relationships and skills

(See for instance Oliver et al., 2014 or Langer, Tripney, Gough, 2016).
The conclusions of these influential reviews suggest that, for research evidence to have an impact on practice, three main requirements must be satisfied:

  1. opportunities for practitioners to engage with the interventions,
  2. practitioner motivation to do so, and
  3. the skills and capabilities to understand and use the research outputs.

Therefore, any evaluation of an ‘active intervention’ to engage practitioners in using research evidence would be expected to clearly tick some of these boxes in its design. However, the report of the EEF trial offers no evidence that any of the ‘active intervention arms’ used in these trials satisfied these key principles: in other words, they lacked an evidence base.
We find this disappointing, both because of the funding spent on the EEF evaluation and because of the erroneous media headlines it generated. However, we feel heartened that our own 8-hour training course might stand a better chance of success, because it is based on evidence about what works. In addition to the elements that we have already noted, two further key findings from existing evidence about the impact of research evidence on teachers’ practice are that:

  1. The research evidence must be relevant to the practitioners;
  2. The evidence must provide information about the context from which it emanates.

Research evidence itself is of little or no value to practitioners unless key information about the context from which the evidence was generated is also provided. Here, we use the word ‘context’ to refer to factors that are relevant for learning, including the interactions that learners experience with multiple people, artefacts, and environments (see Cukurova, Luckin & Baines, 2017, for more information).
Our short course in the EDUCATE project focusses on what counts as evidence and how evidence is collected, and it does so in the precise context of each course participant: the research with which they engage is directly relevant to their individual needs. We are very keen to evaluate the impact of this course and look forward to presenting evidence about how it has affected the activity of its participants, or not, in a future blog post.

One Response to “The sweet smell of success: how can we help educators develop a ‘nose’ for evidence they can use in the classroom?”

  • brendanhoare wrote on 8 February 2018:

    Your principles are commendable in terms of involving teachers in interventions and ensuring research is relevant to contexts. There is, however, something of the patronising tone that has so alienated teachers in your offer to “train” them. What is needed is a real partnership, and it seems that you may need a little training from teachers if that is to happen.