
Institute of Education Blog


Expert opinion from academics at the UCL Institute of Education


Does class size matter? We’ll get a better answer if we rethink the debate

Blog Editor, IOE Digital, 13 November 2020

Peter Blatchford and Anthony Russell.

For many teachers, large classes present problems which adversely affect their practice and their pupils’ learning. This is what our surveys show. But researchers and commentators often have a different view. For them the class size debate can be summed up with the question: does class size affect pupil attainment?

As we show in our new open access book, ‘Rethinking Class Size: The Complex Story of Impact on Teaching and Learning’, published by UCL Press this week, researchers (contrary to the practitioner view) commonly find that the statistical association between class size and attainment is not marked, and so conclude that class size does not matter much. This has even led some to suggest that we could raise class sizes and instead invest the savings in professional development for teachers. Currently, in the wake of the Covid pandemic and teacher absences, there are reports of some schools being forced to create supersized classes of 60 pupils.

The view that class size is not important is probably the predominant view among researchers and policy makers, and so they may be relatively relaxed about increases in class size. We therefore need – more than ever – good quality evidence on class size effects, but in our view much research is limited and leads to misleading conclusions.

We identify three problems.

‘Tell me a story from your head!’ – working with parents to design an app to help spark ideas

Blog Editor, IOE Digital, 1 May 2020

Sara Kalantari.  

Parents and children have been telling each other stories from time immemorial. In the extraordinary days we are living through, storytelling could take on increased importance, as families spend more time at home together seeking ways to make sense of this strange new world.

Studies have found that storytelling is embedded in family life from toddlerhood into adulthood. It strengthens the way children and parents jointly understand their culture and identities while introducing children to concrete and abstract concepts, relationships and information. In school it supports children’s verbal and listening skills, active participation and cooperation, imagination and creativity.

In the DROPS project our UCL team is designing a tablet-based app supporting oral (more…)

Why education research needs working papers

Blog Editor, IOE Digital, 29 May 2018

Alice Sullivan. 
British education journals often object to the early publication of research findings in the form of working papers (also known as preprints). But would greater use of working papers be beneficial for the health of education research in the UK?
Working papers allow authors to get early feedback on their work from their peers. They also allow us to share our findings with both academic and wider audiences quickly. Education researchers are expected to achieve ‘impact’ – or, at the very least, to communicate our findings to policymakers, practitioners and parents. These audiences need timely access to research findings. Research is publicly funded, and it is therefore reasonable to expect it to be publicly available. Yet years can elapse between the first submission of a paper and its final publication, even without allowing for rejections along the way. The growth in submissions to journals, combined with increased unwillingness on the part of overstretched academics to carry out peer reviews, has seen a crisis in both the quality of the peer-review system and its speed.
Working papers enable researchers to (more…)

Queen's Anniversary Prize: a time for reflection

Blog Editor, IOE Digital, 20 November 2015

Chris Husbands
I’ve already written about my own departure from the IOE – leaving, in just a few weeks’ time, to become Vice Chancellor at Sheffield Hallam University. As we all know, leaving one job and starting another is a time of mixed emotions: the combination of apprehension and excitement, the sense of the unfinished business which will remain forever unfinished, the opportunity, albeit briefly, to take stock. It’s in this context that I reflect on the award of a Queen’s Anniversary prize for Higher Education to the Institute of Education.

Free school effects: an impartial review

Blog Editor, IOE Digital, 7 July 2015

Francis Green
Do free schools raise the performance of nearby schools, as Policy Exchange have claimed, or is this a statistical mirage?
There is a plausible argument that the opening of a free school in an area might spur other schools to improve in the long run: it is a basic tenet of market competition that producers – in this case schools – will respond to such pressure. But many other factors lie behind good management of schools – including working with the community – so it is far from clear how much effect the opening of a new local school would have in practice, and whether the fact that it was a free school would make a difference.

'Knowledge exchange' between researchers and practitioners must be a two-way street

Blog Editor, IOE Digital, 27 May 2014

 Louise Stoll and Chris Brown
Both of us are fascinated by how research finds its way into policy and practice. Most researchers hope their findings will be used, but engaging people isn’t always easy or straightforward.
It’s good to see an increase in initiatives focusing on this challenge – for example, the Education Endowment Foundation’s recent call for bids in relation to encouraging the uptake of research in schools. Many terms are used to describe the process – dissemination, knowledge transfer, knowledge mobilisation, research utilisation, to name a few. Whatever their intention, the message they can convey to practitioners is that researchers have the knowledge that practitioners need to receive.
Our attention has been caught, though, by the term ‘knowledge exchange’. This suggests a two-way flow in a more equal relationship, which makes a lot of sense. Everyone has their own knowledge and experience to share and research can enrich this, as well as pushing researchers to think again about what their findings mean in different contexts.
An R&D project, funded through the Economic and Social Research Council’s (ESRC’s) Knowledge Exchange Opportunities Scheme has been giving us the opportunity to explore researcher/practitioner relationships in more depth. Over the last six months, along with our colleagues Karen Spence-Thomas and Carol Taylor, we have been working with Challenge Partners, a group of more than 230 state-funded schools across England that work collaboratively to enhance the quality of their teaching and leadership, with an ultimate aim of improving outcomes for children and young people. Challenge Partners (CP) aim to provide a vehicle for their schools to learn from the best of their peers.
Our project has been adding research into this mix. It’s exploring and learning about establishing an expanding and sustainable network of middle leaders (such as department heads, subject leaders and key stage leaders) across CP schools that can: exchange evidence-informed knowledge about effective middle leadership that changes teacher practice; track its impact; and find powerful ways to exchange the outcomes of their applied project work more widely within and beyond the partnership to benefit a broader range of educators. For a summary of our project questions and the project, see: ESRC Middle Leaders Project.
In workshops, we share both research findings and effective practice. These are then blended together to create new knowledge that middle leaders use to design and refine processes and tools to help them lead more effectively and track their impact. In between sessions, the middle leaders test new ideas and trial tools with colleagues and teams both in their own and in other schools. They do this both via face-to-face engagement and through social networking. With us they will also be developing processes to embed the notion of sharing high quality research-informed practice between schools in their own networks and for practitioners in other networks. We have a parallel evaluation strand where our researchers and researchers from two Challenge Partners (CP) schools and the CP office are collecting baseline and follow up information, and following project activities.
Partnership is absolutely critical. We co-designed the project with CP, are now involving middle leaders in planning and facilitating sessions, and are co-evaluating the project. Through this, we are trying to model knowledge exchange and collaboration by drawing on the expertise and practices of researchers, knowledge exchange professionals (a term used by the ESRC to describe people who help translate research findings) and practitioners. We hope this will increase the project’s potential to benefit the middle leaders and their colleagues and pupils.
Ours is a two-way relationship: we are learning from our partners as well as them from us, and we have combined our research knowledge, Challenge Partners’ prior experience and published knowledge, and the middle leaders’ knowledge. At times this challenges our thinking – we are tracking this as well – but we know that powerful professional learning does just that.
We will be back with an update in a few months.
 

Understanding impact: what does it actually mean?

Blog Editor, IOE Digital, 9 May 2014

Chris Husbands 
Research changes lives. Somewhere along the way, every research project involves a question which is about making a difference to the way people think, behave or interact. In applied social science – the field in which the IOE excels – research sets out to understand, shape and change the world around us.
The idea of research ‘impact’ has become important in research management and funding. Universities are now held to account for the impact of their research, not least through the Research Excellence Framework. But ideas about ‘impact’ and how to secure it vary.
In popular accounts of science, research is sometimes described as ‘earth shattering’, as if it creates something like a meteorite crater reshaping the landscape for ever. An example might be Frederick Sanger’s development of rapid DNA sequencing in the 1970s, which has transformed practices across any number of fields.
But there are other images to describe impact. Not all research has what the LSE research team looking at impact call an ‘auditable’ consequence. They comment that research applied in practice is “always routinized and simplified in use” so that over time, the impact fades like ripples in a pond.
The University of the Arts talks of its research as having creative outcomes that enhance cultural life and provides examples of artworks which create possibilities and shift perceptions: ideas which float into the air like dandelion seeds.
The impact of some research is apparent quickly – though almost never as rapidly as the tabloid newspapers which almost weekly trumpet miracle breakthroughs would have us believe – whereas in other cases it can take decades before the value of research becomes apparent.
Not only does the IOE itself undertake research which seeks to have an impact, it’s also interested in understanding what impact looks like, what it means and how it happens. At a recent conference we explored the linked questions of research impact and public engagement: the relationships between research, policy, practice and improvement are things some of my colleagues try to understand.
The ESRC defines research impact as “the demonstrable contribution that excellent research makes to society and the economy”. This suggests three components: the demonstrable nature of the contribution, the quality of the research, and the focus on both society and the economy. Successful impact means holding all three in creative relationship: without any of them, the other two are diminished. Research which is not excellent will not make a demonstrable contribution; research which sits unpublished and unread will not, whatever its methodological sophistication, make a demonstrable contribution; and so on.
Understandings of impact – or of knowledge mobilisation and its dynamics – have been transformed over the last fifteen years, as the barriers to, and strategies for, making creative use of research to impact more directly on people’s lives have become clearer, and ways of engaging the public in the dynamics of research have developed. No research – however excellent – ever simply has an ‘impact’.
Richard Doll discovered that smoking caused lung cancer in the 1950s, but it took several years and active public health campaigns to change behaviour. In education, the gap between, say, research on assessment for learning (AfL) and AfL practice suggests that – like the idea of the droplet making ripples on a pond – the impact of research can quickly dissipate unless something active is done.
Research always needs mediating – or, to put it differently, research impact needs a plan. Academics used to talk about ‘dissemination’, but thinking has moved far beyond such models – “I research, you listen” – to more creative and nuanced understanding of the ways knowledge moves – and does not move – around organisations and society. We have learnt that while these relationships are complex, they can be managed effectively.
In the early days of work on research impact, thinking focused on ‘what works’, on the assumption that research could tell us what techniques have maximum effectiveness, and that this could in some way be taken to scale by more or less sophisticated dissemination techniques. We have become – individually, institutionally, collectively – more sophisticated than that, and we have done so quickly. We know that ‘how it works’ and ‘why it works’ are just as important, and that the effort to link the worlds of research, policy and practice involves commitment and engagement from all parties. In Ontario, the Knowledge Network for Applied Education Research tries to draw key groups together to enhance knowledge mobilisation. Bringing about change in practices is never easy, as anyone who has ever tried to lose weight, get fitter or learn a new language knows.
There’s a nice social enterprise quotation: “success does not matter. Impact does”. The IOE is a socially engaged university. We care about the quality of the research we undertake, and we make sure that it is of the highest quality. But we care equally about the way our research shapes the society it is seeking to understand. We understand that research evidence will always be only one of the factors that influences society, and that other factors always intervene. But we also know that progress has been made in the past in this field and more can be made in future with persistent effort.
For us, ‘impact’ is not an artefact of the 2014 REF, nor an obligatory hoop through which to jump. There is a wonderful line from the 2008 strategic plan for Carnegie Mellon University – and very, very few university strategic plans contain quotable lines. But in 2008 Carnegie Mellon got it right: “we measure excellence by the impact we have on making the world better”.
 

AERA reminds us that education research is part of a genuinely global discourse

Blog Editor, IOE Digital, 8 April 2014

Chris Husbands
The annual conference of the American Education Research Association cannot really be described: it has to be experienced. Every year, it attracts almost 20,000 education researchers, not just from North America but from the entire English-speaking world, and, in the last decade, increasingly from East Asia. So any individual experience of the conference must still be partial.
For five days, AERA takes over the downtown of a large American city, so the sheer logistics of running the annual conference must be mind-boggling. The conference programme is the size of a telephone directory and about as readable: even the app which has been available for the last few years takes some navigation. You have to really know what you are looking for to master the search function, but if you only want to browse it’s difficult – although the AERA2014 app does contain abstracts for the thousands of papers.
In essence, AERA is not one conference but several. AERA is organised into 12 divisions, from administration, organisation and leadership (Division A) through to Education Policy and Politics (Division L), taking in Measurement and Research methodology (Division D) and Learning and Instruction (Division C) with much else besides. Each division runs several parallel sessions at any one time. Then there is the conference of the highlights: the large, set piece lectures and panels led by genuine global stars such as Diane Ravitch (this year on the challenges of quality and equality), Andreas Schleicher (this year on why we should care about international comparisons), Charles Payne (in 2014 on the fiftieth anniversary of the Civil Rights Act) and Linda Darling-Hammond (on issues in the validity of high stakes assessments): their sessions fill the ballrooms of large hotels, standing room only.
Then there is the conference of the post-doctoral researchers, for whom AERA is a grand hiring fair – a good 20-minute performance reporting on your doctorate to a room of perhaps nine people can be instrumental in landing a prestigious position. And of course there is the conference of the corridors: knots of people meeting up to compare experiences of research funding and research policy, to complain about their miserable lot, to plot and to scheme and to gossip, to broker deals and agreements – people who have not seen each other since San Francisco last year and won’t meet again until Chicago next year.
And the range is huge: to deploy some (all too frequently observed) stereotypes, sessions on structural equation modelling led by earnest young think tank econometricians in sharp blazers, sessions on the endless reverberations of race in American education full of lively, disputatious people of colour, sessions on urban school reform led by harassed school superintendents looking for better teacher or school evaluation strategies.
This year’s conference (April 3-7) was in Philadelphia – the conference is always in one of those vast American cities where a wrong turn at one block will take you into parts of town where you’ll come across urban Americans uninterested in the finer points of methodology – and its over-arching theme was “the power of education research for innovation in practice and policy”. Barbara Schneider (Michigan State University), this year’s president, chose to speak about the “college mismatch problem”: why American teenagers from poor backgrounds apply to universities of lower status than their grades could get them into; Ruby Takanishi from the New America Foundation and Rachel Gordon from the University of Illinois looked at what we are learning from universal preschool education.
There are major methodological innovations: the impact of learning analytics on the knowledge base for lifelong learning, what the evidence is saying about recent immigration and its consequences for education. But all this makes it sound too ordered. Opening the telephone directory programme randomly I find “an Australian perspective on inequality and education”, “blacks, hip-hop and the sociocultural milieu”, “dental school deans’ perception of dental education costs”, “does teacher and student race congruence help or hinder student engagement in ninth grade science”, “the common core standards and teacher quality reform”, and “comparing three estimation approaches for the Rasch Testlet model”: and on and on through literally thousands of sessions.
It’s almost impossible to discern trends, though economists seem to be growing in number and influence; ‘big data’ and its promises and pitfalls pre-occupy more people; and even in America – that most inward looking of melting pots – questions of international comparison and globalisation are more than ever in evidence. Being at AERA is a reminder of the similarities and differences between American and English experience in education.
There are some common themes: the relationships between quality and equity, between social structure, education experiences and performance, between the dynamics of research and the dynamics of policy. Others look similar but are really different: academies, for example, are not, in the last analysis, quite the same as charter schools. Others are genuinely different: the American experience of urban school reform is not the English experience; America’s experience in curriculum reform and teacher education has been quite different from England’s.
AERA is always simultaneously disorienting – you inevitably feel you are in the wrong place, that there is a more interesting and important session just around the corner – and energising – thousands of exceptionally able and engaged people enthused about education, and above all reminding us that education research is part of a genuinely global discourse.

Time to re-think the unthinkable: how can we get our research messages discussed by politicians?

Blog Editor, IOE Digital, 23 September 2013

Chris Brown

The party conference season is a useful barometer for those who champion the more widespread use of evidence within policy making. Among the announcements and denouncements, we start to get an understanding of the gamut of policy positions being developed by the main political parties and, importantly, by those who advise them. These are, to use the ancient Greek idea, the nascent policy “agoras”, or gathering places for policy.
They matter because they illustrate that whoever wins the election will have already devised their manifesto for government. This positioning of perspectives will also frame the nature of the evidence policy-makers will or won’t engage with once in office. Clearly the scope of any policy agora (the breadth of the arguments it contains) depends on the extent to which ministers wish to let their civil servants investigate potential solutions for particular policy problems. But if the trend set by the current education secretary continues, then the positioning both of evidence and of those who offer advice worth listening to is something that will need to happen long before the electioneering for 2015 has even commenced.

The year ahead, as a result, represents the period when we can work with potential future governments to re-think the unthinkable: to champion new ideas at the expense of the current ones and to reposition the country’s journey over the course of the next electoral cycle. This of course takes time and effort, but it also requires an understanding of the appropriate strategies to employ.
Historically, academics, in addition to their day-to-day business of writing journal articles, have been encouraged to ensure that their research outputs are both digestible and applicable: that what they write can not only be easily understood, but that it is also immediately ‘policy ready’. Often efforts to do so result in frustration. This is because, while useful, these two qualities alone are unlikely to lead to a greater uptake of research by policy-makers: ideas may still sit outside of the policy agora, or policy-makers may simply fail to see any need to act on what is presented. Importantly, then, what is also required is substantial ground-work to enhance the “social robustness” of any idea – to promote its importance and the need to act as a result.
Efforts to enhance social robustness can be directed via the general media, social media or through cultivating links with special advisors and others who matter, but the ultimate endgame of this action is to advance ideas towards what Malcolm Gladwell describes as the “tipping point”: ensuring issues enter and dominate the mainstream and so must be addressed.
As well as relating to general ideas, however, we can also direct similar efforts towards promoting ourselves as experts, whose advice should be sought. Again, the result is the same, with those considered to be worth listening to finding it easier to catch the ears of policy makers than those who are not (as can be evidenced by those particularly skilled in this approach – Ben Goldacre for instance, provides a prime example of what can be achieved here). So let’s watch this week’s Labour Party conference with interest and see if we can assess not only which of our research might be in favour, but also whether there is scope for enhancing the social robustness of the messages that are not – and make sure they are ready in time for next year.
Dr. Chris Brown’s new book, Making Evidence Matter, published by IOE Press, is out now.

Evidence-based practice: why number-crunching tells only part of the story

Blog Editor, IOE Digital, 14 March 2013

Rebecca Allen
As a quantitative researcher in education I am delighted that Ben Goldacre – whose report  Building Evidence into Education was published today – has lent his very public voice to the call for greater use of randomised controlled trials (RCTs) to inform educational policy-making and teaching practice.
I admit that I am a direct beneficiary of this groundswell of support. I am part of an international team running a large RCT to study motivation and engagement in 16-year-old students, funded by the Education Endowment Foundation. And we are at the design stage for a new RCT testing a programme to improve secondary school departmental practice.
The research design in each of these studies will give us a high degree of confidence in the policy recommendations we are able to make.
Government funding for RCTs is very welcome, but with all this support why is there a need for Goldacre to say anything at all about educational research? One hope is that teachers hear and respond to his call for a culture shift, recognise that “we don’t know what’s best here” and embrace the idea of taking part in this research (and indeed suggest teaching programmes themselves).

It is very time-consuming and expensive to get schools to take part in RCTs (because most say no). Drop-out of schools during trials can be high, especially where the school has been randomised into an intervention they would rather not have, and it is difficult to get the data we need to measure the impact of the intervention on time.
However, RCTs cannot sit in a research vacuum.
Ben Goldacre does recognise that different methods are useful for answering different questions, so a conversation needs to be started about where the balance of research funding for different types of educational research best lies.
It is important that RCTs sit alongside a large and active body of qualitative and quantitative educational research. One reason is that those designing RCTs have to design a “treatment” – this is the policy or programme that is being tested to see if it works. This design has to come from somewhere, since without infinite time and money we cannot simply draw up a list of all possible interventions and then test them one by one. To produce our best guess of what works we may use surveys, interviews and observational visits that took place as part of a qualitative evaluation of a similar policy in the past. We also use descriptions collected by ethnographers (researchers who are “people watchers”). And of course we draw on existing quantitative data, such as exam results.
All of this qualitative and quantitative research is expensive to carry out, but without it we would have a poorly designed treatment with little likelihood of any impact on teacher practice. We may find that, without the experience of other research, we might carry out the programme we are testing poorly for reasons we failed to anticipate.
The social science model of research is not ‘what works?’ but rather ‘what works for whom and under what conditions?’
Education and medicine do indeed have some similarities, but the social context in which a child learns shapes outcomes far more than it does the response of a body to a new drug. RCTs may tell us something about what works for the schools involved in the experiment, but less about what might work in other social contexts with different types of teachers and children. Researchers call this the problem of external validity. Our current RCT will tell us something about appropriate pupil motivation and engagement interventions for 16-year-old teenagers in relatively deprived schools, but little that is useful for understanding 10-year-old children or indeed 16-year-olds in grammar schools or in Italian schools.
The challenge of external validity should not be underestimated in educational settings. RCTs cannot give us THE answer; they give us AN answer. And that answer’s validity declines as we try to implement the policy in different settings and over different time frames. This actually poses something of a challenge to the current model of recruiting schools to RCTs, where many have used “convenience” samples, such as a group of schools in an academy chain who are committed to carrying out educational research. This may provide valuable information to the chain about best practice for its own schools, but cannot tell us how the same intervention would work across the whole country.
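The logic of randomisation, and the limits of what a single trial can tell us, can be made concrete with a toy simulation (a hypothetical sketch, not any of the trials mentioned in this post): because pupils are assigned to arms at random, the difference in mean outcomes estimates the effect of the treatment, but only for the population actually sampled.

```python
import random
import statistics

random.seed(42)

def simulate_rct(n_pupils, true_effect):
    """Toy two-arm trial: each pupil is randomly assigned to treatment
    or control, and the effect is estimated as the difference in mean
    outcome scores between the two arms."""
    treated, control = [], []
    for _ in range(n_pupils):
        score = random.gauss(50, 10)   # baseline attainment score
        if random.random() < 0.5:      # the randomisation step
            treated.append(score + true_effect)
        else:
            control.append(score)
    return statistics.mean(treated) - statistics.mean(control)

# A trial in one context can detect a real effect (3 points here)...
print(round(simulate_rct(2000, true_effect=3.0), 1))
# ...but it says nothing about a different context, where the same
# programme may do nothing at all (the external-validity problem).
print(round(simulate_rct(2000, true_effect=0.0), 1))
```

Randomisation removes confounding within the sampled population, which is why the simple difference in means works here; but note that the second trial has to be run afresh in its own context – the first trial's estimate cannot be transferred.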
Social contexts change faster than evolution changes our bodies. Whilst I would guess that taking a paracetamol will still relieve a headache in 50 years’ time, I suspect that the best intervention to improve pupil motivation and engagement will look very different to those we are testing in an RCT today. This means that our knowledge base of “what works” in education will always decay and we will have to constantly find new research money to watch how policies evolve as contexts change and to re-test old programmes in new social settings.