Centre for Education Policy and Equalising Opportunities (CEPEO)

We create research to improve the education system and equalise opportunities for all.


The 2021 Autumn Budget and Spending Review: what does it mean for educational inequalities?

Blog Editor, 28 October 2021

Claire Crawford

The pandemic has disrupted life for everyone, but children and young people have seen perhaps the biggest changes to their day-to-day lives, with long periods spent away from school and their friends leading to significant rises in mental health difficulties and a substantial reduction in learning. Moreover, these challenges have not been felt equally: the evidence suggests that the pandemic has also led to a rise in inequalities between children from different socio-economic backgrounds, from the early years through to secondary school and beyond.

A budget and multi-year spending review delivered against a backdrop of the highest peace-time borrowing levels ever, and by a chancellor on a ‘moral’ mission to limit the size of the state, was unlikely to deliver the sort of investments in education that Sir Kevan Collins hoped to see when he took the role of ‘catch-up tsar’ earlier this year. But what did it deliver for education? And is it likely to help roll back the rises in educational inequalities that the pandemic has generated?

Early years

While it is positive to see some recognition that early education providers need a higher funding rate to cover delivery of the early education entitlements for 2, 3 and 4 year olds, the amount earmarked – £170m in 2024-25 – does not represent the substantial investment that many in the sector have been calling for. It is certainly nowhere near the £2.60 per hour increase estimated to be needed to fully fund the entitlement – that is, to enable providers to deliver these hours without incurring a loss, charging for ‘extras’ (such as food or nappies), or increasing fees for other children in order to cover costs.

We await the details of exactly what this means for the official funding rate per hour. Still, for some idea of scale, spending on all early education entitlements – the universal 15 hour entitlement for 3 and 4 year olds, the additional 15 hours for 3 and 4 year olds via the extended entitlement, and the 15 hour entitlement for disadvantaged 2 year olds – was around £3.8bn in 2019-20. £170m represents less than a 5% increase on this figure. Putting it another way, in 2019-20 a total of around 1.75 million children were benefitting from the free early education entitlements. If the number of children taking up these places were to remain unchanged between 2019-20 and 2024-25, early education providers would receive only around £100 more per child per year than they do now. In reality, the population of 2, 3 and 4 year olds is expected to fall over the next few years, which – coupled with the reduction in take-up of the early education entitlements seen over the course of the pandemic – may mean that the actual increase in funding rates is higher than 5%. But not much higher.
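The back-of-envelope arithmetic above can be reproduced in a short sketch. All inputs are the article's own figures; the per-child result is approximate, since it assumes take-up stays exactly at 2019-20 levels.

```python
# Reproduce the article's rough early-years funding arithmetic.
new_funding = 170e6      # £170m earmarked for 2024-25
current_spend = 3.8e9    # ~£3.8bn spent on the entitlements in 2019-20
children = 1.75e6        # ~1.75 million children taking up places in 2019-20

pct_increase = new_funding / current_spend * 100   # increase on current spending
extra_per_child = new_funding / children           # extra funding per child per year

print(f"Increase on 2019-20 spending: {pct_increase:.1f}%")         # ~4.5%
print(f"Extra funding per child per year: about £{extra_per_child:.0f}")  # ~£97
```

This matches the article's "less than 5%" and "around £100 per child" figures.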

Likewise, while greater investment in family support services is also welcome, the much-trumpeted £500m increase represents less than half of the cut to spending on Sure Start Children’s Centres over the last decade, which has fallen in real terms by over £1bn (around two thirds) from a peak of around £1.8bn in 2009-10. A start, perhaps, but not the transformative ‘Start for Life’ that the rhetoric surrounding this announcement would suggest.

Schools

Yesterday’s announcements on schools were dominated by the news that school funding would return to real-terms levels last seen in 2010. Not much to write home about, you might think. But there was also only a small amount of additional money for education catch-up, including an increase in the ‘recovery premium’ – catch-up money targeted towards pupils from lower income families – for secondary school pupils. While it is positive to see funds being targeted towards the pupils most in need of support, our work has shown that remote learning experiences while schools were closed to most pupils varied substantially by socio-economic background, and it is unclear whether the roughly £5bn allocated to catch-up will be enough to redress the balance. It certainly amounts to a lot less per pupil than is being spent in other countries.

Further and higher education

Despite rumours circulating in the media, the decision on the funding of higher education was kicked into the long grass yet again, with the words ‘higher education’ mentioned only three times in the Budget and Spending Review document, and more information promised “in the coming weeks”.

Meanwhile, the eye-catching nominal and real-terms increases announced for further education (FE) and skills look decidedly less generous once account is taken of the fact that we are about to experience a massive increase in the population of 16-19 year olds. The document itself acknowledges that while there will be a 28% real-terms increase in 16-19 funding in 2024-25 compared to 2019-20, this will only maintain – rather than increase – funding per student in real terms. Despite a much greater emphasis in policy discourse about the importance of further education and adult learning than we have seen in recent years, this settlement does not suggest a transformation of the fortunes of the FE sector, which caters to the majority of each academic cohort and in which young people from lower socio-economic backgrounds are over-represented.

Implications for inequalities

Perhaps contrary to expectations, yesterday’s spending review contained increases in spending for most government departments, paid for by the highest tax rises in nearly 30 years. Yet despite the significant challenges posed by the pandemic for children and young people, the Department for Education’s budget will be only a little higher in 2024-25 than it was in 2009-10, while the Department of Health and Social Care budget will have increased by over 40%.

The thinking seems to be that children will catch up over time anyway. But the evidence suggests that inequalities in educational attainment only increase as children get older: higher socio-economic status parents can provide more opportunities for learning – through better schools, tutoring, or more academic and non-academic enrichment activities – than lower socio-economic status parents, and these investments accumulate over time, widening the gap between those from different backgrounds. The same will be true of parents’ ability to support their children to catch up on what they lost during the pandemic.

Without significant government investment to support children from more disadvantaged backgrounds, the wider inequalities that have opened up over the course of the pandemic are likely to foreshadow even greater inequalities in future. Yesterday’s spending review offered some support – but nowhere near enough.

Learning About Culture: The importance of arts-based learning, the limits of what we know about it, and the challenges of evaluating it

Blog Editor, 8 September 2021

Jake Anders, Kim Bohling, Nikki Shure and Alex Sutherland

There is little doubt about the importance of arts and culture to the education and upbringing of young people. Arts-based education gives young people an important means of creative expression and “arts for arts’ sake” is the best argument for having arts-based education in schools. However, far less is known about the link specifically between arts-based learning activities and pupils’ educational outcomes – partially due to a lack of robust studies on this topic. Yet this is a link that is often invoked as part of the overall importance of these programmes, partly in response to a perception that an increased focus on “core educational outcomes” is squeezing arts-based education out of schooling.

Over the past four years, a team from UCL and the Behavioural Insights Team has been working with the Education Endowment Foundation (EEF), the Royal Society for the Arts (RSA) and five arts-based education organisations on a project called Learning About Culture (see Table 1 below for programme detail). At the heart of this project are five randomised controlled trials (RCTs) involving around 8,500 children in 400 state schools across England. These evaluations were designed to look at the impact of five specific arts-based learning interventions on literacy outcomes. To our knowledge, these trials represent the largest collection of RCTs testing arts-based approaches on attainment outcomes. This body of research represents a significant step forward in understanding how to assess the relationship between creative activities and pupil outcomes, which is in itself important.

Each of the programme reports is linked to below and an overarching report that synthesises the findings, lessons, and recommendations can be found here. What you’ll immediately notice is the diversity of approaches we looked at – including music, storytelling, and journalism – reflecting the richness and diversity of the sector.

Table 1. Learning about Culture programmes

Each programme name is hyperlinked to the EEF project page.

Programme name (Developer): Description
First Thing Music (Tees Valley Music Service): Trains teachers in the Kodály method of music instruction to deliver a daily, structured, sequential music curriculum of increasing progression. (Key Stage 1)
Speech Bubbles (London Bubble): Weekly drama and storytelling intervention aimed at supporting children’s communication skills, confidence, and wellbeing. (Key Stage 1)
The Craft of Writing (Arvon, University of Exeter, Open University): Develops teachers as writers, combined with an explicit focus on the pedagogical implications for the classroom. (Key Stage 2)
The Power of Pictures (Centre for Literacy in Primary Education): Specialist training from published author-illustrators and expert teachers helps primary teachers develop their understanding of the craft of picture book creation. (Key Stage 2)
Young Journalist Academy (Paradigm Arts): Aims to develop pupils’ writing by involving them in journalism, giving pupils a meaningful purpose for writing and teaching specific writing techniques. (Key Stage 2)

What did we find?

When compared to ‘business as usual’, we were unable to find improvements in pupil attainment in any of the five trials that we could reliably say weren’t due to chance. However, it’s important to emphasise that this is an extremely challenging bar to clear: most of the trials that the EEF funds do not find impacts of interventions on pupil learning outcomes.

While it is easy to focus on the lack of a positive impact on the outcome measures, we also want to emphasise that the trials found no evidence of detrimental effects from introducing such programmes. That is actually really good news, because it means that including arts-based programmes alongside ‘core curriculum’ subjects isn’t a zero-sum game in which more time on the arts means lower grades elsewhere.

And, as we pointed out above, improving pupil academic attainment is not the best or only reason to implement arts-based interventions in schools. Although the programmes did not improve literacy test scores, interviews with participating teachers and pupils found that they generated a great deal of enthusiasm among those who took part. Perceived improvement in pupil engagement was a theme that emerged from the implementation and process evaluations across the five programmes.

In the overarching report, we also stress that these results should absolutely not be seen as the last word in whether arts-based learning is effective in improving outcomes for pupils. Necessarily, in this kind of research, we focused on one set of outcomes, which could be quantified and measured over a fairly short time horizon. But benefits could accrue in many other ways that we just couldn’t capture. For one, having these initiatives available to pupils may have long term consequences for the subjects these pupils choose at GCSE or A level or the career paths they choose to follow. We don’t know that there are these benefits, either, but our evidence shouldn’t be used to discount such possibilities.

Our reflections as evaluators

The overarching report contains thoughts and lessons for multiple audiences: researchers, funders, and arts organisations. For brevity, we’ve only selected a few takeaways to highlight here.

Evaluators and funders

There is a line of argument against our efforts here: that what we can measure in trials (and research more broadly) is not always what ‘matters’, or what we ‘should’ measure. Equally, some will point to challenges in measuring the outcomes we did use. We know that the measures used are imperfect, but given the choice between imperfect measurement of something and perfect measurement of nothing – or of something further removed from the intervention – we stand by our decision to do what we can in an imperfect world. This isn’t an abstract research issue: to ascertain whether something is effective (or not), we need to be clear about what we expect to change and measure that as best we can.

In line with EEF’s policy, reflecting their primary aim as an organisation, our impact evaluations focused on measuring pupil attainment outcomes. While this approach has many strengths given the undoubted importance of such outcomes, these projects – where we see positive signs of engagement based on the implementation and process evaluation but ultimately no impacts on our measured outcomes – highlight one of its key limitations: a null finding leaves a lot of unanswered questions. An alternative approach – with similarities to the increased emphasis on ‘mechanism experiments’ in economics and particularly important where there is a limited evidence base about how interventions work – would focus first on establishing whether the interventions do indeed affect the intermediate steps via which they are thought to improve attainment. This would help us first to establish whether the programme is working as we think it does or if there is more to be done to understand this crucial first stage to achieving impact on pupils’ academic attainment.

Arts organisations

We really appreciate the courage and commitment from the arts-based education organisations who put themselves forward to participate in a multi-year evaluation process. The EEF’s support for both an individual and overarching approach to the evaluation meant that we were able to observe themes across the programmes that could be useful to other arts organisations. From these themes, we offer some recommendations for consideration.

Ensure buy-in and engagement from school staff at multiple levels.

High teacher buy-in was crucial for the day-to-day delivery of the programme, and senior leadership team (SLT) buy-in was important for supporting the teacher in high-quality delivery. For example, SLT members were able to ensure teachers had access to necessary resources and space, as well as ensure there was time in the timetable for the programme.

Carefully consider programme resource requirements and test assumptions about what’s available in schools

The interventions placed different demands on schools in terms of the resources needed to take part, and even where required resources were considered ‘standard’, challenges were still reported. In some cases, schools did not have resources, such as arts supplies, that were assumed to be available in most schools. In other cases, the schools had the required resources, such as technological equipment, but they were difficult to access. Organisations may want to consider how to surface these challenges early in set-up and whether they can provide any support to schools in overcoming them.

On a more personal note

As independent evaluators, we have a responsibility to be as objective as possible, recognise our biases, and do our best to minimise their influence on our work. We are also all researchers who care deeply about improving outcomes for pupils and furthering our understanding of ‘what works’ to support pupil development. When we are able to take the ‘evaluator hat’ off, this team also broadly supports the inclusion of arts in the school day, and some of us have direct experience of delivering arts-based learning opportunities either in the school day or extended learning space. We would have been thrilled to report that the programmes had a significant impact on attainment outcomes – not only to further enhance the toolkit for improving pupil outcomes, but also to secure further protection for the arts in the school day. Ultimately, we are not able to report those outcomes, and we stand by the findings of the six reports produced. We are still supporters of arts in education and we also enthusiastically support further research in this space, as there is certainly more to learn.

A-levelling up: the thorny path back from teacher assessed grades

Blog Editor, 12 August 2021

By Jake Anders, Claire Crawford, and Gill Wyness
This piece first appeared on theguardian.com.

This week’s GCSE and A level results confirmed the expectations of many who study education policy: the proportion of students achieving top grades in these qualifications has increased substantially compared to 2019, especially at A level. Students themselves should be extremely proud of their results, which were achieved under very difficult circumstances. Likewise, teachers have worked extremely hard to make the best assessment they can of their pupils’ performance. But there is no getting around the fact that these results are different from – and not directly comparable with – pre-Covid results.

It is right to allow for the fact that students taking GCSEs and A levels this year and last are at a disadvantage compared to previous cohorts. In-person exams would have been next to impossible in 2020, and those assessed this year have missed significant amounts of schooling.

To deal with this, the government chose an entirely different means of measuring performance: teacher assessments. (We advocated a different approach, based on more flexible exams, in 2021.) This year’s approach has been rather more orderly than last year’s chaos, but the wide range of measures that teachers could consider – such as mock exams, in-class tests and coursework – inevitably led to variation in how schools assessed their pupils.

This year’s grades may also be capturing average or ‘best’ performance across a range of pieces of work, rather than a snapshot from one or two exams. This seems to have been particularly true at A level, where grades have immediate consequences for university entry decisions. In short, it is unsurprising that grades based on teacher assessment are higher than those based on exams alone: while some have called this grade inflation we think it’s more accurate to say that they are capturing different information.

A level grade distribution in 2019, 2020 and 2021

But given that they have been presented on the same scale, the stark increase in grades compared to pre-Covid times presents significant challenges for current and future cohorts.

Even making comparisons between pupils within the 2021 cohort may be challenging. Using teacher assessment is likely to have disadvantaged some students relative to others. Previous research has shown that Black Caribbean pupils are more likely than white pupils to receive a grade from their teacher below their score in an externally marked test taken at the same time. Girls have also been found to perform better at coursework on average, while boys do better at exams. Differences by gender have been particularly apparent this year, with girls seeing larger improvements in performance than boys compared to pre-pandemic results.

This year’s record high scores raise challenging questions. The much larger proportion of pupils getting As and A*s at A level, for example, may lead to universities relying more heavily on alternative methods of distinguishing between applicants – such as personal statements – which have been shown to entrench (dis)advantage.

There is also the all-important question of what to do next year: are this year’s grade distributions the right starting point, or should we be looking to return to something closer to the 2019 distribution? Is it possible to go back? And would we want to?

Assuming in-person exams are feasible next year, one possibility would be to return to 2019’s system as if nothing had happened. This would probably see substantial reductions in the proportion of students getting top grades, especially at A level. One can only imagine the political challenge of trying to do this.

Even more important is that the next cohorts of GCSE and A level students (and indeed the ones that follow – we are tracking the experiences of those taking GCSEs this year as part of a new UKRI-funded cohort study, COSMO) have also been affected by the pandemic, arguably to a greater degree than this year’s. They are therefore likely to underperform their potential and get lower grades than cohorts who took their exams before the pandemic struck. That is clearly not desirable.

It is important to continue making allowances for the exceptional circumstances young people have faced during this crucial time in their education. During the period affected by pandemic learning loss, our suggestion would be to design exams with more flexibility, allowing candidates to choose which questions to answer based on their strengths, as is common in university exams. This would enable a return to the fairest way to assess students – exams – while still taking account of lost learning.

Either way, any return to exam-based grades is likely to result in an immediate pronounced drop in results compared to the last two years, especially at A level. Gavin Williamson has suggested that the government will aim instead for a “glide path back to a more normal state of affairs”. This would smooth out the unfairness of sharp discontinuities between cohorts. But it would mean moving away from grades being based on the same standard over time, instead setting quotas of students allowed to achieve each grade, gradually reducing the higher grades and increasing the lower ones. Even if that seems a good plan now, it would be very hard to stick to: the fall-out from the small reduction in pass rates seen in Scotland this week would be a taste of things to come for years.

A more radical possibility would be to reset the grading system entirely. This would get around the political problem of imposing either very large falls, or a series of deliberate small falls, in grades for future cohorts, but one wonders whether this is the right time to undertake such a drastic overhaul. The pandemic will have repercussions on young people’s grades for years to come: is the best approach really a total reset right now?

The question of what to do next is one that policymakers will have to grapple with over the coming months and years. Of more fundamental importance and urgency, however, is that pupils have experienced widespread learning losses due to the pandemic – regardless of what their grades show – and are likely to be affected by these for years. Students require ongoing support throughout the rest of their educational careers, including catch-up support throughout school, college and university.

We cannot simply award them GCSE and A level grades that try to look past the learning they have lost and move on – the learning loss remains and must be addressed.

Dr Gill Wyness & Dr Jake Anders are deputy directors of the UCL Centre for Education Policy & Equalising Opportunities (CEPEO). Dr Claire Crawford is an associate professor at CEPEO.

The dam waiting to burst? The short-term economic impact of Covid and Lockdown

Blog Editor, 25 June 2021

By Professor Paul Gregg

Lockdown artificially closed down large parts of the economy, but to understand where the economy is and will be in the next year or so, it is crucial to distinguish between economic activity that has been lost and activity that has merely been delayed. To make this distinction clearer, think of the Easter Bank Holidays. Easter normally falls in April, but in some years it falls in March. In a year when it falls in March, economic activity for March drops sharply compared to other years, because the Bank Holidays close large parts of the economy. But correspondingly, April sees higher output as the economy re-opens. There is no effect on overall output or underlying economic performance; activity is merely delayed by a month.

Lockdown has the same effect. It places a dam in the way of consumer spending, but behind the dam there is a build-up of demand that is released when Lockdown ends and the economy re-opens. This creates a surge of activity. The same can be seen in terms of vacancies. Locked down firms stopped recruiting as they weren’t trading. But staff members were still leaving to start other jobs in open sectors of the economy or leaving the labour force. The positions remain unfilled until the firm re-opens, then we have a surge as 6 months of vacancies appear at once.

There is currently an economic surge building, starting in April as the economy started to re-open but just as economic activity was artificially suppressed in Lockdown, the re-opening will artificially inflate the level of activity above the underlying level. This raises a number of key questions about where the economy is now and is heading. What is the underlying level of economic activity? How much pent-up economic activity is there to be released? Over what period will the surge occur? And what does this mean for government policy, especially for the government’s fiscal position?

Where is the economy now?

The 13 months from the end of February 2020 to the end of March 2021 saw a shortfall in economic activity of 10% compared to pre-crisis levels. From April to June 2021 the economy started to re-open, with a mix of released activity and continued partial closure, meaning rapid growth in activity. So from July, hopefully, a fully re-opened economy will see economic activity not just return to underlying levels but surge from the release of pent-up demand.

The US offers a useful comparator here for underlying activity levels. It has not used lockdowns as widely as the UK, and has not used a furlough programme to preserve employment, instead focusing on supporting the incomes of people who lose jobs (more generously than in normal times). In the US, economic activity in the first quarter of 2021 was just 1% below pre-crisis levels. In the absence of the crisis the economy would likely have grown, so a reasonable estimate is that US economic activity stands 3% below what it would have been without the crisis. The employment rate is 3% below peak levels and unemployment just over 2% higher; note that the employment fall has been larger than the GDP fall in the US. In the UK, economic activity was down nearly 8% from pre-crisis levels in the first quarter of 2021. The US experience suggests that underlying UK activity is at most around 1.5% down once the artificial effects of enforced Lockdown are stripped out. This is very modest given how scary things looked last year.

How much pent-up economic activity is to come?

There are two parts to gauging the size of this pent-up demand: what has happened to disposable incomes, and the extent of excess saving from those incomes.

Disposable incomes are about 1.5% down on pre-crisis levels in real terms, reflecting lower employment, the effects of furlough and so on. The proportion of income saved (the saving ratio) in the UK has been more than 10 percentage points higher than normal since the crisis hit. So around 10% of people’s annual incomes could be spent to take savings back to normal levels – a bit over £3,000 per household.
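The per-household figure above can be illustrated with a one-line calculation. Note that the average household disposable income used here is an assumption chosen to be consistent with the article's roughly £3,000 estimate, not a figure stated in the piece.

```python
# Illustrative sketch of the excess-savings arithmetic.
avg_household_disposable_income = 31_000   # £ per year (assumed, for illustration)
excess_saving_rate = 0.10                  # saving ratio ~10 points above normal

excess_savings_per_household = avg_household_disposable_income * excess_saving_rate
print(f"Excess savings per household: £{excess_savings_per_household:,.0f}")  # £3,100
```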

Now, people could consume this slowly over their lifetimes or binge-spend. Evidence from lottery wins suggests that large wins lead to spending on durable goods, such as a new car, but a large portion is saved, and spending more generally is unaffected; smaller wins see proportionately more spent and less saved. So people are likely to run down this excess saving over a couple of years, and – given the relief as Lockdown ends – this is likely to be front-loaded, starting from April this year. In the second half of this year, therefore, we can reasonably expect the surge of spending on pubs, clubs and holidays to boost economic activity to between 5 and 6% above underlying levels, or around 4% above pre-crisis levels. Then, as the surge eases, next year would see no GDP growth, as underlying improvements in the economy are masked by the spending surge ending.

The employment story is very different. Furlough meant that Lockdown didn’t see forced job shedding, just the effects of firms not hiring or closing down. The employment rate fell by 1.6% compared to 10% for GDP. So, the employment fall has been in line with underlying lost output but not the extra driven by forcing firms not to trade and consumers not to consume. The surge will, however, boost employment rapidly. This is already appearing in the data and unemployment should be expected to return to pre-crisis levels by the end of the year.

What does this mean for government policy?

The crisis saw government debt rise by 20% of GDP by the end of last year, when the current deficit stood at £65 billion in the final quarter. The coming surge in activity, together with the ending of furlough and other crisis spending, should mean the current deficit evaporates; the government should be looking to post a surplus by early next year. There will also be a reduction in the debt-to-GDP ratio because of the boost to growth from the spending surge. The government should then keep the deficit below the rate of growth to reduce the debt burden slowly.

This still leaves the question of what to do about the large increase in debt over the last year. The answer is: absolutely nothing.

The surge in activity addresses the current deficit, and around one third of the increase in historic debt has been funded by quantitative easing from the Bank of England, which leaves the Bank holding one third of all government debt. There are lots of issues about how to manage these holdings, but they do not incur interest payments or require urgent financing. They are a long-term issue, which means the functional debt is around two thirds of GDP, not 100% – a level that is manageable until we are firmly past the legacy of the Covid crisis. This will help reduce the current government budget deficit and ease historic debt concerns enough to avoid a return to the austerity policies of George Osborne. It still, of course, means little room for major spending boosts.
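The 'functional debt' point above is simple arithmetic, sketched here under the article's own assumptions (total debt of roughly 100% of GDP, with about one third held by the Bank of England via QE):

```python
# Rough version of the 'functional debt' arithmetic.
debt_to_gdp = 1.00   # total government debt, roughly 100% of GDP
qe_share = 1 / 3     # ~one third of that debt held by the Bank of England via QE

# Debt held outside the Bank of England is what requires market financing.
functional_debt = debt_to_gdp * (1 - qe_share)
print(f"Functional debt: {functional_debt:.0%} of GDP")  # 67% of GDP
```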

Lessons

The economic fallout from the Covid crisis has been much less than feared last year, and the release of excess savings built up during Lockdown will create a temporary economic boom in the second half of this year. The limited economic damage reflects in large part the successful management of the economic fallout by the Chancellor, and stands in massive contrast to the extremely poor handling of the health crisis itself.

The Chancellor has in effect used a major fiscal stimulus to overcome the effects of Lockdown. More interestingly, furlough – the main spending ticket – acted as a highly targeted stimulus, focused on the hardest-hit sectors, which stopped the reduced demand leaking into other sectors. This high degree of targeting is rather like the German Kurzarbeit scheme, under which firms in trouble in a recession can apply for government support to put workers on part-time working; wages are then topped up by this support, though not fully – as with the 80% of wages paid under furlough. The lessons, then, are these: fiscal stimulus works; it should be targeted at jobs rather than consumption (through, say, VAT cuts); and it should be targeted at stressed firms, sectors or other well-defined groups, providing proportionately more support for lower-waged jobs. It would be good to remember these lessons for the next recession, which is due in 2031[1].

 

[1] Recessions have occurred every 10 years on average since 1980

The ‘graduate parent’ advantage in teacher assessed grades

IOE Editor8 June 2021

By Jake Anders, Lindsey Macmillan, Patrick Sturgis, and Gill Wyness

Following public outcry over a disastrous attempt to assign pupil grades using a controversial algorithm, last year's GCSE and A level grades were eventually determined using Centre Assessed Grades (CAGs). Now, new evidence from a survey carried out by the UCL Centre for Education Policy and Equalising Opportunity (CEPEO) and the London School of Economics finds that some pupils appear to have gained an unfair advantage from this approach – particularly pupils with graduate parents. As teachers will again be deciding exam grades this year, this finding serves as an important warning of the challenges involved in ensuring that a system using teacher assessments is fair.

The decision to cancel formal exams in 2020 was taken at a late stage in the school year, meaning that there was little time for the government to develop a robust approach to assessment. After a short consultation, the Department for Education (DfE) decided that pupils' exam grades would be determined by teachers' assessments of their pupils' likely grades, including a rank ordering of pupils. However, to prevent grade inflation from teachers over-predicting their pupils' grades, Ofqual then applied an algorithm to the rankings to calculate final grades, based on the historical results of each school.

A level pupils received their calculated grades on results day 2020, and although Ofqual's reporting showed that the calculated grades were slightly higher than 2019's across the grade range, many pupils were devastated to find their teacher assessed grades had been lowered by the algorithm. More than a third of pupils received lower calculated grades than their original teacher assessed grades. Following widespread public outcry, the calculated grades were abandoned, and pupils were awarded the grades initially assessed by teachers. This inevitably led to significant grade inflation compared to previous cohorts.

This also created a unique situation where pupils received two sets of grades for their A levels – the calculated grades from the algorithm and the teacher allocated “centre assessed grades” or “CAGs”.

While it is now well established that CAGs were, on average, higher than the algorithm-calculated grades, less is known about the disparities between the two sets of grades for pupils from different backgrounds. Understanding these differences is important since it sheds light on whether some pupils received a larger boost from the move to teacher predicted CAGs, and hence to their future education and employment prospects. It is also, of course, relevant to this year’s grading process, as grades will again be allocated by teachers.

Administrative data on the differences between calculated grades and CAGs is not currently publicly available. However, findings from a new UKRI-funded survey of young people by the UCL Centre for Education Policy and Equalising Opportunity (CEPEO) and the London School of Economics (LSE) can help us to understand the issue. The survey provides representative data on over 4000 young people in England aged between 13 and 20, with interviews carried out online between November 2020 and January 2021.

Respondents affected by the A level exam cancellations (300 respondents) were asked whether their CAGs were higher or lower than their calculated grades. The resulting data reveal stark differences in the extent to which pupils were given a boost by the decision to revert to CAGs. As shown in Figure 1, pupils with graduate parents were 17 percentage points more likely to report that their CAGs were higher than their Ofqual calculated grades. The survey data are linked to administrative data on prior attainment at Key Stages 2 and 4, as well as demographic and background characteristics (such as free school meals status, ethnicity, SEN, and English as an additional language). Even after accounting for differences between pupils across these characteristics, those with graduate parents were still 15 percentage points more likely to report having higher CAGs than calculated grades.
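The distinction between the raw 17-point gap and the adjusted 15-point gap can be illustrated with a toy calculation. The sketch below uses invented data and simple stratification on one covariate, not the survey microdata or the authors' actual statistical models:

```python
# Hypothetical respondents: (graduate_parent, high_prior_attainment, cags_higher)
# Purely illustrative; not the CEPEO/LSE survey data.
rows = [
    (1, 1, 1), (1, 1, 1), (1, 1, 0), (1, 0, 1), (1, 0, 0),
    (0, 1, 1), (0, 1, 0), (0, 0, 0), (0, 0, 0), (0, 0, 1),
]

def share(rows, grad):
    """Proportion reporting higher CAGs within a parental-education group."""
    sel = [r[2] for r in rows if r[0] == grad]
    return sum(sel) / len(sel)

# Raw gap: simple difference in proportions
raw_gap = share(rows, 1) - share(rows, 0)

# Adjusted gap: difference within each prior-attainment stratum,
# averaged with weights proportional to stratum size
def stratified_gap(rows):
    strata = {0: [], 1: []}
    for r in rows:
        strata[r[1]].append(r)
    total = len(rows)
    return sum(len(s) / total * (share(s, 1) - share(s, 0))
               for s in strata.values())

adj_gap = stratified_gap(rows)
```

In this invented sample the adjusted gap is smaller than the raw gap but does not vanish, qualitatively mirroring the 17-point versus 15-point pattern reported above.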

Figure 1. The proportion of young people reporting their CAGs were better than their calculated grades by whether or not they report that one of their parents has a university degree (left panel: raw difference; right panel: adjusted for demographic characteristics and prior attainment)

 

There are a number of possible explanations for these differences. First, it could be that pupils with graduate parents are more likely to attend particular types of schools which have a greater tendency to ‘over-assess’ grades. While not directly relevant to this sample, an extreme version of this are the documented cases of independent schools deliberately over-assessing their pupils, but this could also happen in less dramatic and more unconscious ways. It could, for example, be more likely among schools that are used to predicting grades as part of the process for pupils applying to highly competitive university courses, where over-prediction may help more than it hurts.

A second possibility is that graduate parents are more likely to lobby their child’s school to ensure they receive favourable assessments. Such practices are reportedly becoming more common this year, with reports of “pointy elbowed” parents in affluent areas emailing teachers to attempt to influence their children’s GCSE and A level grades ahead of teacher assessed grades replacing exams this summer.

A third possibility is that the relatively high assessments enjoyed by those with graduate parents are a result of unconscious bias by teachers. A recent review by Ofqual found evidence of teacher biases in assessment, particularly against pupils with SEN and those from disadvantaged backgrounds, while a new study from Russia showed that teachers gave higher grades to pupils with more agreeable personalities. Interestingly, we found no differences between FSM and non-FSM pupils, perhaps suggesting teachers were careful not to treat FSM pupils differently. But they may nonetheless exhibit an unconscious positive bias towards pupils from backgrounds that tend to be associated with higher educational achievement.

Our results do not afford any leverage on which of these explanations, if any, is correct. Regardless of what lies behind this systematic difference, our findings show that pupils with more educated parents received an unfair advantage in their A level results last year, with potential repercussions for equality and social mobility. They also highlight that this is a substantial risk for this year's process – perhaps even more so without the expectation of algorithmic moderation: grading pupils fairly in the absence of externally set and marked assessments is setting teachers an almost impossible task.

The working paper ‘Inequalities in young peoples’ education experiences and wellbeing during the Covid-19 pandemic’ is available here.

Learn more about our project on the impact of the pandemic on young people here.

Notes
The UKRI Covid-19 funded UCL CEPEO / LSE survey records information from a sample of 4,255 respondents, a subset of the 6,409 respondents who consented to recontact as part of the Wellcome Trust Science Education Tracker (SET) 2019 survey. The SET study was commissioned by Wellcome with additional funding from the Department for Education (DfE), UKRI, and the Royal Society. The original sample was a random sample of state school pupils in England, drawn from the National Pupil Database (NPD) and Individualised Learner Record (ILR). To correct for potentially systematic patterns of respondent attrition, non-response weights were calculated and applied to all analyses, aligning the sample profile with that of the original survey and the profile of young people in England.
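As a rough illustration of the non-response weighting described, the sketch below computes simple post-stratification weights for invented cells; the actual survey weights will have been built from a much richer profile of characteristics:

```python
# Minimal post-stratification sketch: reweight respondents so the achieved
# sample matches a target population profile. Cell definitions and shares
# here are invented, not the SET/COSMO weighting scheme.
target_share = {"FSM": 0.15, "non-FSM": 0.85}   # profile to match
sample_share = {"FSM": 0.10, "non-FSM": 0.90}   # achieved respondents

# Weight = population share / sample share, so under-represented cells
# are up-weighted and over-represented cells are down-weighted
weights = {cell: target_share[cell] / sample_share[cell]
           for cell in target_share}

print(weights)  # FSM respondents get weight 1.5; non-FSM roughly 0.94
```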

This work is funded as part of the UKRI Covid-19 project ES/V013017/1 “Assessing the impact of Covid-19 on young peoples’ learning, motivation, wellbeing, and aspirations using a representative probability panel”.

This work was produced using statistical data hosted by ONS. The use of the ONS statistical data in this work does not imply the endorsement of the ONS in relation to the interpretation or analysis of the statistical data. This work uses research datasets which may not exactly reproduce National Statistics aggregates.

There can be no “levelling up” without education recovery

IOE Editor3 June 2021

This blog post first appeared on the University of Bristol Economics blog.

Simon Burgess, June 2021

Yesterday saw the resignation of Sir Kevan Collins, who was leading the Government's Education Recovery Programme. The pandemic has hit young people very hard, causing significant learning losses and reduced mental health; the Recovery Programme is intended to rectify these harms and to repair the damage to pupils' futures. His resignation letter labelled the Government's proposal inadequate: “I do not believe that it is credible that a successful recovery can be achieved with a programme of support of this size.”

The rejection of this programme, and the offer of a funding package barely a tenth of what is needed, is hard to understand. It is certainly not efficient: the cost of not rectifying the lost learning is vastly greater than the £15billion cost (discussed below). And it is manifestly unfair, for example when compared to the enormous expense incurred to look after older people like me. The vaccination programme is a colossal and brilliant public undertaking; we need something similar to protect the futures of young people. We have also seen educational inequality widen dramatically across social groups: children from poorer families have fallen yet further behind. If we do not have a properly funded educational recovery programme, any talk of “levelling up” is just noise.

Context – Education recovery after learning loss

An education recovery plan is urgently needed because of all the learning lost during school closures. For the first few months of the pandemic and the first round of school closures, we were restricted to just estimating the learning loss. Once pupils started back at school in September, data began to be collected from online assessment providers to actually measure it. The Education Endowment Foundation is very usefully collating these findings as they come in. The consensus is that the average loss of learning is around 2-3 months, with the most recent results the most worrying. Within that average, the loss is much greater for students from disadvantaged backgrounds and for younger pupils. To give only the most recent example, the latest data show that schools with high fractions of disadvantaged kids saw falls in test scores twice as severe as those in low-poverty schools, and that Year 1 and Year 2 pupils experienced much larger falls in attainment. Yet under the Government's proposals, “Recovery” spending for precisely these pupils would be next to nothing, as Sir Kevan Collins notes in his Times article today: “The average primary school will directly receive just £6,000 per year, equivalent to £22 per child”.

The Government’s proposals amount to roughly £1 billion for more small-group tutoring and around £500m for teacher development and training. I am strongly in favour of small-group tutoring, but the issue is the scale: this is nowhere near enough. It is widely reported that Sir Kevan Collins’ estimate of what was required was £15 billion, based on a full analysis of the lost learning and the mental health and wellbeing deficits that both need urgent attention. For comparison, EPI helpfully provide these numbers on education recovery spending: the figure for England is equivalent to around £310 per pupil over three years, compared to £1,600 per pupil in the US, and £2,500 per pupil in the Netherlands.

Why might the programme have been rejected? Here are some arguments:

“It’s a lot of money”

It really isn’t. An investment of £15bn is dwarfed by the cost of not investing. Time in school increases a child’s cognitive ability, and prolonged periods of missed school have consequences for skill growth. We now know that a country’s level of skills has a strong (causal) effect on its economic growth rate. This is a very large-scale problem: all 13 cohorts of pupils in school have lost skills because of school closures. So from the mid-2030s, all workers in their 20s will have significantly lower skills than they would otherwise have. And for the 40 years following that, between a quarter and a third of the entire workforce will have lower skills. Lost learning means lower skills, lower economic growth, and lower tax revenues. Hanushek and Woessmann, two highly distinguished economists, compute this value for a range of OECD countries. For the UK, assuming that the average amount of lost learning is about half a year, their results project the present discounted value of all the lost economic growth at roughly £2,150 billion (£2.15 trillion). Almost any policy to mitigate such a loss will be worthwhile.
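The logic of such present-discounted-value estimates can be sketched as follows. Every parameter value here is an illustrative placeholder, not a figure from Hanushek and Woessmann's calculations:

```python
# Stylised sketch of the present-discounted-value logic behind estimates
# of the economic cost of lost learning: lower skills shave a little off
# the growth rate, and the discounted gap between the two GDP paths is
# summed over a long horizon. All numbers are assumptions.
def pdv_lost_output(gdp, g_base, g_hit, r, years):
    """Sum the discounted gap between baseline and damaged GDP paths (in £bn)."""
    total = 0.0
    for t in range(1, years + 1):
        baseline = gdp * (1 + g_base) ** t
        damaged = gdp * (1 + g_hit) ** t
        total += (baseline - damaged) / (1 + r) ** t
    return total

# e.g. £2,200bn GDP, 1.5% trend growth, a 0.05pp growth haircut from lost
# skills, 3% discount rate, 80-year horizon (all invented for illustration)
loss = pdv_lost_output(gdp=2_200, g_base=0.015, g_hit=0.0145, r=0.03, years=80)
print(f"PDV of lost output: ~£{loss:.0f}bn")
```

Even a tiny growth haircut compounds into a loss many times the size of a £15bn recovery programme, which is the core of the argument above.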

“Kids are resilient and the lost learning will sort itself out”

This is simply wishful thinking. We should not be betting the futures of 7 million children on this basis. Economists estimate the way that skills are formed, and one key attribute of this process can be summarised as “skills beget skills”. One of the first statements of this came from Heckman and co-authors, and more recent research has confirmed it, including work using genetic data. This implies that if the level of skills has fallen, then the future growth rate of skills will also be lower, assuming nothing else is done. It is also widely shown that early investments are particularly productive. Given these, we would expect pupils suffering significant learning losses to actually fall further behind rather than catch up. Sir Kevan Collins makes exactly this point in his resignation letter: “learning losses that are not addressed quickly are likely to compound”.

Perhaps catch-up can be achieved by pupils and parents working a bit harder at home? There is now abundant evidence from many countries including the UK that learning at home is only effective for some, typically more advantaged, families. For other families, it is not for want of trying or caring, but their lack of time, resources, skills and space makes it very difficult. The time for home learning to make up the lost learning was March 2020 through March 2021; if it was only patchily effective then, it will be less effective from now on.

“There’s no evidence to support these interventions”

This is simply not true, as I set out when recommending small-group tutoring last summer. There is abundant evidence that small-group tutoring is very effective in raising attainment. There is also strong evidence that lengthening the school day is also effective.

Conclusion

This blog is less scientifically cold and aloof than most that I write. I struggle to make sense of the government’s proposals to provide such a half-hearted, watered-down recovery programme, and to value so lightly the permanent scar on pupils’ futures. The skills and learning of young people will not magically recover by themselves; the multiple blows to mental health and wellbeing will not heal if ignored. The Government’s proposal appears to have largely abandoned them. To leave the final words to Sir Kevan Collins: “I am concerned that the package announced today betrays an undervaluation of the importance of education, for individuals and as a driver of a more prosperous and healthy society.”

Ethnicity Pay Gaps and Getting Stupid Answers

IOE Editor4 May 2021

By Paul Gregg

The old saying is that “If you ask a stupid question, you get a stupid answer”. The government-sponsored report from the Commission on Ethnic and Racial Disparities does just this on ethnic pay gaps. The central point is about comparing like-with-like when considering access to better-paying jobs in Britain. This blog post starts with a balanced assessment of what ethnic pay gaps in Britain actually look like, before explaining why the ONS analysis that the Commission draws on gets it so wrong.

Ethnic pay gaps from the Labour Force Survey

If we estimate the average (mean) pay gap between a Black, Asian, or Minority Ethnic (BAME) person and their White counterpart, living in the same region, and with similar educational achievement, using the nationally representative Labour Force Survey (LFS) of all with positive earnings, we find an ethnic pay gap of 14%. So similarly educated BAME people from the same place earn 14% less than White people. This is almost exactly the same pay gap as that found between men and women, and for those born into less advantaged families, compared to those born to more affluent families, again given the same educational achievement. The British labour market creates massive inequality of opportunity between people achieving the same education, across ethnicity, gender, and family background.

How does this compare with the Commission findings?

The ethnic pay gap comparing like with like, then, is 14%. So how on earth did the Commission come up with a 2.3% gap? There are two major parts to this.

The first is the region people live in. The ONS report that the Commission draws on does not compare people in the same region. But ethnic minorities are not evenly spread across the country: they live disproportionately in London, the South East, and major cities like Birmingham and Manchester. These are areas with higher pay but also higher living costs, especially housing costs. The 2.3% gap thus compares the pay of BAME groups living in high-cost London with White populations living in low-cost Wales, the North East of England, and so on. This doesn’t make sense. One approach to make the comparison fairer would be to adjust for the housing costs of where people live, but the easier approach is to compare BAME Brummies to White Brummies, and BAME Londoners to White Londoners – i.e. to compare BAME and White people living in the same region. The ONS study does give a region-by-region breakdown of the ethnic pay gap, which is indicative of a pay gap between White and BAME groups, irrespective of where people live, of around 7%. This is one step closer to a balanced assessment, but it was not the headline given by the Commission.
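A toy example shows how pooling across regions can even reverse the sign of a pay gap when minority workers are concentrated in high-pay, high-cost areas. All figures below are invented for illustration, not drawn from the LFS or ONS data:

```python
# Invented cells: (region, group, mean_hourly_pay, n_workers).
# Minority workers are concentrated in the high-pay region, yet earn less
# than White workers within BOTH regions.
cells = [
    ("London", "White", 20.0, 40), ("London", "BAME", 18.0, 60),
    ("Wales",  "White", 12.0, 60), ("Wales",  "BAME", 11.0, 10),
]

def group_mean(group):
    """National mean pay for a group, pooled across regions."""
    tot = sum(p * n for _, g, p, n in cells if g == group)
    n = sum(n for _, g, p, n in cells if g == group)
    return tot / n

# Naive national comparison (pooling everyone, as the 2.3% figure does)
naive_gap = 1 - group_mean("BAME") / group_mean("White")

# Like-with-like: gap within each region, weighted by BAME employment there
def within_region_gap():
    gaps, weights = [], []
    for region in {"London", "Wales"}:
        pay = {g: p for r, g, p, _ in cells if r == region}
        n_bame = sum(n for r, g, _, n in cells if r == region and g == "BAME")
        gaps.append(1 - pay["BAME"] / pay["White"])
        weights.append(n_bame)
    return sum(g * w for g, w in zip(gaps, weights)) / sum(weights)
```

In this toy data the pooled comparison shows BAME workers earning *more* nationally (a negative gap), while the within-region comparison reveals a gap of around 10%.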

Well Paid Jobs

The second issue needs a little more explanation. Britain’s jobs have a wide distribution of pay levels. The minimum wage means that pay differences at the bottom are not that great. The pay of the person in the middle of the distribution, the median, where half the employed population earns more and half less, was £13.68 per hour in 2020 (pre-pandemic). Low-paid people earn between £8.50 and £9 per hour, a little above 60% of the median. One quarter earn more than 1.5 times the median, 10% earn more than 3 times it, and 5% more than 7 times. In other words, a small minority of jobs carry extremely high pay, predominantly in law, business, and finance.

The ONS analysis, which the Commission draws on so heavily, completely ignores access to these top jobs, because it measures pay gaps using the median: the gap between the person in the middle of the White earnings distribution and the person in the middle of the BAME one. This excludes differences in access to high-paying jobs from the analysis. An average based on the mean (which is what most people think of as the average) rather than the median assesses the gap across all jobs. Doing this moves the pay gap from around 7% for people in the same region to 13%. Surely any assessment of disparities in opportunity should include access to the elite jobs in society as well as more typical jobs. It has to; to do otherwise is just stupid. The point is well made in the report itself when looking at BAME groups in the Civil Service (Figure 9, p12). Across departments as a whole, about 15% of staff are from BAME groups, but in senior roles the figure is half that. Ethnic minorities with equal educational attainment systematically do not get the opportunities leading to Britain’s higher-paying jobs.
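A small numerical illustration of the median-versus-mean point, using invented hourly-pay samples in which the two groups match exactly up to the median but only one has access to the very top jobs:

```python
import statistics

# Invented hourly-pay samples: similar around the middle, but one group
# has access to a handful of very highly paid jobs and the other does not.
group_a = [9, 10, 12, 14, 16, 20, 30, 60, 100]   # includes top jobs
group_b = [9, 10, 12, 14, 16, 18, 20, 22, 24]    # top jobs out of reach

# Median comparison: both middles are £16/hour, so the gap is zero
median_gap = 1 - statistics.median(group_b) / statistics.median(group_a)

# Mean comparison: the gap across ALL jobs, including the top, is large
mean_gap = 1 - statistics.mean(group_b) / statistics.mean(group_a)

print(median_gap, round(mean_gap, 2))  # median gap 0.0; mean gap ~0.46
```

A median-based measure can report no gap at all while unequal access to elite jobs produces a very large gap in average pay, which is exactly the mechanism described above.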

Education

Educational achievement, as highlighted by the Commission report, has been a huge success story: educational levels in the BAME community are now a little higher than in the White population. Adjusting for this too, comparing Black and White workers with the same education to look at disparities in opportunity, pushes the pay gap up a little further, to 14%. Comparing individuals with the same education therefore makes very little difference to the pay gap, as you would expect. The inequalities of opportunity lie beyond education, in the labour market.

Britain’s ethnic minorities are well educated but are not progressing in the labour market to the highest paid jobs.  Yet a key report on ethnic disparities in opportunities chooses to assess pay gaps in a way that ignores this entirely. How stupid is that?

The challenges of COVID-19 for young people need a new cohort study: introducing COSMO 

IOE Editor23 April 2021

Jake Anders and Carl Cullinane 

The COVID-19 pandemic and its impact is a generation-defining challenge. One of its most concerning aspects, particularly in the long term, is the already profound effect it has had on young people’s lives. Disruption to their development, wellbeing and education could have substantial, long-lasting effects on later life chances, particularly for those from lower-income homes. Evidence is already showing disadvantaged pupils lagging 5 months behind their peers. This poses a unique challenge for educational policy and practice, with the scale of the disruption requiring solutions to match.

In order to address these impacts, it is vital that we fully understand them, and in particular the disproportionate burden falling on certain groups, including those from lower socio-economic backgrounds and minority ethnic groups. This needs high-quality data. Recovering from the effects of the past 12 months will be a long-term project, and to reflect this we need research of similar ambition.

The COVID Social Mobility and Opportunity Study (COSMO for short), launched today, seeks to play this role, harnessing the power of longitudinal research to capture the experiences of a cohort of young people for whom the pandemic has had an acute impact, and its effects on their educational and career trajectories. 

This country has a grand tradition of cohort studies, including the pioneering 1958 National Child Development Study and the 1970 British Cohort Study. Such studies are a key tool in understanding life trajectories and the complex factors that shape them. And they are particularly vital when it comes to measuring the impact of events that are likely to last through someone’s life course. The existing longitudinal studies, including those run by our colleagues in the UCL Centre for Longitudinal Studies, have played a huge role in understanding the impacts of the pandemic on society in the last year.

But there is a key gap in the current portfolio of cohort studies: and that is the generation of young people at the sharp end of their school education, who would have taken GCSEs this summer, and within a matter of months will be moving onto new pathways at sixth form, further education, traineeships and apprenticeships. The impacts on this group are likely to be profound and long-lasting, and understanding the complex elements that have aggravated or mitigated these impacts is crucial. 

A variety of studies have already collected some such data, providing emerging evidence of inequalities in pupils’ outcomes and experiences of remote schooling. This has highlighted alarming challenges for pupils’ learning and wellbeing. However, to develop a full understanding we require the combination of rich, representative, survey data on topics such as learning loss experiences, wellbeing, and aspirations, linked with administrative data on educational outcomes, and concurrent interventions. We also need to follow up those young people over the next few years as they pass through key stages of education and their early career, to understand what has happened next, ideally long into their working lives. 

Such evidence will be key in shaping policies that can help to alleviate the long-term impacts on young people. Which groups have suffered most, and how? How long will these impacts persist? And how can we reduce their effects? These will be fundamental questions for national policymakers, education providers, employers and third sector organisations in the coming years, both in the UK and internationally.

That’s why we’re extremely excited to be launching COSMO with funding from UK Research and Innovation (UKRI)’s Ideas to Address COVID-19 response fund. Our study will deliver exactly that data over the coming years, helping to inform the future policy interventions that will be required, given that the huge effects of the pandemic are only just beginning. As the British Academy pointed out on the anniversary of the first COVID lockdown, this is not going to go away quickly.

Beginning this autumn, the study will recruit a representative sample of 12,000 current Year 11 pupils across England, with sample boosts for disadvantaged and ethnic minority groups plus targeting of other hard-to-reach groups. Young person, parent, and school questionnaires, enhanced with administrative data from the DfE, will collect rich data on young people’s experiences of education and wellbeing during the past challenging 12 months, along with information on their transitions into post-16 pathways via this summer’s unusual GCSE assessment process.

The study is a collaboration between the UCL Centre for Education Policy & Equalising Opportunities (CEPEO), the UCL Centre for Longitudinal Studies (CLS) and the Sutton Trust. The study will harness CEPEO’s cutting-edge research focused on equalising opportunities across the life course, seeking to improve education policy and wider practices to achieve this goal. The Sutton Trust also brings 25 years of experience using research to inform the public and achieve policy change in the area of social mobility.  

COSMO will also be part of the family of cohort studies housed in the UCL Centre for Longitudinal Studies, whose expertise in life course research is world-renowned. We are also working closely with Kantar Public, who will lead on delivering the fieldwork for this large study, alongside NatCen Social Research. More broadly still, all our work will be co-produced with project stakeholders including the Department for Education and the Office for Students. We are also working with partners in Scotland and Wales to maximise comparability across the nations.

We are excited for COSMO to make a big contribution both to the landscape of educational research and to the post-pandemic policy environment, and we are delighted to be getting to work delivering on this promise over the coming years. 

We won’t reduce inequalities in post-16 progression until we make ‘lower attainers’ more visible

IOE Editor29 March 2021

By Ruth Lupton, Stephanie Thomson, Lorna Unwin and Sanne Velthuis

Inequalities in post-16 progression

The continued use of GCSEs as a blunt instrument for dividing pre- and post-16 education is one of the main causes of inequality in the English system, with impacts extending well into adulthood. The system asks the least confident, least academically successful young people, often (but not always) facing greater social and economic disadvantages, to make the most complex, life-shaping choices at the youngest age. Contemporaries with high academic attainment can progress more straightforwardly in a simpler, better understood, and historically better-funded system, often postponing decisions about occupational direction until age 18, 19 or later.

In our new research, funded by the Nuffield Foundation, we investigated the post-16 trajectories of young people who we described as ‘lower attainers’ – the 40% of each GCSE cohort who annually do not achieve a grade 4 (formerly C) in both English and maths.  We presented our findings at a recent CEPEO webinar.

Our research employed a mixed-methods approach combining analysis of data from the National Pupil Database (NPD) and Individualised Learner Record (ILR), collection and analysis of local data about course and apprenticeship opportunities and entry requirements, and interviews and focus groups.

It shows how, in making the transition to the post-16 phase and attempting to progress beyond GCSEs, ‘lower attainers’ face multiple barriers including: inconsistent careers information and guidance; restrictive entry requirements that are often based on English and maths GCSEs (even when it is not clear why specific grades are needed); considerable local variation in accessible provision; and the low availability and poor visibility of apprenticeships. Apprenticeships are not the accessible pathway for ‘lower attainers’ that many people imagine, with only 5.8% moving into an apprenticeship at 16 in the 2015 cohort, for example.

It also shows that many young people start their post-16 phase on courses below the levels of learning they have already achieved and that learners with similar attainment at 16 enter the post-16 phase at different levels in different places, partly due to local differences in the mix of provision and institutional practices. This has potential repercussions for the achievement of Level 2 and Level 3 qualifications between 16 and 18/19.

Making the problems and solutions more visible

All this points to a complex and locally variable picture that needs to be better understood.  But achieving clarity and understanding is very difficult due to the way attainment is measured and administrative data is collected, organised and made accessible.

Published statistics do not make the achievements and trajectories of lower attaining young people very visible, probably because much of the policy focus to date has been on raising KS4 attainment at the standard benchmarks. Coverage of lower-level qualifications (and of spatial variations) still lags behind.

And beyond the published statistics, there are major problems with the capacity for detailed analysis of the underlying data.

One issue is the data itself. Currently, we have two different large-scale administrative datasets for the post-16 phase, the NPD and ILR, with different definitions, variables and standards of documentation, and covering different learners. Getting access to these involves a lengthy and difficult application procedure, and working with the data to summarise what learners are doing and achieving is a painstaking process. Looking at academic routes is easier than tracking routes through vocational courses and apprenticeships, because matching NPD (Key Stage 4) to NPD (Key Stage 5) is easier than matching NPD to ILR. It is easier to look at outcomes than it is to understand progress and what learners are actually doing. So analysis often focuses on qualifications achieved, as the data is collected in this way.

We tried a different approach. We developed a measure of a learner’s ‘main level of learning’, the level on which they were spending most of their guided learning hours, and thus were able to illuminate progression (or not) from levels already achieved. If the data sources were easier to access and use, much more could be done to analyse and explain course changes and progression between 16 and 19, and to understand what constitutes success and progress.
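The 'main level of learning' measure can be sketched in a few lines: assign each learner the level absorbing most of their guided learning hours. The data and field names below are invented for illustration, not the NPD/ILR schema:

```python
from collections import defaultdict

# Invented enrolment records: (learner_id, course_level, guided_learning_hours)
enrolments = [
    ("a", 1, 120), ("a", 2, 360),   # learner a: mostly Level 2
    ("b", 2, 450),                  # learner b: all Level 2
    ("c", 1, 300), ("c", 2, 100),   # learner c: mostly Level 1
]

def main_level(enrolments):
    """For each learner, the level absorbing most guided learning hours."""
    hours = defaultdict(lambda: defaultdict(int))
    for learner, level, glh in enrolments:
        hours[learner][level] += glh
    return {learner: max(levels, key=levels.get)
            for learner, levels in hours.items()}

print(main_level(enrolments))  # {'a': 2, 'b': 2, 'c': 1}
```

Comparing each learner's main level against the level they had already achieved at 16 then reveals whether they are progressing, repeating, or dropping back.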

At a local level, basic information on the system in terms of the nature of provision at any given time as well as associated entry requirements is not routinely collected. To shed light on these issues, we had to collect and aggregate this information from provider and national agency websites, a labour-intensive task. The lack of available data leaves policy-makers unsighted as to what is on offer, who is missing out, and which gaps need to be plugged.

The other issue is analytic capacity. Even with better data, there is a paucity of academics with interests and expertise in further education and training compared with the numbers working on school and higher education research. We also need more research teams who can combine quantitative and qualitative methods to investigate the relationship between the pre- and post-16 phases. Changing this will require not just funding for projects and centres but investment in early-career scholarship, and attention to status issues and links to teaching. There are also insufficient links between people who have the skills for data analysis and practitioners who understand how the system works on the ground. Cuts to local authority funding have further diminished local capacity and intelligence.

Thus, if the characteristics and trajectories of lower attainers at GCSE are to be better understood on an ongoing basis, three substantial changes will need to be made:

  • Routine reporting of sub-benchmark achievement in more detail, and at relevant subnational scales.
  • Improvement in data infrastructure and access.
  • Increase in research and analysis capacity, both in local government and in universities and research institutes, and better links between them.

These will not be cheap.  But if the government is serious about eroding the long-standing inequalities in post-16 progression, it simply must invest in making the situation more visible.

The research reported here was funded by the Nuffield Foundation, but the views expressed are those of the authors and not necessarily the Foundation. Visit www.nuffieldfoundation.org

How is school accountability linked to teacher stress?

IOE Editor, 18 March 2021

England’s school system has a high level of accountability – and a high level of accountability-related stress.

By John Jerrim

This blog post reports findings from Nuffield Foundation-funded research conducted into teacher health and wellbeing.

It is no secret that many in education dislike certain aspects of England’s school accountability system. Indeed, accountability is often blamed for causing high levels of stress among the teacher workforce.

Yet we know surprisingly little about the link between accountability and teacher wellbeing.

This blogpost – based upon a new research paper I am publishing with colleagues today – looks at international evidence on this issue from TALIS 2018. (TALIS is the OECD’s Teaching and Learning International Survey.)

Do high-accountability school systems have teachers who are more stressed about this aspect of their job?

As part of TALIS, teachers were asked how much stress was caused by different aspects of their job. This included “being held responsible for pupil achievement” – i.e. accountability.

In another international survey, PISA, headteachers were asked various questions about accountability, such as how school assessment data is used, whether school examination results are made publicly available (e.g. school league tables) and if there is a school inspectorate (e.g. Ofsted).

Using this data, we have created a “school accountability” scale, capturing the extent of school accountability systems used across the world. Countries receive a score between -1 and +1, where a higher number corresponds to more accountability measures.

In the chart below, the extent of accountability in the school system is plotted along the horizontal axis and the percentage of teachers who feel stressed about accountability on the vertical axis.

There are two key points of note.

First, England sits towards the top right-hand corner: we have lots of accountability in our school system, and also a lot of accountability-driven stress among teachers. (68% of teachers in England report feeling accountability-related stress, compared with a cross-country average of around 45%.)

Second, there is a positive cross-national correlation, though this is relatively weak (the correlation coefficient is around 0.3). In other words, teachers do tend to be more stressed about accountability in countries where there is more accountability within the school system. Yet this relationship is not that strong – and certainly not deterministic.
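A cross-country correlation of this kind is just a Pearson correlation between two country-level series: the accountability score and the percentage of teachers reporting accountability-related stress. A minimal sketch, using made-up illustrative numbers rather than the actual TALIS/PISA figures:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical country-level data: accountability score (-1 to +1)
# and % of teachers reporting accountability-related stress.
accountability = [0.8, 0.6, 0.1, -0.4, -0.7]
pct_stressed = [68, 50, 48, 40, 35]
print(round(pearson_r(accountability, pct_stressed), 2))
```

A coefficient of around 0.3, as reported in the paper, indicates that countries with more accountability tend to have more stressed teachers, but with plenty of exceptions either side of the trend line.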

For instance, there are countries with systems of school accountability similar in extent to England’s – most notably New Zealand and the United States – where teachers are a lot less likely to be stressed by this part of their job.

Now, as I have written before, results from such cross-national analyses need to be treated very carefully. The chart above should be treated as a conversation starter, rather than being used as ‘proof’ of anything more.

It does nevertheless raise important questions about the pros and cons of England’s current system of school accountability. In particular, do we have the right balance between quality assurance of schools and ensuring that this does not stress teaching staff out?

How is accountability-induced stress among teachers linked to the stress felt by headteachers?

Within our paper, we also consider how stress induced by accountability is shared among staff within the same school.

For instance, do teachers feel more stressed about accountability when their boss – the headteacher – feels stressed about this part of the job as well?

As the chart below, which relates to all TALIS countries, indicates, the answer is to some extent ‘yes’. Specifically, in comparison to teachers whose head does not feel stressed by accountability at all, teachers are around seven percentage points more likely to feel stressed by accountability if their headteacher says they feel ‘a lot’ of stress about this part of their job as well. To put this figure into context, on average across countries, approximately 45% of teachers say that they feel stressed by accountability.

So, there is indeed a relationship. But the difference is not particularly strong.

Accountability-induced stress is – to some extent – concentrated within particular schools.

Our analysis has also considered whether teachers are more likely to feel stressed about accountability if their colleagues (i.e. other teachers within their school) feel stressed by accountability as well.

Here, we found strong evidence of a positive relationship. For instance, again looking across all TALIS countries, a teacher is twice as likely to say that they feel stressed by accountability if their colleagues also feel stressed by this part of their job.

In other words, there are some schools where the stress caused by accountability is a particularly big problem that needs to be addressed.

We still need to know much more

So, England is a high-accountability, high-accountability-stress country. We know there is a modest link between the stress of headteachers and the stress of their staff. And, to some extent, the problem of accountability-induced stress is clustered among teachers working within specific schools.

Yet, for all the talk about how school league tables and Ofsted inspections negatively affect teachers, we still know relatively little about the pros and cons of England’s extensive system of school accountability.

With the recent pause in many aspects of the school accountability system in England due to the Covid-19 crisis, now could be the ideal time for policymakers to take a moment and consider whether we have the right quality assurance mechanisms in place within our schools.

The project has been funded by the Nuffield Foundation, but the views expressed are those of the authors and not necessarily the Foundation. Visit www.nuffieldfoundation.org.