
Centre for Education Policy and Equalising Opportunities (CEPEO)


We create research to improve the education system and equalise opportunities for all.


Archive for the 'Tertiary education' Category

A-levelling up: the thorny path back from teacher assessed grades

Blog Editor · 12 August 2021

By Jake Anders, Claire Crawford, and Gill Wyness
This piece first appeared on theGuardian.com.

This week’s GCSE and A level results confirmed the expectations of many who study education policy: the proportion of students achieving top grades in these qualifications has increased substantially compared to 2019, especially at A level. Students themselves should be extremely proud of their results, which were achieved under very difficult circumstances. Likewise, teachers have worked extremely hard to make the best assessment they can of their pupils’ performance. But there is no getting around the fact that these results are different from – and not directly comparable with – pre-Covid results.

It is right to allow for the fact that students taking GCSEs and A levels this year and last are at a disadvantage compared to previous cohorts. In-person exams would have been next to impossible in 2020, and those assessed this year have missed significant amounts of schooling.

To deal with this, the government chose an entirely different means of measuring performance: teacher assessments. (We advocated a different approach, based on more flexible exams, in 2021.) This year’s approach has been rather more orderly than last year’s chaos, but the wide range of measures that teachers could consider – such as mock exams, in-class tests and coursework – inevitably led to variation in how schools assessed their pupils.

This year’s grades may also be capturing average or ‘best’ performance across a range of pieces of work, rather than a snapshot from one or two exams. This seems to have been particularly true at A level, where grades have immediate consequences for university entry decisions. In short, it is unsurprising that grades based on teacher assessment are higher than those based on exams alone: while some have called this grade inflation we think it’s more accurate to say that they are capturing different information.

Figure: A level grade distribution in 2019, 2020 and 2021

But given that they have been presented on the same scale, the stark increase in grades compared with pre-Covid times presents significant challenges for current and future cohorts.

Even making comparisons between pupils within the 2021 cohort may be challenging. Using teacher assessment is likely to have disadvantaged some students relative to others. Previous research has shown that Black Caribbean pupils are more likely than white pupils to receive a grade from their teacher below their score in an externally marked test taken at the same time. Similarly, girls have also been found to perform better at coursework, while boys do better at exams on average. Differences by gender have been particularly apparent this year, with girls seeing larger improvements in performance than boys compared to pre-pandemic.

This year’s record high scores raise challenging questions. The much larger proportion of pupils getting As and A*s at A level, for example, may lead to universities relying more heavily on alternative methods of distinguishing between applicants – such as personal statements – which have been shown to entrench (dis)advantage.

There is also the all-important question of what to do next year: are this year’s grade distributions the right starting point, or should we be looking to return to something closer to the 2019 distribution? Is it possible to go back? And would we want to?

Assuming in-person exams are feasible next year, one possibility would be to return to 2019’s system as if nothing had happened. This would probably see substantial reductions in the proportion of students getting top grades, especially at A level. One can only imagine the political challenge of trying to do this.

Even more important is that the next cohorts of GCSE and A level students (and indeed the ones that follow – we are tracking the experiences of those taking GCSEs this year as part of a new UKRI-funded cohort study, COSMO) have also been affected by the pandemic, arguably to a greater degree than this year’s. They are therefore likely to underperform their potential and get lower grades than cohorts who took their exams before the pandemic struck. That is clearly not desirable.

It is important to continue making allowances for the exceptional circumstances young people have faced during this crucial time in their education. During the period affected by pandemic learning loss, our suggestion would be to design exams with more flexibility, allowing candidates to choose which questions to answer based on their strengths, as is common in university exams. This would enable a return to the fairest way to assess students – exams – while still taking account of lost learning.

Either way, any return to exam-based grades is likely to result in an immediate pronounced drop in results compared to the last two years, especially at A level. Gavin Williamson has suggested that the government will aim instead for a “glide path back to a more normal state of affairs”. This would smooth out the unfairness of sharp discontinuities between cohorts. But it would mean moving away from grades being based on the same standard over time, instead setting quotas of students allowed to achieve each grade, gradually reducing the higher grades and increasing the lower ones. Even if that seems a good plan now, it would be very hard to stick to: the fall-out from the small reduction in pass rates seen in Scotland this week would be a taste of things to come for years.

A more radical possibility would be to reset the grading system entirely. This would get around the political problem of very large, or deliberately small, falls in grades for future cohorts, but one wonders whether this is the right time to undertake such a drastic overhaul. The pandemic will have repercussions on young people’s grades for years to come: is the best approach really a total reset right now?

The question of what to do next is one that policymakers will have to grapple with over the coming months and years. Of more fundamental importance and urgency, however, is that pupils have experienced widespread learning losses due to the pandemic – regardless of what their grades show – and are likely to be affected by these for years. Students require ongoing support throughout the rest of their educational careers, including catch-up support throughout school, college and university.

We cannot simply award them GCSE and A level grades that try to look past the learning they have lost and move on – the learning loss remains and must be addressed.

Dr Gill Wyness & Dr Jake Anders are deputy directors of the UCL Centre for Education Policy & Equalising Opportunities (CEPEO). Dr Claire Crawford is an associate professor at CEPEO.

The ‘graduate parent’ advantage in teacher assessed grades

IOE Editor · 8 June 2021

By Jake Anders, Lindsey Macmillan, Patrick Sturgis, and Gill Wyness

Following a disastrous attempt to assign pupil grades using a controversial algorithm, last year’s GCSE and A level grades were eventually determined, after a public outcry, using Centre Assessed Grades (CAGs). Now, new evidence from a survey carried out by the UCL Centre for Education Policy and Equalising Opportunities (CEPEO) and the London School of Economics finds that some pupils appear to have gained an unfair advantage from this approach – particularly pupils with graduate parents. As teachers will again be deciding exam grades this year, this finding serves as an important warning of the challenges involved in ensuring that a system using teacher assessments is fair.

The decision to cancel formal exams in 2020 was taken at a late stage in the school year, meaning that there was little time for the government to develop a robust approach to assessment. After a short consultation, the Department for Education (DfE) decided that pupils’ exam grades would be determined by teachers’ assessments of the grades their pupils would have achieved, together with a rank ordering of pupils. However, to prevent grade inflation from teachers over-predicting their pupils’ performance, Ofqual then applied an algorithm to the rankings to calculate final grades, based on each school’s historical results.

A level pupils received their calculated grades on results day 2020, and although Ofqual reporting showed that the calculated grades were slightly higher than in 2019 across the grade range, many pupils were devastated to find their teacher assessed grades had been lowered by the algorithm. More than a third of pupils received lower calculated grades than their original teacher assessed grades. Following widespread public outcry, the calculated grades were abandoned, and pupils were awarded the grades initially assessed by teachers. This inevitably led to significant grade inflation compared to previous cohorts.

This also created a unique situation where pupils received two sets of grades for their A levels – the calculated grades from the algorithm and the teacher allocated “centre assessed grades” or “CAGs”.

While it is now well established that CAGs were, on average, higher than the algorithm-calculated grades, less is known about the disparities between the two sets of grades for pupils from different backgrounds. Understanding these differences is important since it sheds light on whether some pupils received a larger boost from the move to teacher predicted CAGs, and hence to their future education and employment prospects. It is also, of course, relevant to this year’s grading process, as grades will again be allocated by teachers.

Administrative data on the differences between calculated grades and CAGs is not currently publicly available. However, findings from a new UKRI-funded survey of young people by the UCL Centre for Education Policy and Equalising Opportunities (CEPEO) and the London School of Economics (LSE) can help us to understand the issue. The survey provides representative data on over 4,000 young people in England aged between 13 and 20, with interviews carried out online between November 2020 and January 2021.

Respondents affected by the A level exam cancellations (300 respondents) were asked whether their CAGs were higher or lower than their calculated grades. The resulting data reveal stark differences in the extent to which pupils were given a boost by the decision to revert to CAGs. As shown in Figure 1, pupils with graduate parents were 17 percentage points more likely to report that their CAGs were higher than their Ofqual calculated grades. The survey data are linked to administrative data on prior attainment at Key Stages 2 and 4, as well as demographic and background characteristics such as free school meals status, ethnicity, SEN and English as an additional language. Even after accounting for differences between pupils across these characteristics, those with graduate parents were still 15 percentage points more likely to report having higher CAGs than calculated grades.

Figure 1. The proportion of young people reporting their CAGs were better than their calculated grades by whether or not they report that one of their parents has a university degree (left panel: raw difference; right panel: adjusted for demographic characteristics and prior attainment)
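To make the distinction between the “raw” and “adjusted” gaps concrete, here is a hedged sketch of the kind of adjustment described above: a linear probability model of reporting higher CAGs on graduate-parent status, estimated with and without a control. The data, variable names and effect sizes below are entirely synthetic; the survey’s actual variables and modelling choices may well differ.

```python
# Synthetic illustration only - not the CEPEO/LSE survey's actual data or model.
import numpy as np

rng = np.random.default_rng(0)
n = 300                                    # roughly the size of the affected subsample
grad_parent = rng.integers(0, 2, n)        # 1 = at least one graduate parent (hypothetical)
prior_attainment = rng.normal(0, 1, n) + 0.4 * grad_parent  # a correlated control

# Synthetic outcome: probability of reporting higher CAGs rises with both variables
p = 0.45 + 0.15 * grad_parent + 0.05 * prior_attainment
higher_cags = (rng.random(n) < np.clip(p, 0, 1)).astype(float)

def ols_coef(y, *covariates):
    """Least-squares coefficient on the first covariate, with the rest as controls."""
    X = np.column_stack([np.ones(len(y))] + list(covariates))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

raw_gap = ols_coef(higher_cags, grad_parent)                       # unadjusted gap
adjusted_gap = ols_coef(higher_cags, grad_parent, prior_attainment)  # with control
print(f"raw gap: {raw_gap:.2f}, adjusted gap: {adjusted_gap:.2f}")
```

The design point is the same one the paragraph makes: if part of the graduate-parent gap runs through characteristics like prior attainment, the adjusted coefficient shrinks relative to the raw one, but a substantial gap can remain.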

 

There are a number of possible explanations for these differences. First, it could be that pupils with graduate parents are more likely to attend particular types of schools which have a greater tendency to ‘over-assess’ grades. While not directly relevant to this sample, an extreme version of this is seen in the documented cases of independent schools deliberately over-assessing their pupils; the same thing could also happen in less dramatic and more unconscious ways. It could, for example, be more likely among schools that are used to predicting grades as part of the process for pupils applying to highly competitive university courses, where over-prediction may help more than it hurts.

A second possibility is that graduate parents are more likely to lobby their child’s school to ensure they receive favourable assessments. Such practices are reportedly becoming more common this year, with reports of “pointy elbowed” parents in affluent areas emailing teachers to attempt to influence their children’s GCSE and A level grades ahead of teacher assessed grades replacing exams this summer.

A third possibility is that the relatively high assessments enjoyed by those with graduate parents are a result of unconscious bias by teachers. A recent review by Ofqual found evidence of teacher biases in assessment, particularly against pupils with SEN and those from disadvantaged backgrounds, while a new study from Russia showed that teachers gave higher grades to pupils with more agreeable personalities. Interestingly, we found no differences between FSM and non-FSM pupils, perhaps suggesting teachers were careful not to treat FSM pupils differently. But they may nonetheless exhibit an unconscious positive bias towards pupils from backgrounds that tend to be associated with higher educational achievement.

Our results do not afford any leverage on which of these explanations, if any, is correct. Regardless of what is behind this systematic difference, our findings show that pupils with more educated parents received an unfair advantage in their A level results last year, with potential repercussions for equality and social mobility. They also highlight that this is a substantial risk for this year’s process – perhaps even more so without the expectation of algorithmic moderation: grading pupils fairly in the absence of externally set and marked assessments sets teachers an almost impossible task.

The working paper ‘Inequalities in young peoples’ education experiences and wellbeing during the Covid-19 pandemic’ is available here.

Learn more about our project on the impact of the pandemic on young people here.

Notes
The UKRI Covid-19 funded UCL CEPEO / LSE survey records information from a sample of 4,255 respondents, a subset of the 6,409 respondents who consented to recontact as part of the Wellcome Trust Science Education Tracker (SET) 2019 survey. The SET study was commissioned by Wellcome with additional funding from the Department for Education (DfE), UKRI, and the Royal Society. The original sample was a random sample of state school pupils in England, drawn from the National Pupil Database (NPD) and Individualised Learner Record (ILR). To correct for potentially systematic patterns of respondent attrition, non-response weights were calculated and applied to all analyses, aligning the sample profile with that of the original survey and the profile of young people in England.
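The non-response weighting described above can be sketched, under simplifying assumptions, as post-stratification: each respondent is up-weighted by the ratio of their group’s share in the target population to that group’s share among respondents. The groups and shares below are hypothetical, not those used in the survey.

```python
# Hypothetical post-stratification-style non-response weights.
# Target profile (e.g. from the original sampling frame):
population_share = {"FSM": 0.20, "non-FSM": 0.80}

# Observed respondents after attrition: FSM pupils are under-represented
respondents = ["FSM"] * 30 + ["non-FSM"] * 170   # sample of 200

sample_share = {g: respondents.count(g) / len(respondents) for g in population_share}
weights = {g: population_share[g] / sample_share[g] for g in population_share}

# Under-represented FSM respondents get weight > 1; over-represented get < 1
print(weights)
```

Applying these weights in any analysis restores the group shares of the target population, which is what “aligning the sample profile” means here; the real survey will have used many more characteristics than this single binary one.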

This work is funded as part of the UKRI Covid-19 project ES/V013017/1 “Assessing the impact of Covid-19 on young peoples’ learning, motivation, wellbeing, and aspirations using a representative probability panel”.

This work was produced using statistical data hosted by ONS. The use of the ONS statistical data in this work does not imply the endorsement of the ONS in relation to the interpretation or analysis of the statistical data. This work uses research datasets which may not exactly reproduce National Statistics aggregates.

There can be no “levelling up” without education recovery

IOE Editor · 3 June 2021

This blog post first appeared on the University of Bristol Economics blog.

Simon Burgess, June 2021

Yesterday saw the resignation of Sir Kevan Collins, who was leading the Government’s Education Recovery Programme. The pandemic has hit young people very hard, causing significant learning losses and reduced mental health; the Recovery Programme is intended to rectify these harms and to repair the damage to pupils’ futures. His resignation letter labelled the Government’s proposal inadequate: “I do not believe that it is credible that a successful recovery can be achieved with a programme of support of this size.”

The rejection of this programme, and the offer of a funding package barely a tenth of what is needed, is hard to understand. It is certainly not efficient: the cost of not rectifying the lost learning is vastly greater than the £15 billion cost (discussed below). And it is manifestly unfair, for example when compared to the enormous expense incurred to look after older people like me. The vaccination programme is a colossal and brilliant public undertaking; we need something similar to protect the futures of young people. We have also seen educational inequality widen dramatically across social groups: children from poorer families have fallen yet further behind. If we do not have a properly funded educational recovery programme, any talk of “levelling up” is just noise.

Context – Education recovery after learning loss

An education recovery plan is urgently needed because of all the learning lost during school closures. For the first few months of the pandemic and the first round of school closures, we were restricted to just estimating the learning loss. Once pupils started back at school in September, data began to be collected from online assessment providers to actually measure the learning loss. The Education Endowment Foundation is very usefully collating these findings as they come in. The consensus is that the average loss of learning is around 2-3 months, with the most recent results the most worrying. Within that average, the loss is much greater for students from disadvantaged backgrounds, and the loss is greater for younger pupils. To give only the most recent example, the latest data shows that schools with high fractions of disadvantaged kids saw falls in test scores twice as severe as those in low-poverty schools, and that Year 1 and Year 2 pupils experienced much larger falls in attainment. The “recovery” spending that the Government proposes for precisely these pupils is next to nothing, as Sir Kevan Collins notes in his Times article today: “The average primary school will directly receive just £6,000 per year, equivalent to £22 per child”.

The Government’s proposals amount to roughly £1 billion for more small-group tutoring and around £500m for teacher development and training. I am strongly in favour of small-group tutoring, but the issue is the scale: this is nowhere near enough. It is widely reported that Sir Kevan Collins’ estimate of what was required was £15 billion, based on a full analysis of the lost learning and the mental health and wellbeing deficits that both need urgent attention. For comparison, EPI helpfully provide these numbers on education recovery spending: the figure for England is equivalent to around £310 per pupil over three years, compared to £1,600 per pupil in the US, and £2,500 per pupil in the Netherlands.

Why might the programme have been rejected? Here are some arguments:

“It’s a lot of money”

It really isn’t. An investment of £15bn is dwarfed by the cost of not investing. Time in school increases a child’s cognitive ability, and prolonged periods of missed school have consequences for skill growth. We now know that a country’s level of skills has a strong (causal) effect on its economic growth rate. This is a very, very large-scale problem: all of the 13 cohorts of pupils currently in school have lost skills because of school closures. So from the mid-2030s, all workers in their 20s will have significantly lower skills than they would otherwise have. And for the 40 years following that, between a quarter and a third of the entire workforce will have lower skills. Lost learning means lower skills, lower economic growth and lower tax revenues. Hanushek and Woessmann, two highly distinguished economists, compute this value for a range of OECD countries. For the UK, assuming that the average amount of lost learning is about half a year, their results project the present discounted value of all the lost economic growth at roughly £2,150 billion (£2.15 trillion). Almost any policy that mitigates such a loss will be worthwhile.
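The mechanics of a “present discounted value” figure like this are simple: sum a stream of future GDP shortfalls, each discounted back to today. The sketch below shows only those mechanics; the parameter values are hypothetical and are not Hanushek and Woessmann’s actual model or inputs.

```python
# Illustrative only: PV of a stream of annual GDP shortfalls (all numbers hypothetical).
def present_value_of_losses(annual_loss, growth, discount, years):
    """Sum discounted annual GDP shortfalls over a horizon.

    annual_loss: first-year GDP shortfall (in GBP billions)
    growth:      rate at which the economy (and hence the loss) grows
    discount:    discount rate applied to future losses
    years:       horizon over which losses accrue
    """
    total = 0.0
    for t in range(1, years + 1):
        loss_t = annual_loss * (1 + growth) ** (t - 1)  # loss scales with the economy
        total += loss_t / (1 + discount) ** t           # discount back to today
    return total

# Hypothetical inputs: a GBP 25bn annual shortfall, 1.5% growth, 3% discounting,
# accruing over the ~80 years the affected cohorts remain in the workforce.
pv = present_value_of_losses(annual_loss=25, growth=0.015, discount=0.03, years=80)
print(f"roughly GBP {pv:,.0f}bn")
```

The headline number is therefore very sensitive to the assumed annual loss and the gap between the growth and discount rates, which is why such estimates are usually reported as a range.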

“Kids are resilient and the lost learning will sort itself out”

This is simply wishful thinking. We should not be betting the futures of 7 million children on this basis. Economists estimate the way that skills are formed, and one key attribute of this process can be summarised as “skills beget skills”. One of the first statements of this came from Heckman and co-authors, and more recent research has confirmed it, including studies using genetic data. This implies that if the level of skills has fallen, then the future growth rate of skills will also be lower, assuming nothing else is done. It is also widely shown that early investments are particularly productive. Given these findings, we would expect pupils suffering significant learning losses to fall further behind rather than catch up. Sir Kevan Collins makes exactly this point in his resignation letter: “learning losses that are not addressed quickly are likely to compound”.

Perhaps catch-up can be achieved by pupils and parents working a bit harder at home? There is now abundant evidence from many countries including the UK that learning at home is only effective for some, typically more advantaged, families. For other families, it is not for want of trying or caring, but their lack of time, resources, skills and space makes it very difficult. The time for home learning to make up the lost learning was March 2020 through March 2021; if it was only patchily effective then, it will be less effective from now on.

“There’s no evidence to support these interventions”

This is simply not true, as I set out when recommending small-group tutoring last summer. There is abundant evidence that small-group tutoring is very effective in raising attainment. There is also strong evidence that lengthening the school day is also effective.

Conclusion

This blog is less scientifically cold and aloof than most that I write. I struggle to make sense of the government’s proposals to provide such a half-hearted, watered-down recovery programme, and to value so lightly the permanent scar on pupils’ futures. The skills and learning of young people will not magically recover by themselves; the multiple blows to mental health and wellbeing will not heal if ignored. The Government’s proposal appears to have largely abandoned them. To leave the final words to Sir Kevan Collins: “I am concerned that the package announced today betrays an undervaluation of the importance of education, for individuals and as a driver of a more prosperous and healthy society.”

The challenges of COVID-19 for young people need a new cohort study: introducing COSMO 

IOE Editor · 23 April 2021

Jake Anders and Carl Cullinane 

The COVID-19 pandemic and its impact is a generation-defining challenge. One of its most concerning aspects, particularly in the long term, is the already profound effect it has had on young people’s lives. Disruption to their development, wellbeing and education could have substantial, long-lasting effects on later life chances, particularly for those from lower-income homes. Evidence is already showing disadvantaged pupils lagging 5 months behind their peers. This poses a unique challenge for educational policy and practice, with the scale of the disruption requiring solutions to match.

In order to address these impacts, it is vital that we fully understand them, and in particular the disproportionate burden falling on certain groups, including those from lower socio-economic backgrounds and minority ethnic groups. This needs high-quality data. Recovering from the effects of the past 12 months will be a long-term project, and to reflect this we need research of similar ambition.

The COVID Social Mobility and Opportunity Study (COSMO for short), launched today, seeks to play this role, harnessing the power of longitudinal research to capture the experiences of a cohort of young people for whom the pandemic has had an acute impact, and its effects on their educational and career trajectories. 

This country has a grand tradition of cohort studies, including the pioneering 1958 National Child Development Study and the 1970 British Cohort Study. Such studies are a key tool in understanding life trajectories and the complex factors that shape them. And they are particularly vital when it comes to measuring the impact of events that are likely to last through someone’s life course. The existing longitudinal studies, including those run by our colleagues in the UCL Centre for Longitudinal Studies, have played a huge role in understanding the impacts of the pandemic on society in the last year.

But there is a key gap in the current portfolio of cohort studies: the generation of young people at the sharp end of their school education, who would have taken GCSEs this summer, and who within a matter of months will be moving on to new pathways at sixth form, further education, traineeships and apprenticeships. The impacts on this group are likely to be profound and long-lasting, and understanding the complex elements that have aggravated or mitigated these impacts is crucial.

A variety of studies have already collected some such data, providing emerging evidence of inequalities in pupils’ outcomes and experiences of remote schooling. This has highlighted alarming challenges for pupils’ learning and wellbeing. However, to develop a full understanding we require the combination of rich, representative, survey data on topics such as learning loss experiences, wellbeing, and aspirations, linked with administrative data on educational outcomes, and concurrent interventions. We also need to follow up those young people over the next few years as they pass through key stages of education and their early career, to understand what has happened next, ideally long into their working lives. 

Such evidence will be key in shaping policies that can help to alleviate the long-term impacts on young people. Which groups have suffered most, and how? How long will these impacts persist? And how can we reduce their effects? These will be fundamental questions for national policymakers, education providers, employers and third sector organisations in the coming years, both in the UK and internationally.

That’s why we’re extremely excited to be launching COSMO with funding from UK Research and Innovation (UKRI)’s Ideas to Address COVID-19 response fund. Our study will deliver exactly that data over the coming years, helping to inform the future policy interventions that will be required, given that the huge effects of the pandemic are only just beginning. As the British Academy pointed out on the anniversary of the first COVID lockdown, this is not going to go away quickly.

Beginning this autumn, the study will recruit a representative sample of 12,000 current Year 11 pupils across England, with sample boosts for disadvantaged and ethnic minority groups plus targeting of other hard-to-reach groups. Young person, parent, and school questionnaires – enhanced with administrative data from the DfE – will collect rich data on young people’s experiences of education and wellbeing during the past challenging 12 months, along with information on their transitions into post-16 pathways via this summer’s unusual GCSE assessment process.

The study is a collaboration between the UCL Centre for Education Policy & Equalising Opportunities (CEPEO), the UCL Centre for Longitudinal Studies (CLS) and the Sutton Trust. The study will harness CEPEO’s cutting-edge research focused on equalising opportunities across the life course, seeking to improve education policy and wider practices to achieve this goal. The Sutton Trust also brings 25 years of experience using research to inform the public and achieve policy change in the area of social mobility.  

COSMO will also be part of the family of cohort studies housed in the UCL Centre for Longitudinal Studies, whose expertise in life course research is world-renowned. We are also working closely with Kantar Public, who will lead on delivering the fieldwork for this large study, alongside NatCen Social Research. More broadly still, all our work will be co-produced with project stakeholders including the Department for Education and the Office for Students. We are also working with partners in Scotland and Wales to maximise comparability across the nations.

We are excited for COSMO to make a big contribution both to the landscape of educational research and to the post-pandemic policy environment, and we are delighted to be getting to work delivering on this promise over the coming years. 

We won’t reduce inequalities in post-16 progression until we make ‘lower attainers’ more visible

IOE Editor · 29 March 2021

By Ruth Lupton, Stephanie Thomson, Lorna Unwin and Sanne Velthuis

Inequalities in post-16 progression

The continued use of GCSEs as a blunt instrument for dividing pre- and post-16 education is one of the main causes of inequality in the English system, with impacts extending well into adulthood. The system asks the least confident, least academically successful young people, often (but not always) facing greater social and economic disadvantages, to make the most complex, life-shaping choices at the youngest age. Contemporaries with high academic attainment can progress more straightforwardly in a simpler, better understood, and historically better-funded system, often postponing decisions about occupational directions until age 18, 19 or later.

In our new research, funded by the Nuffield Foundation, we investigated the post-16 trajectories of young people who we describe as ‘lower attainers’ – the 40% of each GCSE cohort who each year do not achieve a grade 4 (formerly C) in both English and maths. We presented our findings at a recent CEPEO webinar.

Our research employed a mixed-methods approach combining analysis of data from the National Pupil Database (NPD) and Individualised Learner Record (ILR), collection and analysis of local data about course and apprenticeship opportunities and entry requirements, and interviews and focus groups.

It shows how, in making the transition to the post-16 phase and attempting to progress beyond GCSEs, ‘lower attainers’ face multiple barriers including: inconsistent careers information and guidance; restrictive entry requirements that are often based on English and maths GCSEs (even when it is not clear why specific grades are needed); considerable local variation in accessible provision; and the low availability and poor visibility of apprenticeships. Apprenticeships are not the accessible pathway for ‘lower attainers’ that many people imagine, with only 5.8% moving into an apprenticeship at 16 in the 2015 cohort, for example.

It also shows that many young people start their post-16 phase on courses below the levels of learning they have already achieved and that learners with similar attainment at 16 enter the post-16 phase at different levels in different places, partly due to local differences in the mix of provision and institutional practices. This has potential repercussions for the achievement of Level 2 and Level 3 qualifications between 16 and 18/19.

Making the problems and solutions more visible

All this points to a complex and locally variable picture that needs to be better understood.  But achieving clarity and understanding is very difficult due to the way attainment is measured and administrative data is collected, organised and made accessible.

Published statistics do not make the achievements and trajectories of lower attaining young people very visible, probably because much of the policy focus to date has been on raising KS4 attainment at the standard benchmarks. Coverage of lower-level qualifications (and of spatial variations) still lags behind.

And beyond the published statistics, there are major problems with the capacity for detailed analysis of the underlying data.

One issue is the data itself. Currently, we have two different large-scale administrative datasets for the post-16 phase – the NPD and ILR – with different definitions, variables and standards of documentation, and covering different learners. Getting access to these involves a lengthy and difficult application procedure, and working with the data to summarise what learners are doing and achieving is a painstaking process. Looking at academic routes is easier than tracking routes through vocational courses and apprenticeships, because matching NPD (Key Stage 4) to NPD (Key Stage 5) is easier than matching NPD to ILR. It is also easier to look at outcomes than to understand progress and what learners are actually doing, so analysis often focuses on qualifications achieved, as that is how the data is collected. We tried a different approach: we developed a measure of a learner’s ‘main level of learning’ – the level on which they were spending most of their guided learning hours – and were thus able to illuminate progression (or not) from levels already achieved. If the data sources were easier to access and use, much more could be done to analyse and explain course changes and progression between 16 and 19, and to understand what constitutes success and progress.
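The ‘main level of learning’ measure described above can be sketched in a few lines of code. This is an illustrative reconstruction, not the project’s actual code: the input format, field names, and the tie-breaking rule (favouring the higher level) are all assumptions.

```python
from collections import defaultdict

def main_level_of_learning(enrolments):
    """Return the level on which a learner spends most guided learning hours.

    `enrolments` is a list of (level, guided_learning_hours) pairs drawn
    from a learner's course records. These field names are illustrative,
    not the actual NPD/ILR variable names.
    """
    hours_by_level = defaultdict(float)
    for level, hours in enrolments:
        hours_by_level[level] += hours
    # Break ties in favour of the higher level (an assumption of this sketch)
    return max(hours_by_level, key=lambda lvl: (hours_by_level[lvl], lvl))

# Example: a learner mostly on Level 2 courses despite one Level 3 enrolment
records = [(2, 360), (2, 120), (3, 150)]
print(main_level_of_learning(records))  # prints: 2
```

Comparing this derived level with the levels a learner has already achieved at 16 is what allows progression (or regression) to be identified in the linked data.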

At a local level, basic information on the system in terms of the nature of provision at any given time as well as associated entry requirements is not routinely collected. To shed light on these issues, we had to collect and aggregate this information from provider and national agency websites, a labour-intensive task. The lack of available data leaves policy-makers unsighted as to what is on offer, who is missing out, and which gaps need to be plugged.

The other issue is analytic capacity.  Even if there were better data, there is a paucity of academics with interests and expertise in further education and training compared with the numbers working on school and higher education research. And we need more research teams who can combine quantitative and qualitative methods to investigate the relationship between the pre and post-16 phases. Changing this now will require not just funding for projects and centres but investment in early-career scholarship, addressing status issues and links to teaching. And there are insufficient links between people who have the skills for data analysis and practitioners who understand how the system works on the ground. Cuts to local authority funding have further diminished local capacity and intelligence.

Thus, if the characteristics and trajectories of lower attainers at GCSE are to be better understood on an ongoing basis, three substantial changes will need to be made:

  • Routine reporting of sub-benchmark achievement in more detail, and at relevant subnational scales.
  • Improvement in data infrastructure and access.
  • Increase in research and analysis capacity, both in local government and in universities and research institutes, and better links between them.

These will not be cheap.  But if the government is serious about eroding the long-standing inequalities in post-16 progression, it simply must invest in making the situation more visible.

The research reported here was funded by the Nuffield Foundation, but the views expressed are those of the authors and not necessarily those of the Foundation. Visit www.nuffieldfoundation.org

Vaccine hesitancy in children and young adults in England

IOE Editor, 17 March 2021

By Patrick Sturgis, Lindsey Macmillan, Jake Anders, Gill Wyness

Children and young people are, mercifully, at extremely low risk of death or serious illness from the coronavirus and, for this reason, they are likely to be the last demographic in the queue to be vaccinated, if they are vaccinated at all. Yet, there are good reasons to think that a programme of child vaccination against covid-19 will eventually be necessary in order to free ourselves from the grip of the pandemic. In anticipation of this future need, clinical trials assessing the safety and efficacy of existing covid-19 vaccines on young people have recently commenced in the UK.

While children and young people experience much milder symptoms of covid-19 than older adults, there is currently a lack of understanding of the long-term consequences of covid-19 infection across all age groups and there have been indications that some children may be susceptible to potentially severe and dangerous complications. Scientists also believe that immunisation against covid-19 in childhood may confer lifetime protection (£), reducing the need for large-scale population immunisation in the future.

Most importantly, perhaps, vaccination of children may be required to minimise the risk of future outbreaks in the years ahead. If substantial numbers of adults refuse immunisation and the vaccines are, as seems likely, less than 100% effective against infection, vaccination of children will be necessary if we are to achieve ‘herd immunity’.

We now know a great deal about covid-19 vaccine hesitancy in general populations around the world from a large and growing body of survey and polling data and, increasingly, from actual vaccine uptake. Much less is known, however, about vaccine hesitancy amongst children and younger adults. Here, we report preliminary findings from a new UKRI funded survey of young people carried out by Kantar Public for the UCL Centre for Education Policy and Equalising Opportunity (CEPEO) and the London School of Economics. The survey provides high quality, representative data on over 4000 young people in England aged between 13 and 20, with interviews carried out online between November 2020 and January 2021. Methodological details of the survey are provided at the end of this blog.

Respondents were asked, “If a coronavirus vaccine became available and was offered to you, how likely or unlikely would you personally be to get the vaccine?”. While the majority (70%) of young people say they are likely or certain to get the vaccine, this includes 25% who are only ‘fairly’ likely. Worryingly, nearly a third express some degree of vaccine hesitancy, saying either that they definitely won’t get the vaccine (9%) or that they are not likely to do so (22%).

Although there are differences in question wording and response alternatives, this represents a substantially higher level of vaccine hesitancy than a recent Office for National Statistics (ONS) survey of UK adults, which found just 6% expressing vaccine hesitancy, although this rose to 15% amongst 16 to 29 year olds.

Differences in vaccine hesitancy across groups

We found little variation in hesitancy between male and female respondents (32% female and 29% male), or between age groups. However, as can be seen in the chart below, there were substantial differences in vaccine hesitancy between ethnic groups. Black young people are considerably more hesitant about getting the vaccine than other ethnic groups, with nearly two thirds (64%) expressing hesitancy compared to just a quarter (25%) of those who self-identified as White. Young people who identified as mixed race or Asian[1] expressed levels of hesitancy between these extremes, with a third (33%) of mixed race and 39% of Asian young people expressing vaccine hesitancy. This ordering matches the findings for ethnic group differences in the ONS survey, where 44% of Black adults expressed vaccine hesitancy compared to just 8% of White adults.

To explore potential sources of differences in vaccine hesitancy, respondents were asked to state their level of trust in the information provided by a range of different actors in the coronavirus pandemic. The chart below shows wide variability in expressed levels of trust across different sources between ethnic groups, but most notably between Black young people and those from other ethnic groups. Young people self-identifying as Black were considerably less likely to trust information from doctors, scientists, the WHO and politicians, and more likely to trust information from friends and family, than those from other groups. In terms of overall levels, though, doctors, scientists and the WHO are the most trusted sources across all groups. Encouragingly, only 5% of young people say they trust information from social media, a figure which was consistently low across ethnic groups.

We also find evidence of a small social class gradient in vaccine hesitancy, with a quarter (25%) of young people from families with at least one parent with a university degree[2] expressing vaccine hesitancy compared to a third (33%) of young people with no graduate parent.

We can also compare levels of vaccine hesitancy according to how young people scored on a short test of factual knowledge about science. [3]  Vaccine hesitancy was notably higher amongst respondents who were categorised as ‘low’[4] in scientific knowledge (36%) compared to those with ‘average’ (28%), and ‘high’ (22%) scientific knowledge. This suggests that vaccine hesitancy may be related, in part, to the extent to which young people are able to understand the underlying science of viral infection and inoculation and to reject pseudoscientific claims and conspiracy theories.

How much are differences in vaccine hesitancy just picking up underlying variation between ethnic groups in scientific knowledge and broader levels of trust? In the chart below, we compare raw differences in vaccine hesitancy for young people of the same ethnic group, sex, and graduate parent status (blue plots) with differences after taking account of scientific knowledge and levels of trust in different sources of information about coronavirus. The inclusion of these potential drivers of vaccine hesitancy does not account for all of the differences between ethnic and social class groups. While Black young people are around 40 percentage points more likely to express vaccine hesitancy than their White counterparts, this gap is reduced to 33 percentage points when comparing Black and White young people with similar levels of scientific knowledge and (in particular) levels of trust in sources of coronavirus information.
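The comparison of raw and adjusted group gaps described above can be illustrated with a toy linear probability model. All numbers below are fabricated for the sketch and bear no relation to the survey estimates; the point is only to show how controlling for a covariate (here, a binary trust measure) shrinks a raw group gap when the covariate differs between groups.

```python
import numpy as np

# Illustrative synthetic counts (not the survey data). Two groups, a binary
# low-trust indicator, and hesitancy rates that are exactly additive:
#   P(hesitant) = 0.10 + 0.30*low_trust + 0.10*group
def make_cell(group, low_trust, n, n_hesitant):
    g = np.full(n, group)
    t = np.full(n, low_trust)
    y = np.concatenate([np.ones(n_hesitant), np.zeros(n - n_hesitant)])
    return g, t, y

cells = [
    make_cell(0, 0, 80, 8),   # group 0, high trust: 10% hesitant
    make_cell(0, 1, 20, 8),   # group 0, low trust:  40% hesitant
    make_cell(1, 0, 40, 8),   # group 1, high trust: 20% hesitant
    make_cell(1, 1, 60, 30),  # group 1, low trust:  50% hesitant
]
group = np.concatenate([c[0] for c in cells])
low_trust = np.concatenate([c[1] for c in cells])
hesitant = np.concatenate([c[2] for c in cells])

def lpm(y, *regressors):
    """Linear probability model via least squares; returns slope coefficients."""
    X = np.column_stack([np.ones(len(y)), *regressors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

raw_gap = lpm(hesitant, group)[0]                  # 0.38 - 0.16 = 0.22
adjusted_gap = lpm(hesitant, group, low_trust)[0]  # 0.10 after controlling trust
print(round(raw_gap, 2), round(adjusted_gap, 2))   # prints: 0.22 0.1
```

Because group 1 has more low-trust members, part of its raw 22-point gap is ‘explained’ by trust; conditioning on trust leaves a 10-point residual gap, mirroring (in miniature) the 40-to-33 percentage point reduction reported above.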

Our survey shows high levels of vaccine hesitancy amongst young people in England, which should be a cause for concern, given the likely need to vaccinate this group later in the year. We also find substantial differences in hesitancy between ethnic groups, mirroring those found in the adult population, with ethnic minorities – and Black young people in particular – saying they are unlikely or certain not to be vaccinated. These differences seem to be related to the levels of trust young people have in different sources of information about coronavirus, with young Black people more likely to trust information from friends and family and less likely to trust health professionals and politicians.

There are reasons to think that actual vaccine take-up may be higher than these findings suggest. First, Professor Ben Ansell and colleagues have found a decrease in hesitancy amongst adults between October and February, a trend which was also evident in the recent ONS survey. It seems that hesitancy is declining amongst adults as the vaccine programme is successfully rolled out with no signs of adverse effects, and this trend may also emerge amongst young people. Second, given that parental consent will be required for vaccination of under-18s, parental hesitancy may matter as much for take-up as young people’s own attitudes.

There may also have been some uncertainty in our respondents’ minds about what is meant by ‘being offered’ the vaccine, given that there were no vaccines authorised for young people at the time the survey was conducted, and no official timetable for immunisation of this group. Nonetheless, this uncertainty cannot explain the large differences we see across groups, particularly those between White young people and those from ethnic minority groups.

If the vaccine roll out is to be extended to younger age groups in the months ahead, we will face a considerable challenge in tackling these high levels of and disparities in vaccine hesitancy.

 

*Methodology*

The UKRI Covid-19 funded UCL CEPEO / LSE survey records information from a sample of 4,255 respondents, a subset of the 6,409 respondents who consented to recontact as part of the Wellcome Trust Science Education Tracker (SET) 2019 survey. The SET study was commissioned by Wellcome with additional funding from the Department for Education (DfE), UKRI, and the Royal Society. The original sample was a random sample of state school pupils in England, drawn from the National Pupil Database (NPD) and Individualised Learner Record (ILR). To correct for potentially systematic patterns of respondent attrition, non-response weights were calculated and applied to all analyses, aligning the sample profile with that of the original survey and the profile of young people in England. Our final sample consists of 2,873 (76%) White, 208 (6%) Black, 452 (12%) Asian, 196 (5%) Mixed, and 50 (1%) Other ethnic groups.  The Asian group contains respondents who self-identified as Asian British, Indian, Pakistani, Bangladeshi, Chinese or ‘other Asian’.
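The non-response weighting described above can be sketched as a simple cell-weighting calculation: each group's weight is its population share divided by its share of the achieved sample. The sample counts below are those reported in this post; the population shares are invented for illustration, and the real survey will have used a more elaborate calibration across many characteristics, not ethnicity alone.

```python
# Illustrative non-response weighting sketch. Sample counts are from the
# post above; population shares are made-up placeholders for this example.
population_share = {"White": 0.74, "Black": 0.05, "Asian": 0.12,
                    "Mixed": 0.06, "Other": 0.03}
sample_counts = {"White": 2873, "Black": 208, "Asian": 452,
                 "Mixed": 196, "Other": 50}

n = sum(sample_counts.values())
weights = {g: population_share[g] / (sample_counts[g] / n)
           for g in sample_counts}

# Groups under-represented in the sample relative to the population receive
# weights above 1, restoring their share in any weighted analysis.
for g, w in sorted(weights.items()):
    print(f"{g}: {w:.2f}")
```

Applying these weights to every estimate (means, cross-tabulations, regression models) aligns the weighted sample profile with the target population profile, which is what the blog means by "aligning the sample profile with that of the original survey".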

 

[1] Respondents in the Asian category are a combination of Indian, Pakistani, Bangladeshi, Chinese or ‘other Asian’ origin.

[2] We have not yet linked the survey data to the National Pupil Database and Individualised Learner Record, which will enable us to use indicators of eligibility for free school meals and IDACI. Currently we use parent graduate status as a proxy for socio-economic status.

[3] Once the survey is linked to the National Pupil Database we will be able to look across a wider range of measures of school achievement.

[4] There were ten items in the quiz, ‘low’ knowledge equated to a score of 5 or less, ‘average’ knowledge to a score of 6 to 8, and ‘high’ knowledge to a score of 9 or 10. Note that this test was administered in the previous (2019) wave of the survey.

This work is funded as part of the UKRI Covid-19 project ES/V013017/1 “Assessing the impact of Covid-19 on young peoples’ learning, motivation, wellbeing, and aspirations using a representative probability panel”.

Scarring effects of Furlough

IOE Editor, 2 February 2021

By Professor Paul Gregg, University of Bath

The Chancellor’s furlough scheme is a dam holding back a torrent of unemployment. A long history of research has shown that open unemployment has sizeable costs to workers even after they have returned to work – an effect called scarring. But these scarring effects will not hit all workers equally: they will primarily impact the younger generation, because of the important role of work experience in the process.

Furlough vs unemployment

For prime-age and older workers, the main cost of unemployment comes from the dislocation from the existing job. The quality of the replacement job match is lower because a range of experience and knowledge is underused. This can be knowledge specific to the firm, industry or occupation, or the seniority and responsibility of the job role. Long-term unemployment sees a greater loss of this knowledge and experience, as replacement jobs are further from the old position across the domains just listed. Some of this cost of dislocation is recovered by later job moves, but typically not the cost associated with long-term unemployment. Here, then, furlough is totally different from unemployment, as there is no such dislocation. This also represents the economic value of keeping so many hard-hit businesses afloat. It would take a long time for replacement businesses to start up and then grow to be as productive in their use of labour as those that would close without furlough and other supports. These supports are thus limiting the destruction of productive potential that a deep recession creates.

For younger workers, the story of unemployment is less about the lost job, which is generally a lower-paid entry position. Rather, it is the lack of accrual of the crucial work experience which attracts pay rises and allows the job moves and promotions which attract even larger pay rises. A year’s tenure in work generates pay growth for a young person around 5% above that for an older, experienced worker (more after one year in a job than after five or so, and at younger ages), and a job-to-job move generates around 12%. This is how young people progress through the labour market and build careers. Older workers also get pay rises when moving jobs and at short tenures, but they are less common and smaller in magnitude. Here, then, furlough is likely to be very similar in its effects to unemployment: people are drawing a salary but not actually gaining experience or promotion opportunities.

Thus for older workers with extensive tenure in their current post, furlough should not result in the costs of unemployment but it will disrupt the gaining of experience and potentially job moves that are so essential for young people.

The outlook for the younger generation

In a bad recession, youth unemployment and the proportion of young people not in work or education often rise to 25-30%. Upwards of 20% accumulate substantial periods (a year or more) out of work between the ages of 18 and 25. Furloughing of young people has been very common, partly because of the sectors at the heart of the lockdowns but also because of their lower seniority. A million young people were furloughed in early July (the earliest figures I have found giving an age breakdown) – some 20% of the total at that time – whilst just under another million were not in work or education. That is around 30% of all young people either on furlough or out of work and college; among those aged 18-24 it was 35%. By the end of October the number of young people on furlough had fallen to 350,000, but it is no doubt higher again now. The year of Covid is thus likely to have seen 25% of young people not in college missing out on normal work experience, very much in line with a normal recession.

The better news is that young people in similar situations do recover a substantial portion of these wage losses. Graduates in a normal recession do not suffer a lot of unemployment but do take work in lower-status, lower-paying occupations, losing 3% pay growth per year in a suppressed labour market (normally for three years after a recession). But they do see faster earnings growth after a recession ends, recouping about half the losses. This is the likely situation for today’s Covid generation of young people – provided, of course, that the end of furlough is not associated with an explosion of youth unemployment.

The policy response to youth unemployment is a programme like the new Kickstart scheme to provide that missing experience. But whilst a good number of places have been promised by firms, there have been almost no actual starts because of lockdown, and the scheme cannot help until furlough ends. Rather, ensuring a strong recovery from the summer onwards is the only prescription that can limit the damage of young people’s lost work experience through the pandemic.

Exams 2021: So what now? Part 2: CEPEO’s response to the DfE/Ofqual consultation on summer assessment 2020/21

IOE Editor, 21 January 2021

By Jake Anders, Lindsey Macmillan and Gill Wyness

Given the widespread disruption to learning this academic year and the substantial risk of continued disruption to schooling into the summer term, the government were right to take the decision to cancel exams in England in their usual form – indeed, having done so earlier, as many were calling for, would have made a wider range of alternatives feasible to implement. But now that the government, working with Ofqual, have turned to deciding how GCSE and A level grades should be awarded this year, what should they do? In our recent blog we made the case that assessment should be flexible in terms of timing and content, but that it should continue to be externally set and marked, to ensure fairness and rigour.

Unfortunately, the government’s new proposals do not take that message on board and instead take teacher assessment as a given. As a result, their current consultation is framed without allowing for the opportunity to consider this fundamental aspect of the government’s approach. In setting out these plans, Gavin Williamson said that they put their trust in teachers, not algorithms. But this is a false dichotomy, as our proposed approach shows. In addition, as we argued last year, asking teachers to assign grades accurately and fairly is asking them to do a near impossible task – and one that will add considerably to their hugely expanded workloads: fundamentally, trusting teachers can only go so far when it comes to achieving fair and rigorous assessment.

Nevertheless, since the framing gives no alternative, in our response to the consultation, we make suggestions that will minimise the unfairness that this approach will cause. In particular, we highlight that, if teacher assessment must be used, it must take unequal learning loss into account, and it must be subject to a system of external quality assurance.

Dealing with Learning Loss

Assessment is important, not just so that students can continue to the next stage of their education, training or employment, but also to ensure that they continue to engage with schooling for the remainder of the academic year and, hence, minimise the learning loss that will be experienced. We therefore agree that students should be assessed in some manner, and that this should be through papers set by the exam boards and provided to the schools (as is proposed). Both for this reason and wider aims of fairness, it should be compulsory for these to be used by schools as the primary basis of the teacher assessed grades for both GCSEs and A levels. Flexibility in the timing of these assessments will allow this possibility despite the ongoing risks to disruption of schooling.

However, as is widely documented, pupils have had very different experiences of learning this year, so they will be at very different stages when they come to be assessed. For this reason, it is deeply unfair to award pupils grades based solely on the standard at which they are performing when they are assessed (which is proposed to be at some point between May and June 2021). While it is important to push the assessment date to the latest possible point (to allow students maximum time to catch up), it is unlikely that all students will be able to recover their lost learning, and it is inevitable that students will be at different levels when they are assessed, through no fault of their own.

This is fundamentally different from the philosophy that DfE and Ofqual have taken according to the consultation document, which states that students should be assessed at the standard at which they are currently performing. While we agree that it is important that these grades proxy pupils’ potential for that next stage, given the important role they play in the transition to further education and employment, it cannot be fair for pupils whose education has been disrupted the most to be systematically disadvantaged by an approach that ignores this. As such, it is vital that this year’s assessment system take this unequal opportunity to learn into account.

An important aspect of that would be for the papers set by exams boards to have several flexible components. There should be flexibility in the timing, to ensure that all pupils are able to sit them in their educational setting despite the risks of further disruption. The papers themselves should also be flexible, with teachers able to account for differentially disrupted curricula by deciding which topics are covered in the questions that students are asked to answer.

Quality assurance

Given the exam boards will be required to set these exams, the best approach would be also to use their expertise in marking them. As well as being far more rigorous, using the exam boards’ available, paid workforce to do the marking would avoid placing a huge additional burden on teachers’ workloads, as well as avoiding the risks of exposing them to unfair pressure from pupils and parents.

But in the absence of this option, we agree with Ofqual that exam boards should still play an important role in providing assessment guidance and monitoring. We agree with the proposals to involve exam boards in providing support and information to schools and colleges to help them meet the assessment requirements, and to ensure internal quality assurance. Exam boards should also be involved in external quality assurance. At the very least this should include extensive sampling, at subject level, of the evidence on which the submitted grades were based. Judging by last year’s experiences, there is good reason to suggest that independent schools should be a particular focus of external quality assurance activity.

We also argue that the exam boards should be responsible for the appeals processes, rather than schools and teachers being involved in reconsidering the marks they have provided. Again, this distance between candidate and assessor is vital to ensure rigour and fairness in a process that is not susceptible to inappropriate pressure, while also protecting individual teachers and schools from unfair criticism from parents and the media.

Finally, it is crucial that the appeals process take place before universities receive students’ grades. This is critical to avoid the deeply unfair situation of last year, with students apparently missing offers and losing their university place, only to have their grades later overturned.

Making these decisions quickly will provide much needed clarity for schools, pupils and their parents. However, the serious problem of learning loss will remain. Students transitioning to further education or into the labour market will be doing so having received less education than in a normal year. Adjusting grades to take account of this is a necessary short-term solution to avoid embedding unfairness in the transition process, but even more important is a plan to support catch up for all those who have fallen behind, which will be most acute for students from disadvantaged backgrounds. This will require significant commitment and investment. This needs to be recognised immediately to prevent further delay.

Exams 2021: So what now?

IOE Editor, 4 January 2021

By Jake Anders, Lindsey Macmillan, and Gill Wyness

While the uncertainties of a global pandemic make this one of the most volatile periods of education policy in history, if there is one lesson we should all have learned since last March, it is that indecision is costly. This has proven true repeatedly for public health and looks just as relevant for education. As we saw with the exam fiasco of summer 2020, the failure to act decisively led to there being little alternative but to assign students grades based on teachers’ predictions of what they would have achieved. This sub-optimal situation removed any final contribution on the part of the student, and, more importantly, resulted in significant biases across school type and family background. Of course, back in summer 2020, the government had little time for the advance planning that any alternatives (such as ongoing assessment) would have required. But this year, they have no such excuse, and inaction now poses the substantial risk of being left without alternatives again. That is why the government must act now to ensure that we don’t have a repeat performance in summer 2021.

For exams to give all pupils the same chance to succeed, one of the pre-requisites is that they have had the same amount of time to prepare. However, we know from patterns of disruption to their studies that this is not the case. While both exam cohorts (year 11 and year 13) missed up to 5 months’ schooling in the academic year 2019-20, the disruption has continued during this crucial exam year, and in a much less uniform manner. Unfortunately, England does not publish data on attendance rates by year group, but we can look more broadly at attendance rates in all state-funded schools by region over the autumn term. The figure below illustrates that while attendance rates started the academic year between 85% and 95%, by mid-November we were seeing rates substantially below this (falling from 88% to 83% on average), driven by widespread – but regionally varying – self-isolation by both individual pupils and education ‘bubbles’. In mid-November, attendance rates were lowest in the North West and Yorkshire. By mid-December, with what we now understand to be the prevalence of the new variant increasing, London, the East, and the South East had all seen stark declines in their attendance rates. In contrast, the South West has remained near the top of attendance rates throughout.

Figure: Weekly attendance in state-funded schools by region, 10th September 2020 – 10th December 2020.
Source: https://explore-education-statistics.service.gov.uk/find-statistics/attendance-in-education-and-early-years-settings-during-the-coronavirus-covid-19-outbreak

 

This disruption seems likely to get worse still. The Christmas holidays were anything but a break for schools and teachers, with an announcement on setting up in-school testing released shortly before the end of term (here) and the long-awaited announcement on returning to school made by DfE on December 30th. Secondary schools across the country have now moved to remote learning this week. While the majority of primary schools remain open, an increasing proportion (upwards of 15%) will not open their doors to pupils for the foreseeable future, either under direct DfE instruction through the schools contingency framework, or acting unilaterally over fears for teachers’ and students’ health. The DfE currently state that the majority of schools will re-open on January 18th, but with spiralling infection rates and stretched hospital capacities in every region, this position looks increasingly untenable. We await the Prime Minister’s announcement this evening, but many suspect that all schools will be closed for the foreseeable future.

In a blog post from November (here), we laid out the evidence that points to exams being the best route forward for school pupils in 2021, but advocating important changes (particularly focused on allowing greater flexibility),  given the uncertainty that was already evident at that point. It is becoming increasingly clear that exams, at least in their usual form, cannot go ahead – this makes the changes that we continue to call for vital and urgent. The current exam plans cannot provide the level playing field that it is claimed they will deliver, given the extent of differential learning experiences of those from different regions and backgrounds in this school year alone. So what now?

A levels and GCSEs

The evidence clearly points to avoiding centre or teacher assessed grades where possible. We, therefore, argue that externally set and marked exams remain the fairest option to all pupils taking terminal exams.

But these do not have to take place in the current format proposed, during a three-week period in June 2021. Instead, there is a strong case for more flexible timing for testing pupils, allowing exams to be spread across the summer term and, crucially, allowing pupils to sit these exams at different times to deal with any continuing need for closures during this period. While this will involve more work for exam boards given the need to provide multiple versions of each exam, this is the fairest way to ensure that pupils do not miss out on external assessments. The fact that it requires more work only underlines the need for swift action.

Further, we must ensure that this year’s exams include flexible content. This would help to reduce the unfairness caused by the fact that different schools will have been able to cover different content through interruptions to in-person schooling. These reformed exams would be more like university finals: pupils could be given a wider set of options and be asked to answer a smaller proportion of these, for example, 2 questions from 6 alternatives covering a wide sweep of the curriculum.

This approach would have substantial similarities with that already announced in Wales – also supporting fairness between university applicants from the two countries – and would ensure that pupils can still be awarded the grades they have earned, while providing robust information on achievement for universities and future employers. Scotland, on the other hand, has cancelled its exams altogether – National 5s (the GCSE equivalent) were cancelled several months ago, and Highers (the A level equivalent) just before Christmas. Scotland will instead base awards on teacher judgement; while this is not an optimal situation, announcing it well in advance gives schools and teachers ample time for ongoing assessment and observation.

Primary school testing

While Key Stage 1 tests have been suspended for 2021, current plans are for Key Stage 2 tests to go ahead, although school-level results will not be published. These tests are primarily used as indicators of school performance, which will be measured with substantial error this year, so there are serious questions about their value to bodies such as Ofsted, with which it is still proposed they be shared for accountability purposes. As such, there is a strong case for abandoning these tests altogether in the current circumstances. This would significantly reduce the burden on primary school teachers, who are working under very difficult conditions, and would remove the stress on pupils and parents of preparing for the tests.

Action this day

The longer it takes for these steps to be taken, the harder they will be to implement, until the point where they are no longer feasible. At that point, there is a major risk of a repeat of last year’s fiasco – but without the excuse of not having had time to prepare a better alternative. We’ve seen yet another example today of the decision-making process in Whitehall lagging behind that of Holyrood. In the words of the Scottish national anthem, it’s time for the Prime Minister “tae think again.”

A is for Applications

IOE Editor, 18 November 2020

By Dr. Gill Wyness, Professor Lindsey Macmillan, Dr. Richard Murphy and Dr. Stuart Campbell

Last week was a big week for university admissions: UCAS led new calls for reform of the process on Monday, followed on Friday by support from Universities UK and the much-welcomed announcement from the Department for Education that plans are underway to move to a Post-Qualification Admissions (PQA) system. In this post, drawing on our research, we explain why it is clear that the A in PQA must mean Applications, rather than Admissions.

The central reason for this is that all other options rely on the use of predicted grades, which are both inaccurate and unfair, and so should not be used at any point in this process. Here we put forward a plan for how a Post-Qualification Applications system could be achieved, central to which is reducing the time taken to mark exams. This is a once-in-a-generation chance to re-design a flawed system. We must not waste this chance by recreating a system that continues to embed systematic inequalities by using predicted grades.

An outdated applications calendar

While at first glance the UK’s centralised applications system (the Universities and Colleges Admissions Service – UCAS) might appear modern, we have actually had a centralised applications system since 1961 (then known as UCCA – the Universities Central Council on Admissions). Since then there have been major advancements in technology, including the move from paper forms with traditional mail and hand processing to online applications, and the computerisation of exam marking. Yet despite these changes, the calendar of the applications and admissions process has remained stubbornly fixed for the past 60 years. The inertia of this system has been widely accepted, but we believe there are clear opportunities for efficiency gains.

The acceptance of this inertia can be seen in discussions of what a new post-qualification system might look like. There has been a plethora of ideas debating the pros and cons of post-qualification decisions, offers, and applications (PQD, PQO, PQA), all of which appear to take the current grading duration as given. Much of the discussion that has followed assumes that, in order to move to post-qualification applications, A levels will need to happen much earlier or universities start much later. This acceptance of the status quo in terms of timing creates the very real possibility that this opportunity for change will be squandered.

What is the problem with PQO or PQD?

The crucial problem with the alternatives to post-qualification applications is that they still rely on predicted grades. As is very clear from the DfE press release, the significant inaccuracies in predicted grades, and the systematic differences across students, are the key reasons for this reform in the first place.

Our recent study showed that only 16% of students receive accurate predictions. While the majority of students are overpredicted, high achieving students from disadvantaged backgrounds are typically underpredicted.

In another recent study, we highlight the difficulty of predicting grades, showing that when relying on machine learning and advanced statistical techniques only 1 in 4 students were accurately predicted. This also showed that it was harder to accurately predict grades for high-achieving state school students, relative to their selective grammar school or private school counterparts.

The implications of relying on predicted grades are also clearly explained in the DfE press release:

“Disadvantaged students are more likely to ‘under-match’ and enter courses below their ability than their advantaged peers. Under-matched students are then more likely to drop out of university, get a lower-class degree and earn less in employment.”

The phenomenon of undermatch had not been formally documented in the UK until recently. Our paper highlighted for the first time that students from more disadvantaged families systematically attend lower-tariff universities than they could given their A level achievement, relative to their more advantaged counterparts. Predicted grades are a driver of this phenomenon: we found that disadvantaged students who were underpredicted ended up overqualified, or undermatched, in terms of both their applications to university and where they ended up going.

Reforms in which students still apply on the basis of predicted grades (as is the case with many of the options being discussed) will not solve this undermatch problem, since applications would continue to rest on inaccurate predictions. It is therefore crucial that students are allowed to make their applications on the basis of actual, rather than predicted, grades.

So how do we deliver PQA (Applications)?

Our proposed approach to PQA is based on realising the efficiency gains available in our archaic applications system. We argue that, given the technological advancements of the past 60 years, there is no obvious reason to stick to the current timeline. These gains can be made both in how quickly exams are marked and in how long it takes to process university applications.

Stage 1: Exams and student applications

We suggest virtually no change is needed to the timing of A levels, allowing teachers the time to teach the full curricula. Examinations would still occur in early May. Once these exams are over, A level students could return to school for a range of ‘forward-looking’ activities, including university and careers guidance, work experience, financial literacy training and, in the final week, an ‘applications week’. Keeping students in school for the full school year addresses concerns regarding disadvantaged students not receiving guidance.

In this ‘applications week’, students would receive their grades and apply to their chosen courses with the support of their teachers. The impact on teacher workload would be broadly neutral: teachers would no longer have to predict grades during the year, so their support for applications would simply shift to later in the year, and they might no longer have to help with personal statements (see below). Alternatively, schools could take on specialists to provide this guidance.

For this to be achieved, results day would only need to be brought forward by two to three weeks. The marking period would be condensed through a combination of the technological improvements discussed above and increased investment to employ more markers during this intense period, ensuring quality is unaffected.

Stage 2: University processes

Universities would receive all applications by the end of July and would have a fixed period (a month) to process these and make offers. Given that they would be receiving applications based on final grades, the entire application process could potentially be simplified, relying less on ‘soft metrics’ such as personal statements or teacher references, which can be affected by bias. Where applicants have identical grades, universities could look at students’ final grades relative to their school’s performance, building some Widening Participation targets into the process. Among ties, students with the most ‘potential’ – those outperforming their school’s results – would be prioritised. Students would receive their offers at the end of August, only a few weeks after they do in the current system, and accept their favoured choice.
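To make the tie-breaking idea concrete, the rule can be sketched in a few lines of code. This is purely an illustration of the logic, not part of any actual admissions system: the names, the points scale, and the `rank_ties` helper are all invented for the sketch.

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    name: str
    grade_points: int        # applicant's final grades on a tariff-points scale
    school_avg_points: float  # average points achieved at the applicant's school

def rank_ties(tied_applicants):
    """Order applicants holding identical final grades by how far they
    outperform their own school's average (largest gap first)."""
    return sorted(
        tied_applicants,
        key=lambda a: a.grade_points - a.school_avg_points,
        reverse=True,
    )

# Two applicants with the same final grades: the one from the
# lower-performing school has outperformed their context by more,
# so is prioritised under this rule.
ties = [
    Applicant("applicant_1", 144, school_avg_points=140.0),
    Applicant("applicant_2", 144, school_avg_points=110.0),
]
ordered = rank_ties(ties)
```

The design choice here is that the tie-break only ever compares applicants who already hold identical grades, so it never overrides achievement; it simply uses school context as a secondary signal of potential, in the spirit of the Widening Participation targets mentioned above.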

Creating a more agile system

When setting out his rationale for considering post-qualification admissions, the Education Secretary Gavin Williamson said he wanted to “remove the unfairness” that some groups currently face due to inaccurate predicted grades.

The only way to remove this unfairness is to remove predicted grades from the application process altogether and create a post-qualification applications system.

Rather than shifting the timing of examinations and university start dates, we argue that this can be achieved by harnessing the technological efficiencies that have emerged over the 60 years the current system has been in place. This is a once-in-a-generation chance to re-design our system to make it fairer for all young people. Allowing them to make critical decisions based on their own merits, rather than inaccurate predictions, is an essential step.