
Centre for Education Policy and Equalising Opportunities (CEPEO)


We create research to improve the education system and equalise opportunities for all.


Archive for the 'Tertiary education' Category

A is for Applications

IOE Editor, 18 November 2020

By Dr. Gill Wyness, Professor Lindsey Macmillan, Dr. Richard Murphy and Dr. Stuart Campbell

Last week was a big week for university admissions, with UCAS leading new calls for reform of the process on Monday, followed on Friday by support from Universities UK, and the much-welcomed announcement from the Department for Education that plans are underway to move to a Post-Qualification Admissions (PQA) system. In this post, drawing on our research, we explain why the A in PQA must mean Applications, rather than Admissions.

The central reason for this is that all other options rely on the use of predicted grades, which are both inaccurate and unfair, and so should not be used at any point in this process. Here we put forward a plan for how a Post-Qualification Applications system could be achieved, central to which is reducing the time taken to mark exams. This is a once-in-a-generation chance to re-design a flawed system. We must not waste this chance by recreating a system that continues to embed systematic inequalities by using predicted grades.

An outdated applications calendar

While at first glance the UK’s centralised applications system (University and College Admissions Service – UCAS) might appear modern, we have actually had a centralised applications system since 1961 (then known as UCCA – the University Central Council on Admissions). Since then there have been major advancements in technology, including the move from paper forms with traditional mail and hand processing, to online applications, and the computerisation of exam marking. Yet despite these changes, the calendar of the applications and admissions process has remained stubbornly fixed for the past 60 years. The inertia of this system has been widely accepted, but we believe there are clear opportunities for efficiency gains.

The acceptance of this inertia can be seen in discussions of what a new post-qualifications system might look like. There has been a plethora of ideas debating the pros and cons of post-qualification decisions, offers, and applications (PQD, PQO, PQA), all of which appear to take the current grading duration as given. Much of the subsequent discussion assumes that a move to post-qualification applications would require A levels to happen much earlier, or university terms to start much later. This acceptance of the status quo in terms of timing creates the very real possibility that this opportunity for change will be squandered.

What is the problem with PQO or PQD?

The crucial problem with the alternatives to post-qualification applications is that they still rely on predicted grades. As is very clear from the DfE press release, the significant inaccuracies in predicted grades, and the systematic differences across students, are the key reason for this reform in the first place.

Our recent study showed that only 16% of students receive accurate predictions. While the majority of students are overpredicted, high achieving students from disadvantaged backgrounds are typically underpredicted.

In another recent study, we highlight the difficulty of predicting grades, showing that when relying on machine learning and advanced statistical techniques only 1 in 4 students were accurately predicted. This also showed that it was harder to accurately predict grades for high-achieving state school students, relative to their selective grammar school or private school counterparts.

The implications of relying on predicted grades are also clearly explained in the DfE press release:

“Disadvantaged students are more likely to ‘under-match’ and enter courses below their ability than their advantaged peers. Under-matched students are then more likely to drop out of university, get a lower-class degree and earn less in employment.”

The phenomenon of undermatch had not been formally documented in the UK until recently. Our paper highlighted for the first time that students from more disadvantaged families systematically attend lower-tariff universities than their A level achievement would allow, relative to their more advantaged counterparts. Predicted grades are a driver of this phenomenon – we found that disadvantaged students who were underpredicted ended up being overqualified, or undermatched, both in their applications to university and in where they ended up going.

Reforms in which students are still making their applications on the basis of predicted grades (as is the case with many of the options being discussed), will not solve this undermatch problem, since students will still be making their applications on the basis of inaccurate predictions. It is therefore crucial that students are allowed to make their applications on the basis of actual, rather than predicted grades.

So how do we deliver PQA (Applications)?

Our proposed approach to PQA is based on realising the efficiency gains available within our archaic applications system. We argue that given the technological advancements made over the past 60 years, there is no obvious reason why we need to stick to the current timeline. These gains can be made both in how quickly exams are marked, and in how long it takes to process university applications.

Stage 1: Exams and student applications

We suggest virtually no change is needed to the timing of A levels, allowing teachers the time to teach the full curricula. Examinations would still occur in early May. Once these exams are over, A level students could return to school for a range of ‘forward-looking’ activities, including university and careers guidance, work experience, financial literacy training and, in the final week, an ‘applications week’. Keeping students in school for the full school year addresses concerns regarding disadvantaged students not receiving guidance.

In this ‘applications week’, students would receive their grades and apply to their chosen courses, with the support of their teachers. This would have a neutral impact on teacher workload: teachers would no longer have to predict grades during the year, their support for applications would simply shift from earlier in the year, and they might no longer have to help with personal statements (see below). Alternatively, schools could employ specialist advisers for this role.

For this to be achieved, results day would only need to be brought forward by 2-3 weeks. The condensing of the marking period would be achieved through a combination of the previously discussed technological improvements, and increased investment to employ more markers during this intense period to ensure quality is unaffected.

Stage 2: University processes

Universities would receive all applications by the end of July and would have a finite period (a month) to process these and make offers. Given that they would be receiving applications based on final grades, the entire application process could be simplified, relying less on ‘soft metrics’ such as personal statements or teacher references, which can be affected by bias. Where applicants have identical grades, universities could look at students’ final grades relative to their school’s performance, building Widening Participation targets into the process. Among ties, those students with the most ‘potential’ – those outperforming their school’s results – would be prioritised. Students would receive their offers at the end of August, only a few weeks later than in the current system, and accept their favoured choice.
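The tie-breaking rule described above can be sketched in code: among applicants with identical grades, prioritise those who most outperform their school's typical results. This is a hypothetical illustration of the idea only; the grade-point scale, school averages, and function names are our own assumptions, not part of any actual UCAS process.

```python
# Hypothetical sketch of the proposed tie-break rule: among applicants with
# identical grades, prioritise those who most outperform their school's
# historical average. All names and data are illustrative assumptions.

def grade_points(grades):
    """Convert A level letter grades to points (A*=6 ... E=1)."""
    scale = {"A*": 6, "A": 5, "B": 4, "C": 3, "D": 2, "E": 1}
    return sum(scale[g] for g in grades)

def rank_tied_applicants(applicants, school_avg_points):
    """Applicants all hold identical grades; sort by how far each exceeds
    their school's historical average points, highest 'value added' first."""
    return sorted(
        applicants,
        key=lambda a: grade_points(a["grades"]) - school_avg_points[a["school"]],
        reverse=True,
    )

applicants = [
    {"name": "P1", "grades": ["A", "A", "B"], "school": "S1"},
    {"name": "P2", "grades": ["A", "A", "B"], "school": "S2"},
]
# S1's pupils historically average lower points, so P1 outperforms more.
school_avg_points = {"S1": 9.0, "S2": 13.0}

order = rank_tied_applicants(applicants, school_avg_points)
print([a["name"] for a in order])  # ['P1', 'P2']
```

Both applicants hold AAB, but P1 exceeds their school's historical average by more, so under this rule P1 is prioritised.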

Creating a more agile system

When setting out his rationale for considering post-qualification admissions, the Education Secretary Gavin Williamson said he wanted to “remove the unfairness” that some groups currently face due to inaccurate predicted grades.

The only way to remove this unfairness is to remove predicted grades from the application process altogether and create a post-qualification applications system.

Rather than shifting the timing of examinations and university start dates, we argue that this can be achieved by harnessing the technological efficiencies which have emerged over the 60 years that the current system has been in place. This is a once-in-a-generation chance to re-design our system to make it fairer for all young people. Allowing them to make critical decisions based on their own merits, rather than inaccurate predictions, is an essential step.

How should we assess students this year, and what are the implications for universities?

IOE Editor, 10 November 2020

By Professor Lindsey Macmillan, Dr. Jake Anders, and Dr. Gill Wyness

In summer 2020, to much controversy, the UK government cancelled both GCSE and A level exams and replaced them with “Centre Assessed Grades” based on teacher predictions. While Scotland has cancelled some exams in 2021, and Wales appears to have arranged for something akin to exams to take place in a classroom setting, the English government remains adamant that its exams will go ahead as planned. This strategy is not without its problems, but with some important adjustments, it is still the best and fairest way to assess pupils.

Primary and secondary schools closed their doors in late March 2020 and only fully re-opened six months later in September. Schooling has continued to be disrupted for many, as classes or other ‘bubbles’ have had to self-isolate due to suspected COVID outbreaks, meaning that learning has had to move online. This situation is likely to result in further unequal “learning loss” as a result of inequalities in home learning environments, including access to the technology needed to reliably attend lessons online.

Recent work by Ofsted reported widespread learning loss as a result of these closures, with younger pupils returning to school having forgotten basic skills, and older children losing reading ability. But the loss is not evenly distributed; Ofsted reported that children with good support structures were doing better than those whose parents were unable to work flexibly. Several analyses (e.g. Andrew et al, 2020; Anders et al, 2020) back this up, reporting that pupils from better-off families spent more time on home learning, and were much more likely to have benefitted from online classes than those from poorer backgrounds. Work by the Sutton Trust found that children in households earning more than £60,000 per year were twice as likely to be receiving tutoring during school closures as those in households earning less than £30,000. While steps have been put in place to help pupils catch up, such as the pupil catch-up premium and the National Tutoring Programme, pupils this year will almost certainly be at a disadvantage compared to previous cohorts when they face this year’s exams, and the severity of disadvantage is likely to vary by family background.

While this might be evidence enough that exams should be cancelled this year, it is worth first considering the alternatives:

1. Continuous teacher assessment

Perhaps the most obvious alternative to exams is continuous teacher assessment, through the use of coursework, in-class testing and so on. This would negate the need for exams and would mean all students would receive a grade in the event that exams have to be cancelled due to a resurgence in the pandemic. Scotland has already committed to using teacher assessment instead of exams for its National 5s (equivalent to GCSEs) this year. While this does seem like a safe choice to replace exams, research has shown that teacher assessment can contain biases. For example, Burgess and Greaves (2013) compared teacher assessment with exam performance at Key Stage 2, finding evidence of black and minority ethnic students being under-assessed by teachers relative to white students. Campbell (2015) similarly shows that teachers’ ratings of pupils’ reading and maths attainment at age 7 vary according to income, gender, Special Educational Needs, and ethnicity.

Using coursework to assess pupils (whether internally or externally marked and/or moderated) also risks interference from parents and schoolteachers, so that a pupil’s eventual grade could be more a reflection of the support they’ve received rather than their own achievements. And levels of support are likely to vary by SES, again putting those from poorer backgrounds at a disadvantage.

2. Teachers’ predictions

But sticking with exams is not without its risks. It is, after all, a pandemic, and the government could be forced to cancel exams at the last minute. If they leave it too late to implement continuous teacher assessment or an alternative form of external assessment, they will have to turn to more reactive measures – such as asking teachers to predict pupils’ grades (the method finally adopted for the 2020 GCSE and A level cohorts). This would at least have the advantage of being consistent with last year but, again, would likely result in biased measures of achievement. Predicted grades have been shown to be inaccurate, with the vast majority overpredicted (causing headaches for university admissions). Moreover, work by Anders et al. (2020) and Murphy and Wyness (2020) showed that among high achieving pupils, those from low SES backgrounds and state schools are harder to predict and end up with lower predictions than their more advantaged counterparts.

3. A school leaving certificate?

There are more radical possibilities to consider. One is for schools to abandon assessment this year altogether, and simply issue students with school leaving certificates, similar to the high school diploma awarded in the United States. This would certainly level the playing field among school leavers, but it could lead to big problems for what comes next. For example, without A level grades, how would universities decide which applicants to accept? Under this scenario, admissions tutors would become increasingly reliant on ‘soft metrics’ such as personal statements, teacher references and interviews. It may also lead to more widespread use of university entry tests, which are already in place at some institutions. All of this is likely to be bad news for social mobility, since the use of “soft metrics” has been shown to induce bias (Wyness, 2017; Jones, 2016), while there is very little evidence about the equity implications of using aptitude tests except in highly specific settings (Anders, 2014), so the potential for unintended consequences is substantial.

But in theory, universities shouldn’t need to use entry tests – these pupils already have grades from national, high-stakes, externally marked assessments sat before the pandemic: their GCSEs. Indeed, Kirkup et al. (2010) find no evidence that the SAT (the most widely used aptitude test in the US) provides any additional information on performance at university beyond GCSE results alone. Many universities already use GCSE grades as part of their admissions decisions, along with predicted A level grades. Yet these grades were measured two years ago, and so will obviously miss any changes in performance since then. Indeed, recent work by Anders et al. (2020) suggests that GCSE performance is a poor predictor of pupils’ achievement by the end of their A levels. Using administrative data and machine learning techniques, they predict A level performance from GCSEs, finding that only 1 in 3 pupils could be accurately predicted, and that certain groups of students (those from state schools and low SES backgrounds) appeared to be “underpredicted” by their GCSEs, going on to outperform at A level.

An alternative approach to exams?

The alternatives to exams raise many concerns, particularly for those from poor backgrounds. A better solution may be to design A level exams to take account of the learning loss and missed curricula experienced by pupils, and the fact that some pupils will have experienced this to different degrees. Ofqual was dismissive of this suggestion in its report on examinations for 2020/21, pointing to the burden on exam boards among other factors. While we take seriously the considerations they highlight, we think this underestimates the challenges of the status quo.

For all the headlines about Wales “cancelling” exams, from a first look, it appears that this is rather a simplistic summary. They are still planning to hold some kind of examination, which will be both externally set and externally marked, but when these will take place is now more flexible, and they will happen in class rather than in exam halls – ironically, removing the in-built social distancing normally associated with examinations. This kind of flexibility is needed in these difficult circumstances.

An alternative that has also been discussed in England is that exams could be redesigned so that the majority of questions are optional. In this way, they would look more like university finals, in which students are typically given a set of questions and need only answer a subset of their choice – for example, two out of seven. This would take account of the fact that pupils may have covered different aspects of the curriculum, but not all of it, since they need only answer the questions they are prepared for. While appreciating there are challenges with this approach, a carefully designed exam would at least provide pupils with a grade they have earned, and would provide universities and employers with the information needed to assess applicants.

Universities should also be aware that students from different backgrounds will have experienced lockdown in very different ways, and those lacking school and parental support may still struggle to do well, even in well-modified exams. This could and should be tackled with the increased use of contextual admissions. Universities often cite fears that students from contextual backgrounds are more likely to arrive underprepared for university and risk failing their courses. But this year, lack of preparation for university may well be the norm, forcing universities to provide extra tuition and other assistance to help students get “up to speed”. There has never been more need, and more opportunity, for widespread contextual admissions.

Covid-19: The risk of a double hit to young people’s wellbeing

IOE Editor, 23 October 2020

By Dr. Jake Anders

The negative effects of the Covid-19 pandemic and its associated restrictions on people’s wellbeing, especially young people’s wellbeing, have been widely highlighted since the onset of lockdowns in March. Unfortunately, there are reasons to believe that even when the direct effects of the pandemic come to an end, there is a continuing risk to young people’s wellbeing from the more long-lived effects of the associated economic downturn that is only just starting.

In a recent research study that John Jerrim, Phil Parker and I carried out, we explored the impact of the last recession (the 2008-09 global financial crisis) on the wellbeing of young people in the Australian context. Our research used data from four different cohorts of their Longitudinal Surveys of Australian Youth (LSAY), for young people born in 1981, 1984, 1987 and 1990.

Because young people’s wellbeing varies as they age, it’s not as simple as comparing the same cohort of young people’s wellbeing before and after the onset of an economic downturn. The difference that we observe might just be caused by those changes as young people get older. To address this issue, we attempted to isolate the effect of this event by comparing trends in reported wellbeing among overlapping cohorts of young people before, during and after this period. Since young people in these different cohorts experienced this event at different ages, we are able to verify that their wellbeing initially evolved in a similar manner, before comparing what happened as the economic challenge hit.

The basic idea of our results is evident from the following graph – although we also applied further statistical modelling to check the robustness of our findings in the paper.

You can follow the average reported level of each cohort’s wellbeing as separate lines reporting annually from when members of the cohort are 17, up to when they are 26. All the lines start out as solid lines – which indicates measures collected before the onset of the economic downturn – before becoming dashed lines after this event.

Because the cohorts were born in different years, the onset of the global financial crisis (and, hence, the change from solid to dashed line in our graph) happens at different ages. Only the cohorts born in 1987 and 1990 experience the onset of the global financial crisis between ages 17 and 26, so only these two lines change to being dashed. The cohorts born in 1981 and 1984 retain a solid line throughout.

The graph suggests that the cohorts born in 1987 and 1990 had similar (or slightly higher) levels of wellbeing as the older cohorts at younger ages. However, a substantial gap emerges between the wellbeing of the earlier and later cohorts, starting at age 19 for the 1990 cohort and age 22 for the 1987 cohort. This indicates that a negative impact on wellbeing was caused by the onset of the global financial crisis.
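The logic of this cohort comparison can be illustrated with a toy calculation: track mean wellbeing by age for each cohort, check that pre-downturn trends are similar, then compare levels at the same age once the downturn has hit one cohort but not the other. The numbers below are invented purely for illustration; they are not the LSAY estimates.

```python
# Toy illustration of the cohort-comparison logic: compare wellbeing by age
# across birth cohorts that hit the downturn at different ages. Data invented.

# Mean wellbeing score by age for two hypothetical cohorts (age: score).
cohort_1984 = {17: 7.0, 18: 7.1, 19: 7.2, 20: 7.3}  # downturn after age 20
cohort_1990 = {17: 7.0, 18: 7.1, 19: 6.6, 20: 6.7}  # downturn hits at age 19

# Pre-onset: the two cohorts should evolve similarly, which supports
# using the older cohort as a counterfactual for the younger one.
pre_gap = cohort_1990[18] - cohort_1984[18]

# Post-onset: the gap at the same age estimates the downturn's impact.
post_gap = cohort_1990[19] - cohort_1984[19]

print(round(pre_gap, 1), round(post_gap, 1))  # 0.0 -0.6
```

In the paper this idea is implemented with statistical modelling across four overlapping cohorts rather than a simple two-point difference, but the underlying comparison is the same.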

What are the lessons of this for the current situation? Unfortunately, it suggests that even if negative effects on young people’s wellbeing dissipate when restrictions are eased, the longer-term effects on well-being of the onset of the economic downturn are likely either to prolong these or add to them further. This further increases the importance of policymakers doing all they can to alleviate the negative effects of the pandemic on the economy, and in particular on the challenges that young people now seem likely to face in taking their first steps into the labour market.

The full article on which this blog post is based is freely available online: Parker, P., Jerrim, J., & Anders, J. (2016) What effect did the Global Financial Crisis have upon youth wellbeing? Evidence from four Australian cohorts. Developmental Psychology, 52 (4), 640-651.

Predicted grades – what do we know, and why does it matter?

IOE Editor, 11 August 2020

By Dr. Gill Wyness

Whose grades are being predicted?

Predicted grades are a common feature of the English education system, with teachers’ predictions of pupils’ A level performance forming the basis of university applications each year.

What’s different this year?

The Covid-19 pandemic has put these predictions under the spotlight. The cancellation of exams means that all Year 11 and Year 13 pupils will instead receive ‘calculated grades’ based on teacher predictions.

How well do teachers predict grades?

Teachers’ predicted grades have been shown to be inaccurate, and the majority of inaccurate grades are over-predicted – in other words, too high.

  • There is limited research on the impact of predicted grades, though studies of prediction accuracy by individual grade (e.g. how many grades predicted as an A were achieved as an A) by Delap (1994) and Everett and Papageorgiou (2011) showed around half of all predictions were accurate, while 42-44% were over-predicted by at least one grade, and only 7-11% of all predicted grades were under-predicted.
  • Studies of prediction accuracy according to a student’s best three A levels show even higher rates of inaccuracy (unsurprisingly, since it is harder to predict all three A levels correctly). For example, Wyness and Murphy find that only 16% of students received accurate predictions for all three, with 75% overpredicted and just 8% underpredicted.
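As an illustration of how a best-three accuracy figure like this is computed, each student's predicted best three A levels can be compared against their achieved grades and classified as accurate, over- or under-predicted. The grade-point scale and the data below are made up for illustration; this is not the method of any specific study cited here.

```python
# Illustrative calculation of prediction accuracy across a student's best
# three A levels. The point scale and data are invented assumptions.
from collections import Counter

POINTS = {"A*": 6, "A": 5, "B": 4, "C": 3, "D": 2, "E": 1}

def total_points(grades):
    return sum(POINTS[g] for g in grades)

def classify(predicted, achieved):
    """Return 'accurate', 'over' or 'under' for a best-three prediction."""
    diff = total_points(predicted) - total_points(achieved)
    if diff > 0:
        return "over"
    if diff < 0:
        return "under"
    return "accurate"

# (predicted best three, achieved best three) for four hypothetical students
students = [
    (["A", "A", "B"], ["A", "B", "B"]),  # overpredicted by one grade
    (["B", "B", "C"], ["B", "B", "C"]),  # accurate
    (["C", "C", "D"], ["B", "C", "C"]),  # underpredicted
    (["A", "B", "B"], ["B", "B", "C"]),  # overpredicted
]

rates = Counter(classify(p, a) for p, a in students)
print(rates)  # Counter({'over': 2, 'accurate': 1, 'under': 1})
```

Requiring all three grades to be correct simultaneously is a stricter criterion than grade-by-grade accuracy, which is why best-three studies report lower accuracy rates.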

Who loses out?

Lower achieving students tend to be overpredicted; higher achieving students tend to be more accurately predicted.

  • All studies find that higher grades are more accurately predicted than lower grades. This is likely an artefact of teachers’ tendency to overpredict combined with ceiling effects: overprediction is impossible at the top grade, so predictions there can only be accurate or too low.
  • Thus, AAA students are likely to be accurately predicted (or underpredicted) whereas CCC students are more likely to be overpredicted.
  • It is therefore essential to take into account the achievement level of the student when analysing prediction accuracy by student characteristics. For example, low SES students tend to be lower-achieving, on average. Therefore, low SES students tend to be overpredicted on average, while high SES students tend to be more accurately predicted (this is shown by Wyness and Murphy).

So are teachers biased?

There is little evidence of bias in prediction accuracy according to student characteristics.

  • The majority of the studies above show no compelling evidence of bias in teacher prediction by student characteristics, once achievement is taken into account.
  • However, Wyness and Murphy show that among high achievers, state school students receive slightly less generous predictions than those in independent schools, and those from low SES backgrounds receive slightly less generous grades than those from high SES backgrounds.
  • This was not a causal finding, and other factors could be driving this apparent bias.

What’s going wrong, then?

Predicting student grades is a near-impossible task for teachers.

  • Work by Anders et al (2020) highlighted the difficulty of predicting grades accurately. In this study, the authors attempted to predict A level grades using detailed administrative data on students’ prior achievement (GCSEs) and both statistical and machine learning techniques. Their models could correctly predict only 1 in 4 pupils across their best three A levels, versus 1 in 5 for teacher predictions (based on Murphy and Wyness, 2020).
  • Their predictions were incorrect for 74% of pupils.

That’s not great. What else do we know?

Certain pupil types appear harder to predict than others.

  • Anders et al also found that high achieving pupils in comprehensive schools were more likely to be underpredicted by their models, compared to their grammar and private school counterparts. This highlights the difficult task that teachers face each year, particularly for pupils with more variable trajectories from GCSE to A level.

Can’t we remove the teacher and calculate grades based on past performance?

The ‘calculated grades’ for 2020 are not just based on teacher predictions.

  • Schools have provided predicted grades and pupil rankings (which are known to be easier to produce than predicted grades).
  • These predicted grades may also be more accurate than in previous years, since teachers were given better guidelines on how to predict, and what information to use.
  • Ofqual will standardise teachers’ predicted grades according to the centre’s historical performance. This will reduce the tendency towards overprediction that all studies of predicted grades have observed. For example, if 60% of a school’s grades have historically been Bs, a similar share will be expected this year, and submitted grades will be adjusted downwards to reflect this.
  • But teachers’ rankings will be preserved so that pupils cannot “change places” after the standardisation.
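The standardisation step described above can be sketched as a rank-preserving reallocation of grades to match a school's historical distribution. This is a deliberately simplified illustration of the general idea, with invented names and shares; it is not Ofqual's actual model.

```python
# Simplified sketch of rank-preserving standardisation: reallocate a centre's
# grades so their distribution matches the school's historical one, while
# preserving the teacher-submitted ranking. Not Ofqual's actual algorithm.

def standardise(ranked_pupils, historical_shares, grade_order):
    """ranked_pupils: pupil names, best first (teacher ranking).
    historical_shares: fraction of pupils per grade, e.g. {'A': 0.2, ...}.
    grade_order: grades from best to worst. Returns {pupil: grade}."""
    n = len(ranked_pupils)
    result, i = {}, 0
    for grade in grade_order:
        # Number of pupils this grade should cover under the historical share.
        quota = round(historical_shares.get(grade, 0) * n)
        for pupil in ranked_pupils[i:i + quota]:
            result[pupil] = grade
        i += quota
    # Any pupils left over through rounding get the lowest grade.
    for pupil in ranked_pupils[i:]:
        result[pupil] = grade_order[-1]
    return result

pupils = ["P1", "P2", "P3", "P4", "P5"]   # teacher ranking, best first
shares = {"A": 0.2, "B": 0.4, "C": 0.4}   # school's historical profile
grades = standardise(pupils, shares, ["A", "B", "C"])
print(grades)  # {'P1': 'A', 'P2': 'B', 'P3': 'B', 'P4': 'C', 'P5': 'C'}
```

Note how the ranking fully determines who gets which grade: a pupil's own submitted grade plays no role once the distribution is fixed, which is exactly why outlier pupils at atypical schools can be moved up or down.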

Scotland has promised to re-think standardising results based on the school. What will happen in England?

  • It’s a controversial point. Our paper shows that high-achieving comprehensive school pupils are more likely to be under-predicted compared to their grammar and private school counterparts.
  • Among high achievers, where under-prediction is most common, the team found 23% of comprehensive school pupils were underpredicted by two or more grades compared to just 11% of grammar and private school pupils.

What if a student who does less well earlier goes on to study really hard? Isn’t this unfair?

“Outlier” students and disadvantaged students could be disproportionately affected by the standardisation process.

  • The standardisation process could affect outlier pupils more than others.
  • For example, an AAA student at a historically low-performing school could be downgraded as a result of standardisation,
  • and a DDD student at a high-performing school could be upgraded.
  • This could serve to entrench existing socio-economic gaps in pupil attainment, to the extent that low SES students are more likely to attend historically low-performing schools, and high SES students are more likely to attend high-performing schools.

So what should we do about it?

The cancellation of exams this year has highlighted that the system of using predicted grades as a key part of the university application process urgently needs reform.

  • The research above highlights that predicting student grades is a near-impossible task, even when teachers are removed from the equation and detailed data on pupils’ past achievement are used instead.
  • A better solution would be to reform the university applications system and allow students to apply to university after they have sat their exams.
References

Delap, M. R. (1994). An investigation into the accuracy of A‐level predicted grades. Educational Research, 36(2), 135-148.
Everett & Papageorgiou (2011). Investigating the accuracy of predicted A level grades as part of the 2009 UCAS admission process. BIS Research Paper No. 37. London: Department for Business, Innovation and Skills.
Gill, T., & Benton, T. (2015). The accuracy of forecast grades for OCR A levels in June 2014. Statistics Report Series No. 90. Cambridge, UK: Cambridge Assessment.
Murphy, R., & Wyness, G. (2020). Minority report: the impact of predicted grades on university admissions of disadvantaged groups. Education Economics, 1-18.
UCAS (2015). Factors associated with predicted and achieved A level attainment. Gloucestershire: University and College Admissions Service.

Staying ambitious, motivated and focused during Year 11 is worth half a grade per GCSE subject

IOE Editor, 20 July 2020

New research shows the impact that drive and ambition can have

By Professor John Jerrim 

Much has been written recently about learning loss and the COVID-19 crisis. With the country locked down and schools shut, some children are bound to have learned less over this period than others.

It has been noted that learning loss is likely to particularly affect the life chances of young people with high-stakes examinations next year, such as those entering Year 11 in the autumn.

All other things being equal, those who are motivated, driven to succeed and ambitious are likely to have continued to work hard in their studies even while their school was closed. On the other hand, those pupils who are less motivated and have no clear post-school targets or plans, may well have taken their foot off the pedal.

But how much does being driven and ambitious during Year 11 really matter for GCSE outcomes, even in normal times?

Quite a lot, it turns out.

Unique data

New research of mine (along with my colleagues Nikki Shure and Gill Wyness), published today, looks at a nationally representative cohort of Year 11s who took their GCSEs in 2016, and considers their ‘drive’ (e.g. how they responded to questions such as ‘I want top grades in most or all of my courses’ and ‘I want to be the best, whatever I do’) and their ambition (measured by whether they want to go to university and, if so, which one they want to attend). They answered these questions in November/December of Year 11 – around six months before taking their GCSEs.

From this, we can compare GCSE outcomes for Year 11 pupils who score highest on these measures (e.g. who say they want to be the best at what they do and plan to apply to an Oxbridge university) to their school peers who lack any such motivation. Importantly, we can also account for a wide array of background differences between such teenagers, such as their levels of prior achievement, socio-economic background and the school that they attend.

The results show that drive and ambition really do matter during Year 11. Ambitious and driven young people achieve – on average – around half a grade higher per subject in their GCSEs than comparable Year 11s, with the same level of prior achievement, who are not determined to succeed.


What does this finding imply?

First, even during ‘normal’ times, motivation and determination in Year 11 matter a lot. Clearly, there is only so much that schools, teachers and parents can do. At the end of the day, the buck stops with young people themselves.

Second, in the current climate, this provides one clear reason why educational inequalities in GCSE grades may widen next year. During this last term – with schools not open to most children – the onus has been placed upon young people to continue putting in the hours on their school work. The driven and the ambitious pupils will have done this. Those lacking motivation and direction will not have.

It would therefore be no surprise at all if the gap in GCSE outcomes between such teenagers increases dramatically next academic year.

You can read the full research here.

10 things you may not know about educational inequality

IOE Editor, 15 June 2020

1. There are large inequalities in the home learning environment

Families from lower socio-economic backgrounds may experience challenges in supporting their child’s home learning, for example through:

  • Limited access to resources (including tech devices);
  • Lack of a reliable and fast internet connection;
  • Low levels of parental numeracy and literacy;
  • Anxieties towards learning (especially maths).

Current evidence suggests it is important to focus on the quality of children’s home learning, rather than simply the quantity. 

2. Parental inputs affect early child development

By the time children start school, socio-economic gaps in children’s skills are already evident. Exploring the role of various parental inputs, we find that financial resources are an important channel, explaining up to 59% of the effect on child cognitive skills. Parental investments in health behaviours during pregnancy and monetary investments at home explain a further 14% of the test score gaps.

3. Jobless parents invest less money but more time in their children’s learning

Parents out of work, but with otherwise similar backgrounds to working parents, provide lower monetary investments but more time investments in their children’s learning, such as helping with homework. These findings could help guide future social policy aimed at equalising opportunities for children living in workless households.

4. There are large inequalities by family background in the courses that university students attend

We examine inequalities in the match between student quality and university quality. We find that students from lower socio-economic groups systematically undermatch, that secondary schools play a key role in generating these gaps, and that while there are negligible gender gaps in the academic match, high-attaining women systematically undermatch in terms of expected earnings, largely driven by subject choice.

5. There is a great deal of inaccuracy in predicted grades

Only 16% of applicants to the UK university system have accurate predicted grades. While 75% of applicants have their grades over-predicted, high-attaining disadvantaged students are significantly more likely to receive under-predictions. These under-predicted candidates are more likely than their peers to enrol on courses for which they are overqualified. The use of predicted rather than actual grades has important implications for students’ labour market outcomes and for social mobility in general.

6. Non-monetary incentives can improve teacher retention

France operates a non-pecuniary (non-money based), “career-path oriented” centralised incentive scheme designed to attract and retain teachers in disadvantaged schools. We find this incentive scheme has a statistically significant positive effect on the number of consecutive years that teachers stay in disadvantaged schools, and that it decreases the probability that inexperienced teachers in disadvantaged schools leave the profession.

7. Teachers’ working hours have remained stable despite initiatives to reduce them

Surveys have revealed that teachers in England work far longer hours than their international counterparts. However, contrary to current narratives, we do not find evidence that average working hours have increased. Indeed, we find no notable change in total hours, or in work during evenings and weekends, over the past fifteen to twenty years. The results suggest that policy initiatives have so far failed to reduce teachers’ working hours and that more radical action may be needed to fix this problem. The article concludes with a discussion of how official data on working hours could be improved.

8. There are large inequalities in who accesses grammar schools

Inequalities exist in who attains places at grammar schools by socio-economic status, with more disadvantaged children far less likely to attend a grammar school than their more advantaged peers. This is true even when comparing those with similar levels of academic achievement. 

9. Private school choices are based on values, not just money

Given the high and rising fees required to send a child to private school, one might think that the decision is entirely a matter of financial resources. However, while these remain an important factor, we argue that other determinants are also important. In particular, we highlight the role of parental values and of geographical proximity to high-quality state school alternatives.

10. Bullying casts a long shadow on attainment

Both the type of bullying and its intensity matter for long-run outcomes such as obtaining a degree, income, and mental health. We assess the effects of bullying victimisation on short- and long-term outcomes, including educational achievement, earnings, and mental ill-health at age 25.

The Covid-19 crisis and Educational Inequality

IOE Editor, 22 May 2020

By Professor Simon Burgess, University of Bristol and Professor Anna Vignoles, University of Cambridge

This article was originally commissioned and published by the Campaign for Social Science as part of its Covid-19 programme https://campaignforsocialscience.org.uk/hub-of-hubs-social-sciences-responding-to-covid-19/.

Younger generations will pay a heavy price for our response to this virus. First, their educational opportunities and attainment are being affected by lockdown, variable home-learning facilities, and changing assessment methods. Second, leaving school in a recession is always harder, and the coming recession is likely to be worse than for many years. Whilst the health impact of COVID on older people is more severe, young people are vulnerable educationally and in their long-term employment prospects. The crisis will lay bare the already stark inter- and intra-generational inequalities in educational attainment that are a feature of the UK.

Learning loss from school closures

The decision to impose a school lockdown has been taken in most (but not all) countries, and is the result of weighing the health risks to pupils (and their families) and teachers against the loss of skills and growing inequalities. With schools shut, the plan was for learning to take place remotely. But the amount of time children are spending on school-work varies enormously, both by school and by parents’ ability to support remote schooling. Recent studies from the UK[1], Ireland[2] and the Netherlands[3] all illustrate the factors behind growing educational inequality.

It is difficult to quantify the extent of the likely attainment loss. Extrapolation from studies that have looked at differences in the number of hours of instruction pupils receive in different countries may guide us. Lavy[4] found that an additional hour of instructional time in a subject per week, over the school year, was associated with a gain in test scores of 6% of a standard deviation. Carlsson et al.[5] also estimate the impact of additional days to prepare for a test. This is relevant to the current crisis because most pupils are not receiving the same hours of schooling remotely as they did face to face, irrespective of the quality. If students lose 3-4 hours of each main subject per week for a term, this averages out to about an hour per week over the academic year, i.e. a loss of roughly 6% of a standard deviation[6]. The actual loss of learning will vary by context, depending on what schools and families have been able to provide. We also know that the earliest years of a child’s life are pivotal for their development, and that investment during this time is particularly valuable for improving cognitive and non-cognitive skills. Hence the negative impact of a lack of face-to-face school provision is likely to be particularly large for younger children.
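
The extrapolation above can be sketched as a back-of-the-envelope calculation. The figures here are illustrative assumptions (three lost hours per subject per week, a one-term closure, three terms per year); only the 6%-per-weekly-hour coefficient is taken from Lavy[4]:

```python
# Back-of-the-envelope learning-loss estimate (illustrative figures only).
# Lavy (2015): +1 instructional hour per subject per week over a school year
# is associated with roughly +6% of a standard deviation in test scores.
SD_GAIN_PER_WEEKLY_HOUR = 0.06

hours_lost_per_week = 3   # assumed loss per main subject during closure
terms_closed = 1          # closure lasting roughly one term
terms_per_year = 3

# Average the term-long loss over the whole academic year.
avg_weekly_loss_over_year = hours_lost_per_week * terms_closed / terms_per_year

loss_in_sd = avg_weekly_loss_over_year * SD_GAIN_PER_WEEKLY_HOUR
print(f"~{avg_weekly_loss_over_year:.1f} hours/week over the year "
      f"=> ~{loss_in_sd:.0%} of a standard deviation lost")
```

Plugging in four lost hours instead of three would push the estimate towards 8% of a standard deviation, which is why the text gives a range.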

Learning loss will be worse for socio-economically disadvantaged students. Household income and family environment are major determinants of children’s academic achievement in normal circumstances. Socio-economically advantaged parents also tend to compensate for any deterioration in schooling to a greater extent. A recent Sutton Trust survey[7] suggests that during this crisis 44% of middle-class parents are spending more than four hours a day on homeschooling, compared with one-third of working-class parents. Andrew et al. report similar findings: children in the richest quintile of families spend over 75 minutes per day more on schoolwork than children in the poorest quintile of households. This quickly accumulates: over the (at least) 34 days that schools will be closed, this difference adds up to more than seven full school days. For some year groups, particularly those nearing the end of their schooling, this could have a major impact on attainment.
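
The accumulation in that last step is simple arithmetic. The 75-minute gap and the 34 closure days come from the figures above; the length of a school teaching day (about six hours) is an assumption made for the illustration:

```python
# How a daily gap in home-schooling time accumulates over a school closure.
gap_minutes_per_day = 75           # richest vs poorest quintile (Andrew et al.)
school_days_closed = 34            # minimum closure period cited above
teaching_minutes_per_day = 6 * 60  # assumed ~6 teaching hours per school day

total_gap_minutes = gap_minutes_per_day * school_days_closed
gap_in_school_days = total_gap_minutes / teaching_minutes_per_day
print(f"Cumulative gap: {total_gap_minutes} minutes "
      f"= about {gap_in_school_days:.1f} full school days")
```

A shorter assumed teaching day would make the equivalent number of lost school days larger still, so "more than seven full school days" is a conservative reading.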

Post-compulsory education

Even in normal times, the 50% of students who do not go on to university are less likely to find a good job and more likely to have lower wages. Of course, this crisis has taught us that many of those low wage jobs are also hugely socially valuable. So policymakers should not measure success by what people earn in the labour market. That said, the group who leave school with very low-level qualifications will find it particularly difficult in this tumultuous labour market.

The labour market prospects of students will vary hugely depending on what qualifications they get. It is perhaps striking that the vocational qualifications with the greatest firm involvement, such as apprenticeships, are some of the most economically valuable. Given the financial hit to firms, and the fact that many have had to suspend their apprenticeships, this is a route that will be badly affected by the COVID crisis. It is imperative that we support firms to restart their training. Unfortunately, training in the UK is pro-cyclical: firms train less in bad times. So policy intervention will be essential.

For students who would normally be university-bound, there are still problems. The fear is that, despite a poor labour market that should encourage students to remain in education, there will be a sharp decline in students going on to HE in 2020/21 due to safety fears or unwillingness to undertake remote learning. However, the main concern is that students from poor backgrounds may miss out. First, their A-level grades may be negatively affected because COVID has meant that teacher judgement will play a major part in the grade they are awarded, and there is evidence that teachers under-estimate the grades of poor students[8]. Even if universities are willing to accept students with somewhat lower grades (given spare places due to the decline in international student numbers), being awarded a low grade may affect the motivation of such students to progress. But the bigger issue is that poorer students are more likely to fund their studies with paid work, and many work in the sectors that are being hit hardest.

More positively, with empty places at universities, this is an opportunity for students from lower-income backgrounds not only to go to university but also to attend a higher-ranked institution than they might otherwise have done. This could be an opportunity to widen participation.

Policy options

So what can policy do? Interventions are needed to support socio-economically disadvantaged pupils and those who have fallen behind academically.  The decision on when schools should return is both complex and emotionally charged. As well as the obvious health risks, each additional month of sub-optimal learning is likely to widen the socio-economic gap in achievement, over and above any loss for the whole cohort.

Younger years should be prioritised, given the importance of the earlier years. Additional support on return to school will be required to address learning loss. Small-group tuition has been found to be effective, but it is expensive, and the pupil premium paid to schools to support disadvantaged children would have to be supplemented. We also need to be mindful that a far greater number of students will now be in families with economic difficulties.

The FE and HE sectors need to communicate strongly that they are open next year and undertake outreach to encourage disadvantaged students to continue their studies. The government needs to work with firms to find ways to continue apprenticeships or to move them in the case of firm failure. We need to invest in the post-compulsory schooling of these vulnerable cohorts, targeting the most disadvantaged students. We know that the long-run negative scarring effect from leaving education during a recession is significant, and we should encourage students to shelter from the worst of it in education and acquire useful and productive skills as they do so.

These policy solutions imply increased spending in an era when government finances are going to be tested. Despite this, investing in the human capital of the young should be central to any economic recovery plan. The socio-economic gap in educational achievement was large and persistent before COVID, despite attempts to reduce it, partly because the drivers of the disparity, such as child poverty, have worsened. Just as COVID has brutally revealed the health inequalities in our society, so too it illustrates the educational ones, which have arisen for similar reasons. Perhaps this unprecedented crisis is a time to rethink how we go about reducing the deeper economic inequalities that underpin these problems.

[1] Montacute, R. (2020) “Social Mobility and COVID-19”. Sutton Trust Report https://www.suttontrust.com/wp-content/uploads/2020/04/COVID-19-and-Social-Mobility-1.pdf
Andrew, A., Cattan, S., Costa Dias, M., Farquharson, C., Kraftman, L., Krutikova, S., Phimister, A. and Sevilla, A. (2020) ‘Learning during the lockdown: real-time data on children’s experiences during home learning’. IFS Briefing Note. https://www.ifs.org.uk/publications/14848
[2] Doyle, O. (2020) “COVID-19: Exacerbating Educational Inequalities?” Public Policy Ireland. http://publicpolicy.ie/papers/covid-19-exacerbating-educational-inequalities/
[3] Bol, T. (2020) “Inequality in homeschooling during the corona crisis in the Netherlands. First results from the LISS panel” https://osf.io/preprints/socarxiv/hf32q
[4] Lavy, V (2015), “Do Differences in Schools’ Instruction Time Explain International Achievement Gaps? Evidence from Developed and Developing Countries”, Economic Journal 125.
[5] Carlsson, M., Dahl, G. B., Öckert, B. and Rooth, D. (2015) “The Effect of Schooling on Cognitive Skills” Review of Economics and Statistics vol. 97(3) pp. 533–547.
[6] Burgess, S. and Sievertsen, H. (2020) Schools, skills, and learning: The impact of COVID-19 on education. VOX-EU.
[7] Sutton Trust (2020) COVID-19 Impacts: School Shutdown.
[8] Murphy, R. and Wyness, G. (2020) Minority Report: the impact of predicted grades on university admissions of disadvantaged groups (No. 20-07). Centre for Education Policy and Equalising Opportunities, UCL Institute of Education.

Lockdown and the 11-plus

IOE Editor, 6 May 2020

By Dr. Matt Dickson and Professor Lindsey Macmillan

Unlike GCSEs, A-levels, SATs, Scottish Nationals and Scottish Highers, the secondary transfer test (otherwise known as the 11-plus exam) is the only high-stakes school assessment in Britain that is still scheduled to take place, as usual, this year. The test, taken by students beginning year 6 in September of each year, is the primary way in which places at grammar schools are allocated. The top performers on this test are offered a place at a state-funded grammar school, while those below the cut-off threshold attend state-funded comprehensive or secondary modern schools depending on the area. There are currently 163 grammar schools in England, educating around 5% of state secondary school pupils, and selecting their pupils according to their performance on this ‘11-plus’ test.

Why does this ‘business as usual’ approach to this particular exam matter? It matters because we already know from the extensive research literature in this area that access to grammar schools is strongly related to socio-economic status, with more disadvantaged pupils far less likely to attain a grammar school place than their more advantaged peers. This remains true even when comparing those with similar levels of academic achievement. There are numerous reasons for this inequality in access, and many of them will be exacerbated during the current COVID-19 pandemic and resulting lockdown. As a consequence, if the ‘11-plus’ goes ahead as planned, with no account taken of the differential impact of the lockdown on children from different backgrounds, the inequality in access this year is likely to be more extreme than ever.

How unequal is access?

Official statistics show that in 2019 only 3% of grammar school pupils were entitled to free school meals (FSM), compared to 15% of pupils in non-selective schools across England. This isn’t a perfect comparison, as grammar schools are not evenly distributed around the country, so the low FSM percentage might simply reflect the types of areas in which grammars tend to be located. This is partly true; however, even when comparing grammar school pupils only to other children in the same area, stark differences remain: a recent study found that 2.5% of grammar pupils are eligible for FSM, compared to 8.9% of the other pupils in the area. This disparity is a consistent finding, echoing earlier figures from 2013 and 2006.

The binary FSM division is a useful way of contrasting access probabilities, but it can only tell us about the inequality between (roughly) the bottom 15% of the income distribution and the top 85%, potentially masking important differences in access among children in the middle. An alternative is to look at the probability of a grammar school place across the full socio-economic spectrum, and this is what another recent study did, using a socio-economic index measure.[1] This index is divided into percentiles, allowing the probability of attaining a grammar school place to be computed at each point in the socio-economic status (SES) distribution.

This shows that access increases almost linearly with the SES index for the most part, before steepening in gradient in the top quintile of the distribution (see Figure 1). At the 10th percentile of the distribution, only 6% of children attend a grammar school. This increases slowly such that, at the 40th percentile – the ‘just about managing’ families – 17% of pupils attend a grammar. By contrast, 51% of children at the 90th percentile attend a grammar school and 79% of those in the top 1% most affluent families attend a grammar school. In total, half of the grammar school places are taken by the best-off quarter of families.

Notes: Figure 1 from Burgess, Crawford and Macmillan (2018)

Part of this social gradient is driven by the large differences in attainment at age 11 between children from different family backgrounds. Achievement gaps between children from the most and least disadvantaged families open early in childhood and widen through primary school. Gaps in cognitive test scores between children from more and less disadvantaged families are observed as early as age 3, and by the time they hang their coat on their peg for the first time at primary school, children from low- and middle-income families are five months behind children from high-income families in their vocabulary skills. This gap increases through school from Key Stage 1 at age 7 to Key Stage 2 at age 11, at which point pupils from the most disadvantaged families are (on average) over 20 percentiles behind pupils from the most advantaged families in their performance ranking. It is not surprising, therefore, that we see such a steep gradient in grammar access by SES.

However, even comparing children with the same achievement, there remain large differences in the probability of accessing a grammar school place in selective areas, depending on family socio-economic status. Splitting combined performance on Key Stage 2 (age 11) tests in English, maths and science into percentiles (1 = lowest score; 100 = highest score), we can compare the chances of grammar attendance for children with the same level of performance but different family backgrounds. Figure 2 shows that very few pupils in the lower half of the performance distribution go to grammar schools, whatever their socio-economic background. In the upper half, at every point in the performance distribution, there is a clear socio-economic gradient in the probability of attending a grammar school. For example, at the 80th percentile of attainment, children from the best-off families have a 70% chance of attending a grammar, compared to only 25% for children from the worst-off families.

Access to grammar school places is very strongly related to family background, and this remains the case even when comparing children with the same achievement in national tests at age 11. Whatever advantages grammar school attendance conveys, they are very much concentrated among pupils from more affluent backgrounds.

Notes: Figure 2 from Burgess, Crawford and Macmillan (2018)


What factors lead to this disparity in access even for children with the same attainment at age 11 and how will lockdown affect these?

There are a number of reasons why children from disadvantaged backgrounds have lower achievement than their more advantaged peers, and many relate to disadvantaged families facing more constraints on both their resources and their time – constraints that are likely to tighten during lockdown. Recent research reveals that higher maternal education is associated with better child outcomes, partly because it leads to an increase in income, but also because it is associated with greater educational resources in the home during a child’s early life, improving cognitive skills at ages 5 and 7. Similarly, it has been shown that mothers with university degrees spend a higher proportion of their time engaging with their child’s learning at home, compared to mothers with no qualifications, and this is linked to improved child literacy and socio-emotional outcomes between the ages of 3 and 7.

Emerging findings on home inputs during lockdown suggest that these gaps in the home learning environment are evident in the ‘homeschooling environment’ too: there are significant concerns over access to the internet and to electronic devices for learning. 15% of teachers from the most deprived schools reported concerns that a substantial portion of their students would not have access to online learning, compared to only 2% of teachers from the most affluent state schools. There are also differences in how confident parents feel about helping their children, with more educated parents much more likely to report feeling confident in directing their children’s learning.

These barriers in the home environment are exacerbated by the investment that the most advantaged parents make in their children’s education in the form of extra-curricular tutoring. More advantaged parents are more likely to pay for extra English and maths lessons and to arrange tutoring or coaching. This is particularly pronounced in selective areas, and in the subjects that are core to the ‘11-plus’ examination (but not in science, which is not an ‘11-plus’ subject), supporting the view of grammar school headteachers that children from more affluent, middle-class families are coached to pass the entrance exam. This inequality-enhancing investment and coaching happens every year, but emerging evidence from the Sutton Trust suggests it is even more pronounced this year: children in households earning more than £60k are currently twice as likely to be receiving tutoring during school closure as children in households earning under £30k.

In sum, the evidence suggests that all of these barriers will be more pronounced for the current cohort of year 5s who are due to sit the ‘11 plus’ examination in September 2020. The current school shutdown is very likely to widen the achievement gap between the most and least disadvantaged pupils with direct impacts on who accesses grammar schools.

What are the alternatives to ‘business as usual’?

Unlike other high stakes exams where alternative methods are available to assign grades – i.e. coursework grades, module marks, teacher evaluations – this route is not really feasible for the ‘11-plus’ exam. While all pupils in areas with selective schools are eligible to take the test, state primary schools are not allowed to spend time directly preparing children for the test and the test itself is standalone rather than being part of the child’s profile of work within the school year, making it difficult to transpose other assessments into a predicted outcome of the test. Moreover, teachers would be reluctant to risk undermining relationships with local parents if the future school destination of pupils – and everything this bifurcation entails – is determined solely by the teachers’ assessment.

Two policies could be implemented, with or without a deferral of the exam date. The first is the provision of a ‘pupil premium’ type of payment or voucher to allow lower-income families to access additional tutoring in English and maths for their year 5 children. This would help to mitigate some of the resource constraints faced by disadvantaged families, although it still raises questions over methods of delivery while social distancing remains in place. The second, an approach that is becoming more widely accepted in access to higher education, is the contextualising of marks on this year’s test, taking into account the socio-economic circumstances of each child. Marks are already adjusted in some settings to account for the month of birth of the child (adjusting up the scores of younger, summer-born children). This type of explicit adjustment, based on socio-economic status, would go some way towards acknowledging the differential experiences of these children in the lead-up to this important junction in their education.


Burgess, S., Crawford, C. and Macmillan, L. (2018) Access to grammar schools by socio-economic status. Environment and Planning A: Economy and Space, 50(7): 1381-1385.

[1] The index is constructed from the index of multiple deprivation (IMD) scores, A Classification of Residential Neighbourhoods (ACORN) categories (based on the socio-economic characteristics, financial holdings and property details of the 15 nearest households), and the proportion of the nearest 150 households working in professional or managerial occupations, with education at Level 3 (post-compulsory) or above and who own their own home, in addition to FSM eligibility.

Why wait until clearing to improve information available to students?

IOE Editor, 5 May 2020

By Dr. Gill Wyness

Incongruously announced in yesterday’s university support package, alongside the return of student number caps and the bringing forward of quality-related research funding and tuition fee payments, is UCAS’ new Clearing Plus system. Clearing Plus is a new service which “matches students to universities or other opportunities based on their achievements and course interests.”

The idea of improving the match between students and universities sounds promising. Research by CEPEO’s Gill Wyness and Lindsey Macmillan, with colleagues from UCL Institute of Education and the University of Texas at Austin, found a significant amount of mismatch in the UK system. Comparing undergraduate students’ qualifications to those of their fellow students revealed that around 15% of students were undermatched (attending a course that was less selective than expected, given their A-level results) and a similar proportion were overmatched (attending a course that was more selective than expected).
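
As a rough illustration of the idea (this is a simplified sketch, not the method used in the research; the percentile inputs and the 20-point tolerance band are invented for the example), one way to classify match is to compare a student's attainment percentile with the typical attainment of their course's intake:

```python
# Simplified sketch of under/overmatch classification: compare a student's
# attainment percentile with the median attainment percentile of their
# course's intake, using a tolerance band (all numbers illustrative).
from statistics import median

def classify_match(student_pct, course_intake_pcts, band=20):
    """Return 'undermatched', 'overmatched', or 'matched'."""
    course_level = median(course_intake_pcts)
    if student_pct - course_level > band:
        return "undermatched"   # course is less selective than expected
    if course_level - student_pct > band:
        return "overmatched"    # course is more selective than expected
    return "matched"

# Hypothetical example: a 90th-percentile student on a course whose
# intake clusters around the 55th percentile.
print(classify_match(90, [50, 55, 60, 52, 58]))
```

The research itself conditions on much richer information (subject difficulty, school, background), but the basic comparison of student quality against course quality is the core of any match measure.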

Of course, we have no information about students’ preferences, and it may be that undermatched students simply preferred the “less selective” course for any number of reasons. But our results also revealed that students from disadvantaged backgrounds were more likely to undermatch and less likely to overmatch, choosing less academically selective courses than their more advantaged counterparts at every point in the attainment distribution. In other words, if we take two students with exactly the same A-level grades, whether three As or three Es, the poorer student would end up on a less selective course than the richer student. We even accounted for the difficulty of the A-level subjects in our modelling, and this result still held.

When disadvantaged students behave in a way that is so systematically different from richer students, this has to ring alarm bells for social mobility. And interestingly, we found no evidence that geography, often cited as the key issue behind mismatch, was a factor in these socio-economic gaps. Nor was subject choice responsible. The school attended appeared to be the biggest factor, suggesting that the information provided by schools could be improved.

So, on this basis, Clearing Plus sounds promising. As part of a new “personalised Clearing system for students,” it will be produced in partnership with BBC Bitesize, and will allow unplaced applicants to “sign in to see their individual list of matched courses, and easily send an expression of interest to a university. Universities can then contact interested students, who will be able to add a new course to their UCAS application.”

Given our findings that mismatch is fairly prevalent in the UK, it seems strange that this new service will only be available to students who enter clearing. UCAS have pointed out that Clearing Plus can also be used by students who already have a confirmed place: however, to use the service, they first have to “self-release” into clearing. This sounds risky, especially this year, when students are already confused about the calculated grades system and worried about their chances of securing a place. We also know from existing research that low-SES students are more risk averse, so they may be less likely to take this risk, which could even exacerbate the SES gaps in match.

Surely a better idea would be to allow all students to avail themselves of a matching tool before they make their initial university choices, rather than once they’ve found themselves in clearing? We make this exact suggestion in our report on mismatch for the Nuffield Foundation, suggesting that the UCAS application service could be used to make course suggestions to students, based on their grades and subject preferences (and potentially other preferences such as location) at the point of application.

Even better, of course, would be a service that allows students to see their matches long before they start the application process. The Cabinet Office ‘nudge unit’ ran a successful trial in which high-achieving GCSE students, at schools that send high proportions of pupils to their local university, were sent a letter from a former student at their school encouraging them to apply to a better-matched university, given their grades. There has also been successful work on this kind of information and guidance in the US.

Why have we waited until now to improve the accuracy of predicted grades?

IOE Editor3 April 2020

By Dr. Gill Wyness

For those students expecting to take their A-Levels and BTECs this summer, the impact of COVID-19 will be profound. Instead of taking the formal examinations that they were preparing for, Ofqual confirmed today that school leavers will be provided with a set of grades based on teacher judgement, which will, in turn, form the basis of their university applications. This plan has attracted a fair amount of criticism, with fears that the system may be biased, and might lead to certain groups of students missing out on a university place because of a bad prediction.

But it is worth noting that this is already how students apply to university, so it is perhaps surprising that there is suddenly such widespread resistance to the idea of predicted grades. However, my recent study with Richard Murphy (University of Texas at Austin) suggests that fears that these predicted grades might be inaccurate may be well-founded.

The UK’s system of university applications has the peculiar feature that students apply to university on the basis of predicted rather than actual exam grades. In fact, only after they have applied, received offers, and shortlisted their preferred two courses do students go on to sit their A-level exams. If the student achieves grades in line with the offer (i.e. grade requirement) of their chosen university course, the course is bound to accept the student and the student is bound to go. If the student misses their offer (i.e. fails to achieve the grades their course required), the course may still accept them, or they may need to enter a process known as ‘clearing’, in which courses that still have places available are advertised for unplaced students to apply to. In short, A-level predictions are a very important feature of the university admissions system.

Surprisingly then, little is known about how accurate these predictions are, largely due to data constraints. Our study uses aggregate data on university applications to study the accuracy of predicted grades and to examine where students with different predictions end up. Our results show that only 16% of applicants achieve the A-level grade points that they were predicted to achieve, based on their best 3 A-levels. And the vast majority of applicants are over-predicted – i.e. their grades are predicted to be higher than they actually go on to achieve. This is in line with other related work (e.g. Dhillon, 2005).

We also find evidence of socio-economic status (SES) gaps in predicted grades: among the highest-achieving students, those from disadvantaged backgrounds receive predicted grades that are slightly lower than those from more advantaged backgrounds. This may have consequences for social mobility, since under-predicted students are more likely to be overqualified for the course they eventually enrol in.

One potential explanation for the inaccuracy of these grades is that, to date, the guidelines given to teachers have been lacking. Information on the UCAS website advises that “a predicted grade is the grade of qualification an applicant’s school or college believes they’re likely to achieve in positive circumstances”. UCAS also suggest that predicted grades should be “aspirational but achievable”, but that ‘inflated’ predictions are “not without risk, and could significantly disadvantage [applicants]”.

These guidelines are confusing at best, and it may not be surprising that predictions are typically inaccurate. Moreover, teachers may be using predictions as an incentive, a target for students to try to meet, rather than as a true picture of their ability. This is one explanation for the high degree of over-prediction we observe.

So, what does this mean for the students who will receive estimated grades this year (and who unlike previous students, will never learn what their grades would really have been on the day)?

If we see the same patterns of over-inflation in predictions, there could be a significant increase in the number of students qualifying for university (i.e. students who, in a normal year, would have missed their offers). This could prove tricky for university admissions staff, who may respond by adding extra criteria when choosing students, potentially leading universities to instigate their own entry tests.

However, a crucial difference in the grade predictions being made this year is that they are already under much more scrutiny than in previous years. Ofqual has already given detail on how grades should be judged and exam boards will be required to “put all centre assessment grades through a process of standardisation using a model developed by Ofqual”.

So we might actually end up with a fairer system than the one we have been using for the last 50 years. This raises the question of why we have waited until now to improve the accuracy of predicted grades, bearing in mind just how high-stakes they are.

Dr. Gill Wyness is Deputy Director of CEPEO and a Research Associate with the Centre for Economic Performance at LSE.