IOE Blog

Expert opinion from IOE, UCL's Faculty of Education and Society

A-level debacle has shattered trust in educational assessment

By Blog Editor, IOE Digital, on 18 August 2020

Students protest against A-level results, August 16 2020. Image: I. Salci/Shutterstock

Mary Richardson, UCL, first published on The Conversation

After five days of uncertainty and anxiety, Education Secretary Gavin Williamson announced on August 17 that students in England would be awarded their centre assessment grades (CAGs) this summer – that is, the grade their school or college expected they would most likely have attained had they taken their exams – or their moderated grade, whichever was higher.

The announcement follows widespread outrage after it emerged that the poorest students were hardest hit by the inadequacies of the algorithm used to moderate their grades.

Collective sighs of relief were palpable as teachers no longer faced the stress of an appeals process while also preparing to start one of the most complex and challenging years of their careers. Students, however, (more…)

Covid-19 and education: Why have we waited until now to improve the accuracy of predicted grades?

By Blog Editor, IOE Digital, on 3 April 2020

Gill Wyness

For students expecting to take their A-Levels and BTECs this summer, the impact of COVID-19 will be profound. Instead of taking the formal examinations that they were preparing for, Ofqual confirmed today that school leavers will be provided with a set of grades based on teacher judgement, which will, in turn, form the basis of their university applications. This plan has attracted a fair amount of criticism, with fears that the system may be biased, and might lead to certain groups of students missing out on a university place because of a bad prediction.

But it is worth noting that this is already how students apply to university, so it is perhaps surprising that there is suddenly such widespread resistance to the idea of predicted grades. However, my recent study with Richard Murphy (University of Texas at Austin) suggests that fears about the accuracy of these predicted grades may be well-founded.

The UK’s system of university applications has the peculiar feature that students apply to university on the basis of predicted rather than actual exam grades. In fact, only (more…)

This year, for the first time ever, more young women than men took science A Level … but it’s not yet time to celebrate

By Blog Editor, IOE Digital, on 27 August 2019

Emily MacLeod

On A Level results day earlier this month it was widely reported that girls had overtaken boys in science A Level entries for the first time ever. Female students accounted for 50.3% of all A Level science entries across the UK, compared to 49.6% last year. As part of a research team aiming to understand, and make recommendations for, increasing and diversifying participation in the sciences, I welcome this news.

As much of the media coverage suggested, increasing participation in STEM (Science, Technology, Engineering, Mathematics) subjects has long been a national priority. However, despite considerable efforts and expense to make the sciences more equitable, the status quo of science being male-dominated has proven, until now at least, resistant to change – and this year’s milestone (more…)

It’s time to ‘open up physics’ if we want to bring in more girls and shift the subject’s declining uptake  

By Blog Editor, IOE Digital, on 15 August 2018

Rebekah Hayes
Despite numerous campaigns over many years, getting more students to study physics after GCSE remains a huge challenge. The proportion of students in the UK taking physics at A level is noticeably lower than the proportion studying other sciences. This low uptake of physics, particularly by girls, has implications not only for the national economy but also for equity, especially as physics can be a valuable route to prestigious, well-paid careers.
The latest research from ASPIRES 2 explores why students do or do not continue with physics by focusing on students who could have chosen physics, but opted for other sciences instead.
ASPIRES 2 is a 10-year longitudinal study, tracking children’s science and career aspirations from ages 10–19. This briefing focuses on data collected when students were (more…)

GCSE and A-level results: it’s not just the grades that matter

By Blog Editor, IOE Digital, on 15 August 2017

Why GCSE and A Level subject choices matter. shutterstock

Jake Anders, UCL, and Catherine Dilnot, Oxford Brookes University
A-level results will soon be out, with more than 300,000 students eagerly waiting to find out if they’ve made the grade. Then come GCSE results, with even more students keen to find out how they’ve done.
Whether students are heading to university, into an apprenticeship or straight into employment, chances are they will all be wishing and hoping and dreaming and praying for a set of grades that reflects their level of academic accomplishment.
For would-be university applicants, there is often a requirement that students take a particular set of subjects at A-level – and achieve a certain grade – to be in with a chance of getting a place on a degree course. To study medicine, for example, it's often required that (more…)

Science and mathematics education for 2030: vision or dream?

By Blog Editor, IOE Digital, on 1 July 2014

Michael J Reiss
After three years of work and nine commissioned reports, the Royal Society has published its vision for science and mathematics education. It may not push Luis Suarez or Andy Coulson off the front pages but this is a most impressive document that deserves to have a major and long-lasting impact on UK science and mathematics education policy.
The committee that produced the report features a list of intellectual and societal heavyweights – if you don't have a knighthood, a damehood or a Nobel Prize, or you aren't a Fellow of the Royal Society, that may explain why you weren't invited to sit on it. Behind these titles sits a huge amount of expertise and very considerable passion to improve education.
The Vision aims to raise the general level of mathematical and scientific knowledge and confidence in the population by focusing on changes to how science and mathematics are taught to 5- to 18-year-olds. Some of its recommendations are already taking place, at least to some extent – for instance, that teachers should be trained to engage fully with digital technologies – but others are more contentious.
For example, the report calls for a move away from the current A level system to a Baccalaureate. Such a move would benefit not only science and mathematics but other subjects too. However, I won’t hold my breath to see if it happens – and it will certainly require a change of government. People have been calling for A levels to be replaced by a system with less early specialism for longer than I can remember.
The report also calls for the establishment of new, independent, expert bodies to provide stability in curriculum and assessment and allow teachers space to innovate in their teaching. Following the bonfire of the quangos after the last General Election, the need for such bodies has become clearer than ever. But who is to pay for them? This is not a report overburdened by economic analysis (there isn’t any). Perhaps the Royal Society and other funders need to step in and establish something akin to the successful Nuffield Council on Bioethics, which manages to be independent yet shapes national policy and practice.
Science and mathematics education are in a fortunate position in the UK, compared to many other subjects. Industry clamours for more STEM (science, technology, engineering and mathematics) graduates and technicians and the UK is an acknowledged world leader in STEM research. A decade ago, work by David Sainsbury, Alan Wilson, John Holman, Celia Hoyles and others helped turn around a long-running decline in the numbers of 16-year-olds choosing A levels in mathematics and the physical sciences. Let’s hope this report takes those successes to the next level.

What's in an A-level score? The new floor targets for post-16

By Blog Editor, IOE Digital, on 10 May 2013

Chris Husbands
The government is currently looking long and hard at the school accountability framework. In February, it published a thoughtful consultation document on Key Stage 4 accountability, and a similar document on Key Stage 2 is expected shortly. The headline performance measures for schools have always been the Key Stage 2 expectations for primary schools and the Key Stage 4 measures for secondary schools: the focus on Level 4 performance (at Key Stage 2) and on threshold GCSE performance at grades C and above (at Key Stage 4) has simultaneously focused minds and energy while driving behaviours in schools that concentrate resources and effort on marginal performance at critical grade boundaries. Nonetheless, the focus on floor targets has been a powerful driver of improved performance, especially in English and mathematics.
With little fanfare, the government has now published minimum performance standards for ‘Key Stage 5’ – that is, for 16-19 providers. The performance standards are long overdue: there is too much poor and often unviable provision at 16-19, and comparatively little sustained scrutiny of performance across the sector. The government is right to develop common expectations covering schools and colleges, and to try to develop indicators which assess performance in A-Levels and other academic and vocational qualifications taken at level 3. But at the same time that it is consulting intelligently about key stage 4 accountability, it appears to have developed indicators which will drive some perverse behaviours at key stage 5. The KS5 minimum standard will describe a school sixth form or college as underperforming if its results show that fewer than 40 per cent of students achieve an average point score per entry in academic or vocational qualifications equal to the fifth percentile of providers nationally.
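To make the shape of that indicator concrete, here is a minimal sketch in Python. It is illustrative only: the function names and figures are hypothetical, the DfE's actual calculation will differ in detail, and it reads "achieve a score equal to the fifth percentile" as achieving a score at or above that benchmark.

```python
import statistics

def ks5_benchmark(provider_avg_scores, percentile=5):
    """Hypothetical sketch: the benchmark is the 5th percentile of providers'
    average points per entry nationally. statistics.quantiles(n=100) returns
    the 1st to 99th percentile cut points."""
    return statistics.quantiles(provider_avg_scores, n=100)[percentile - 1]

def below_minimum_standard(student_avg_scores, benchmark, threshold=0.40):
    """Flag a provider if fewer than 40% of its students achieve an average
    points score per entry at or above the benchmark."""
    share = sum(s >= benchmark for s in student_avg_scores) / len(student_avg_scores)
    return share < threshold
```

However the calculation is done in practice, the important point is that the benchmark is norm-referenced against the sector and applied to a per-entry measure, which is where the flaw described below arises.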
The key flaw is simple, but technical. The current KS5 performance tables present two sets of data on institutions' achievement: an average points score per student, and an average points score per entry. The points score is derived from the national points tariff – 300 points for an A* at A-level, 270 for an A, 240 for a B and so on, and a parallel tariff for approved vocational qualifications. However, the KS5 minimum standard is set at an average points score per entry, not per student. The perverse incentives can easily be illustrated: imagine a student predicted to score CCE at A-level. She has an average points score per entry of 190 (570/3). But if the school were to counsel her to drop the subject in which she is predicted an E, her average score per entry rises to 210: the measure has shifted, but the performance of the school or college has not. In this instance, it's not clear that the tactic which serves the institution's best interests (narrowing her curriculum) also serves the student's. Of course, this is based on a single case, but some institutions are managing very small cohorts: almost 600 institutions have cohorts of fewer than 125 students. Given the indicator – the points score secured by fewer than 40% – institutional behaviour of this sort could make a difference.
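The arithmetic in that example can be set out as a short sketch in Python, using the tariff values implied by the 30-point steps quoted above (so C = 210, D = 180, E = 150, consistent with 570 points for CCE); the function names are mine, for illustration only.

```python
# Points tariff quoted above: 300 for an A* at A-level, 270 for an A, 240 for a B,
# and so on in 30-point steps (so C = 210, D = 180, E = 150).
TARIFF = {"A*": 300, "A": 270, "B": 240, "C": 210, "D": 180, "E": 150}

def avg_points_per_entry(grades):
    """The measure used for the KS5 minimum standard."""
    return sum(TARIFF[g] for g in grades) / len(grades)

def total_points_per_student(grades):
    """The per-student view of the same results."""
    return sum(TARIFF[g] for g in grades)

full_programme = ["C", "C", "E"]   # the student predicted CCE
narrowed = ["C", "C"]              # the E subject is dropped

print(avg_points_per_entry(full_programme))      # 190.0
print(avg_points_per_entry(narrowed))            # 210.0 -- the per-entry measure improves
print(total_points_per_student(full_programme))  # 570
print(total_points_per_student(narrowed))        # 420   -- the student leaves with less
```

Seen side by side, the per-entry measure rewards narrowing the student's programme, while a per-student measure records exactly what she has lost.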
The relationship between the average points score per student and the average points score per entry is strong: that is, schools and colleges with a high average points score per student tend also to have a high average score per entry. The graph sets out the relationship based on A-level scores in 2012, with the red line indicating the lowest quintile of institutions. This is partly a consequence of a strongly selective post-16 structure in which some institutions set relatively high entry requirements at GCSE – and note that the DfE KS5 floor target is a norm-referenced measure against the performance of the sector as a whole, rather than a progress measure from 16, for which the data do exist. But the relationship is not absolute, and it is weakest in the lowest quintile of performers, again suggesting considerable scope for institutional response to perceived signals in the accountability regime.
Graph: relationship between average A-level score per entry and average A-level score per student, 2012.
It would be relatively easy to replace the planned per entry indicator with a per student indicator. As the graph indicates, this would be neutral for most institutions, but it would send important signals to those institutions that may be at risk of receiving a notice to improve: it is students who matter.