Questions about PISA 2018, part 2: Did certain schools select out of the study in England and Northern Ireland?
By IOE Blog Editor, on 22 April 2021
In my new paper released today, and forthcoming in the Review of Education, I report what I consider to be a worrying lack of transparency surrounding some aspects of the reporting of the PISA 2018 data for the UK.
This blog is the second in a series published today looking into some of these issues. Here I focus upon the non-response bias analyses that were conducted in England and Northern Ireland – but that never got reported in full.
Non-response bias analysis
PISA requires that 85% of sampled schools agree to take part in the study. If a country falls below this level then a “non-response bias analysis” must be conducted. The fear is that certain types of schools (e.g. those with lower-achieving pupils) may be less likely to participate – leading to biased results.
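The mechanics of this concern can be shown with a small numerical sketch. The figures below are invented purely for illustration and bear no relation to the actual PISA data: if schools with lower mean scores are less likely to respond, the mean among responding schools drifts above the true population mean.

```python
# Hypothetical illustration of school non-response bias.
# Each school is (mean pupil score, probability of agreeing to take part);
# lower-scoring schools are given a lower response probability.
# All numbers are invented for illustration only.
schools = [(score, 0.9 if score >= 500 else 0.5)
           for score in range(440, 561, 10)]

# Population mean: every sampled school weighted equally.
pop_mean = sum(s for s, _ in schools) / len(schools)

# Expected mean among responders: each school weighted by its
# probability of responding.
resp_mean = sum(s * p for s, p in schools) / sum(p for _, p in schools)

print(round(pop_mean, 1), round(resp_mean, 1))  # → 500.0 509.0
```

Even this crude setup shifts the estimated mean upwards by around nine points – which is why a non-response bias analysis is required in the first place.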
In PISA 2018, both England (72%) and Northern Ireland (66%) failed to meet the OECD’s school response rate criteria.
But, rather than publishing these bias analyses in full, the PISA reports published by the OECD and by England and Northern Ireland simply stated their interpretation of the results.
For instance, England’s national report stated only that the:
“analysis investigated differences between responding and non-responding schools and between originally sampled schools and replacement schools. This supported the case that no notable bias would result from non-response.”
While all that was said for Northern Ireland was that:
“The results of both NRBAs [non-response bias analyses] were positive meaning that the samples for UK and NI were representative and not biased.”
What on earth does this mean!? How was this analysis conducted and by whom? What does “positive results” mean? Who exactly reviewed this document? How was this judgement reached?
I have ended up having to use freedom of information legislation to find out exactly what was going on.
A summary of one of the key points coming out of this can be found in Table 1, comparing historical GCSE achievement of responding and non-responding schools in England. As this table clearly illustrates, schools with historically lower levels of achievement tended to be less likely to respond.
Personally, I don’t view this as being “positive” or as indicating that the sample is “not biased” in any way.
Table 1. Prior achievement of responding and non-responding schools in England.
Table 2 presents the equivalent investigations that were conducted in Northern Ireland (also obtained via a freedom of information request). As this illustrates, in Northern Ireland, non-grammar schools (which tend to have lower-achieving intakes) were also less likely to respond.
Otherwise, however, the “bias analysis” conducted seems to be extremely limited. Only a handful of school characteristics are considered, most of which are not strongly associated with pupils’ academic achievement. In other words, the analyses undertaken – particularly in the case of Northern Ireland – do not really provide much information as to whether the high levels of school non-response may have led to bias creeping into the sample.
Table 2. The sample of responding and non-responding schools in Northern Ireland.
I disagree with how the OECD, Department for Education and Northern Irish government have interpreted the results from the bias analyses conducted.
At best, the evidence is inconclusive. At worst, it points towards schools with lower-achieving pupils being less likely to take part.
Such evidence is always open to interpretation, of course. And I appreciate others may take a different view.
What is inexcusable, however, is that such information was not made more explicit by the OECD – and as part of England’s and Northern Ireland’s national reports – in the first place.
In my view, the UK’s Office for Statistics Regulation should conduct an independent investigation into why this important information was not published.
Note: the school response rates quoted above are before-replacement figures.
More questions about PISA 2018