IOE Blog

Expert opinion from IOE, UCL's Faculty of Education and Society

Has peak PISA passed? A look at the attention international assessments receive

By Blog Editor, IOE Digital, on 23 February 2023

By John Jerrim

Once upon a time, when Michael Gove was Secretary of State for Education, PISA was all the rage (for the uninitiated, PISA is the Programme for International Student Assessment, which compares the performance of 15-year-olds across nearly 100 countries in reading, mathematics and science). As I noted at the time, international evidence was then in vogue, with PISA in particular featuring prominently in education debates. But is PISA now receiving less attention than it used to? In a new academic paper, I take a look… (more…)

PISA: England’s schools segregate by ability more than almost every other country in the world

By Blog Editor, IOE Digital, on 24 September 2019

John Jerrim.

In education systems across the world, children are separated into different groups based upon their academic achievement. This is done in different ways.

Countries such as Germany, the Netherlands and Northern Ireland ‘track’ pupils of high and low achievement into different schools (as do parts of England – Kent, for instance – that have retained grammar schools).

Others rely more heavily upon within-school ability grouping, whether through setting and streaming or by grouping higher- and lower-achieving children within the same class.

A whole host of research has compared how much countries segregate higher- and lower-achieving pupils into different schools. But there has been little work on the extent to which different countries group high and low achievers together when they go to the same school.

(more…)
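As an illustrative aside (my addition, not from the post above): one standard way such between-school segregation is quantified is the dissimilarity index. The post does not say which measure the underlying research uses, so the sketch below is just one common choice, with made-up school counts.

```python
# A minimal sketch of the dissimilarity index, D: 0 means high and low
# achievers are spread evenly across schools; 1 means they attend
# entirely separate schools. All counts below are hypothetical.

def dissimilarity_index(high_counts, low_counts):
    """D = 0.5 * sum_i |h_i/H - l_i/L|, summing over schools i."""
    H, L = sum(high_counts), sum(low_counts)
    return 0.5 * sum(abs(h / H - l / L)
                     for h, l in zip(high_counts, low_counts))

# Four hypothetical schools, strongly sorted by achievement:
print(dissimilarity_index([90, 80, 10, 20], [10, 20, 90, 80]))  # 0.7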

How similar are the PISA and TIMSS studies?

By Blog Editor, IOE Digital, on 4 December 2017

Christina Swensson
This is the fifth in a series of blogs that delve below the headline findings from the 2015 Trends in International Mathematics and Science Study (TIMSS). This blog investigates the similarities between TIMSS and the Programme for International Student Assessment (PISA), another large-scale study designed to assess pupil achievement across a number of countries. So how do the headline findings from the two studies compare?
PISA and TIMSS Cycles
TIMSS, administered by the IEA, has been carried out every four years since 1995, a total of six study cycles. The OECD started its own large-scale international survey in 2000 and has been running the Programme for International Student Assessment (PISA) every three years since then, also a total of six study cycles. The two studies do not normally coincide (more…)
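As a quick aside of my own (not part of the original post), the two cycles described above can be enumerated directly to see how rarely the survey years overlap:

```python
# Enumerate the survey years described above: TIMSS every 4 years from
# 1995, PISA every 3 years from 2000, six cycles each up to 2015.

timss = set(range(1995, 2016, 4))   # {1995, 1999, 2003, 2007, 2011, 2015}
pisa = set(range(2000, 2016, 3))    # {2000, 2003, 2006, 2009, 2012, 2015}

print(sorted(timss & pisa))         # [2003, 2015] – the only overlaps
```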

What does the TIMSS 2015 international encyclopedia tell us about how our curriculum and assessment compare with other countries'?

By Blog Editor, IOE Digital, on 1 December 2017

Tina Isaacs and Christina Swensson.
This is the fourth in a series of blogs that delve below the headline findings from the 2015 Trends in International Mathematics and Science Study (TIMSS).
This blog focuses on what TIMSS can tell us about other countries’ curriculum and assessment systems. It compares information about England, which appeared in the top 10 in three of the four TIMSS assessment areas in 2015, with that of six other high-performing jurisdictions – Hong Kong, Japan, South Korea, Russia, Singapore and Taiwan. All of these comparator countries featured in the top 10 across all (more…)

TIMSS 2015: do teachers and leaders in England face greater challenges than their international peers?

By Blog Editor, IOE Digital, on 8 November 2017

Toby Greany and Christina Swensson. 
This is the third in a series of blogs that delve below the headline findings from the 2015 Trends in International Mathematics and Science Study (TIMSS). In this blog, we focus on how the perceptions of teachers and school leaders in England compare with those of their peers in other countries.
Just under 300 English primary and secondary schools took part in TIMSS 2015. The headteachers of these schools, as well as the mathematics and science teachers of randomly selected Year 5 and Year 9 classes, were asked to complete a background questionnaire asking their views on a range of issues. Given the way teachers were selected to participate in TIMSS, their responses do not present a representative view of all teachers and headteachers in England. Therefore, we compare the findings from TIMSS with findings from (more…)

England’s performance in TIMSS 2015: a 20 year story of improvement?

By Blog Editor, IOE Digital, on 29 November 2016

Toby Greany
The Trends in International Mathematics and Science Survey (TIMSS) now provides 20 years' worth of internationally comparable data on the mathematics and science performance of primary and secondary pupils worldwide, and the contexts in which they learn. England has participated in the study, which is now in its sixth four-yearly cycle, since its inception in 1995. The 2015 national report, which I and a team from the UCL Institute of Education authored for the Department for Education, can be found here: TIMSS 2015. (more…)

People having a pop at PISA should give it a break…

By Blog Editor, IOE Digital, on 30 July 2013

John Jerrim

For those who don’t know, the Programme for International Student Assessment (PISA) is a major cross-national study of 15-year-olds’ academic abilities. It covers three domains (reading, maths and science), and since 2000 has been conducted every three years by the OECD. This study is widely respected, and highly cited, by some of the world’s leading figures – including our own Secretary of State for Education, Michael Gove.
Unfortunately, not everyone agrees that PISA is such an authoritative assessment. Over the last month it has come in for serious criticism from academics, including Svend Kreiner (PDF) and Hugh Morrison (PDF). These interesting and important studies have been followed by a number of media articles criticising PISA – including a detailed analysis in the Times Educational Supplement last week.
As someone who has written about (PDF) some of the difficulties with PISA, I have read these studies (and the subsequent media coverage) with interest. A number of valid points have been raised, and they point to various ways in which PISA may be improved (the need for PISA to become a panel dataset – following children throughout school – raised by Harvey Goldstein is a particularly important one). Yet I have also been frustrated to see PISA described as “useless”.
This is a gross exaggeration. No data or test is perfect, particularly when tackling a notoriously difficult task such as cross-country comparison, and that includes PISA. But to suggest it cannot tell us anything important or useful is very wide of the mark. For instance, if one believed that PISA did not tell us anything about children’s academic ability, then it should not correlate highly with our own national test measures. But this is not the case. Figure 1 illustrates the strong (r = 0.83) correlation between children’s PISA maths test scores and their performance in England’s old Key Stage 3 national exams. This shows that PISA scores are in fact strongly associated with England’s own measures of pupils’ academic achievement.
Figure 1. The correlation between PISA maths and Key Stage 3 maths test scores
Source: https://www.education.gov.uk/publications/eOrderingDownload/RR771.pdf page 100
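As an illustration of my own (not the author's analysis), the sketch below shows how a Pearson correlation like the r = 0.83 reported above is computed from paired pupil-level scores. The numbers are made-up placeholders, not the real study data.

```python
# Pearson correlation between two hypothetical sets of paired scores.
from statistics import correlation  # Python 3.10+

pisa_maths = [480, 512, 455, 530, 498, 470, 545, 505]  # placeholder PISA scores
ks3_maths = [5.1, 6.0, 4.6, 6.4, 5.5, 4.9, 6.8, 5.8]   # placeholder KS3 levels

print(f"r = {correlation(pisa_maths, ks3_maths):.2f}")  # near +1: strong agreement
```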
To take another example, does the recent criticism of PISA mean we actually don’t know how the educational achievement of schoolchildren in England compares to other countries? Almost certainly not. To demonstrate this, it is useful to draw upon another major international study of secondary school pupils’ academic achievement, TIMSS. This has different strengths and weaknesses relative to PISA, and at least partially overcomes some of the recent criticisms. The key question is: does it tell us the same broad story about England’s relative position?
The answer is yes – as Figure 2 shows. PISA 2009 maths test scores are plotted along the horizontal axis and TIMSS 2011 maths test scores along the vertical axis. I have fitted a regression line to illustrate the extent to which the two surveys agree over the cross-national ranking of countries. Again, the correlation is very strong (r = 0.88). England is hidden somewhat under a cloud of points, but is highlighted with a red circle. Whichever study we use to look at England’s position relative to other countries, the central message is clear. We are clearly way behind a number of high-performing East Asian nations (the likes of Japan, Korea and Hong Kong) but quite some way ahead of a number of low- and middle-income countries (for example Turkey, Chile and Romania). Our exact position in the rankings may fluctuate a little (due to sampling variation, differences in the precise skills tested and sample design), but the overall message is that we are doing okay, while other countries are doing a lot better.
Figure 2. The correlation between PISA 2009 and TIMSS 2011 Maths test scores
Source: Appendix 3 of http://johnjerrim.files.wordpress.com/2013/07/main_body_jpe_resubmit_final.pdf
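Again, a sketch of my own rather than the author's code: fitting a regression line through country-level means and computing the correlation, as in Figure 2. The country scores below are rough placeholders, not the published PISA or TIMSS results.

```python
# Fit a line through hypothetical country-level maths means and report r.
import numpy as np

pisa_2009 = np.array([529, 546, 487, 445, 427, 555])   # placeholder country means
timss_2011 = np.array([570, 613, 507, 452, 458, 586])  # placeholder country means

slope, intercept = np.polyfit(pisa_2009, timss_2011, 1)  # least-squares line
r = np.corrcoef(pisa_2009, timss_2011)[0, 1]             # Pearson correlation
print(f"TIMSS ~= {slope:.2f} * PISA + {intercept:.1f}, r = {r:.2f}")
```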
I think what needs to be realised is that drawing international comparisons is intrinsically difficult. PISA is not perfect, as I have pointed out in the past, but it still contains useful and insightful information. Indeed, there are a number of other areas – ‘social’ (income) mobility being one – where cross-national comparisons rest on a much less solid foundation. Perhaps we in the education community should be a little more grateful for the high-quality data that we have, rather than focusing on the negatives all the time, while of course looking for further ways it can be improved.
For details on my work using PISA, see http://johnjerrim.com/papers/

Before we compare mathematics, reading or science, here's some geography

By Blog Editor, IOE Digital, on 13 December 2012

 Chris Husbands
Two major studies of international attainment in education have been published: the four-yearly Trends in International Mathematics and Science Study (TIMSS) and the five-yearly Progress in International Reading Literacy Study (PIRLS). Both have been extensively reported, and they tell us quite different things. On international comparisons, says the BBC, England “has slipped in science, but is top 10 for primary and secondary maths”. The Daily Telegraph account, picking up on the science story, noted that England had fallen behind Slovakia and Hungary, whilst the BBC, looking at mathematics performance, declared that English pupils were amongst the best in the world.
Looking beyond the headlines, Chris Cook in the Financial Times extracted data on the performance of Chinese-heritage students in England from GCSE results and, with the statistical wizardry of which he is capable, showed that such students in England perform almost as strongly as students in the high-performing Pacific Rim systems.
International rankings of educational performance are now a routine feature of the education news cycle. The OECD’s PISA rankings come around every three years, on a different cycle from PIRLS and TIMSS. My calculation is that edu-statistics geeks are going to have to wait until 2052 for the three to coincide – which makes 2052 something like one of those rare astronomical events when several planets line up.
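A back-of-the-envelope sketch of my own, not the author's: whether the answer comes out as 2051 or 2052 depends on whether you count survey years or the reporting that follows a year or so later (PISA 2009 was reported in December 2010; TIMSS and PIRLS 2011 in December 2012).

```python
# Next year in which the three survey cycles coincide, assuming PISA
# every 3 years from 2000, TIMSS every 4 from 1995, PIRLS every 5 from 2001.

pisa = set(range(2000, 2100, 3))
timss = set(range(1995, 2100, 4))
pirls = set(range(2001, 2100, 5))

print(min(pisa & timss & pirls))  # 2051 – with reports roughly a year later
```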
Of course, PISA, PIRLS and TIMSS measure slightly different things at different ages – so the discrepancies between their methods, foci and findings prompt endless debate about their significance. The indefatigable North American blogger Yong Zhao has explored the complexities of what the figures show and suggested that the surface interpretation of PISA, TIMSS and PIRLS is almost certainly misleading. Each, moreover, is taken by a changing set of countries, so that arguments about whether performance is rising or falling are endless.
But there is another, intriguing debate to be had about PISA, PIRLS and TIMSS. In this post so far I have written about national performance. But all three studies draw on samples, and the sampling frames differ. So PISA compares, for example, Korea, which is a country, with Hong Kong and Macau, which are special administrative regions, and with Ontario, which is a province of Canada. TIMSS includes results for Massachusetts and Florida, which are not countries or provinces but states of the Union. This is a puzzle for comparison. We know, for example, that in England schools in London out-perform schools nationally – though they have not always done so – and one report (again in the Daily Telegraph) suggested that the Department for Education in England is considering entering each English region separately in PISA 2015. But English regions are neither provinces, nor states, nor administrative regions, nor countries: they are geographical agglomerations of local authorities.
In most descriptions of international performance, the comparisons are said to be not between countries but between “jurisdictions”. But educational jurisdictions are rarely defined, and sloppy comparisons follow between quite different entities. An educational jurisdiction should be a unit that shares some common characteristics: for starters, common curriculum and assessment standards; teacher recruitment, development and remuneration policies; and overall system development policies. It would make little sense to break a jurisdiction which shares these characteristics (for example, England) into smaller units – although that is exactly what PISA does in publishing results for Shanghai rather than for China.
It makes sense to treat Hong Kong separately from Shanghai: they are very different education systems with different histories, linguistic traditions and educational structures. It probably makes sense to treat Ontario differently from Alberta in PISA: in Canada, education is strongly a provincial responsibility. American states are a different matter, however: as increasing federal funds are allocated to states conditional on conformity with US-wide educational requirements, the responsibilities are shifting. But even national comparisons are fraught with difficulty: is it reasonable to compare New Zealand (population 4 million) with Japan (population 120 million)? The conventional argument is that Asian jurisdictions – with often highly centralised education systems, in cultures which place a high premium on education – do exceptionally well in mathematics assessments. A cheekier reading of PISA might suggest that smaller jurisdictions simply do better than larger ones.
There is a vast amount to be learnt from international comparisons of educational performance, and many countries have reported “PISA shock” when results suggest hitherto unexpected weaknesses. But it is worth remembering that the unit of comparison is often complex. International comparison can be full of pitfalls – and one of them is the question: what is a jurisdiction?