IOE Blog

Expert opinion from IOE, UCL's Faculty of Education and Society

How shift to computer-based tests could shake up PISA education rankings

By Blog Editor, IOE Digital, on 19 February 2016

John Jerrim.

The world’s most important examination is moving online. Since the Organisation for Economic Co-operation and Development (OECD) launched the Programme for International Student Assessment (PISA) in 2000, it has provided an influential and timely update every three years on how 15-year-old schoolchildren’s mathematics, science and reading skills compare across the globe.
Poor performance has “shocked” a number of national governments into action, and they have embarked on a range of extensive reforms to their school systems.
Whereas each of the five test cycles between 2000 and 2012 was completed on paper, 58 of the 72 economies that participated in PISA 2015 between November and December last year administered the test on computers – including the UK.
My new research suggests that this shift is likely to influence the results of PISA 2015, which are due to be published towards the end of this year.
I drew upon data from 32 countries that completed both a paper and a computer mathematics test as part of PISA 2012 – the last round of this important global assessment.
There are some striking results. Average PISA paper and computer scores differ by more than ten test points in around a third of countries. The OECD has previously suggested that differences of such a magnitude are substantial.

Shifts in results

Shanghai is a particularly striking example: its average score under computer assessment fell by 50 PISA test points – equivalent to more than an entire year of schooling. In contrast, young people in the US saw their performance improve significantly – by 17 test points on average.
[Figure: country-by-country differences between average paper and computer PISA scores. Source: PISA 2012]
There are also differences between sub-groups of students who took the test. Somewhat against my expectations, the gap in test scores between richer and poorer pupils was, on average across countries, around five PISA test points smaller when the test was taken on computer rather than on paper.
Meanwhile, in Sweden and Russia, boys were suddenly better at mathematics than girls on the computer-based test – despite no gender differences being observed in these countries on the paper-based version of the test.

Why computer tests change things

There are several possible explanations for these stark differences in results. Previous research has suggested that performing tasks on paper and on computer requires different cognitive processes.
On the computer version of the test, important test-taking strategies, such as setting aside the most challenging questions to tackle at the end, are no longer possible: students cannot move on to the next part of a question, or to the next question, until they have finished the previous one. Computer tests can also be more engaging, particularly when answering a question correctly involves using on-screen interactive tools. On the flip side, we have all felt the frustration of slow operating systems and computers crashing.
The findings highlight some issues that will be important when it comes to interpreting the results of the forthcoming PISA 2015. Will we be able to compare results to previous cycles in order to monitor trends over time? Will results between countries taking paper and computer versions of the 2015 PISA test be comparable? Should we expect to see a fall in differences in PISA test scores between socio-economic groups? And what will this mean for international comparisons of differences in educational achievement between boys and girls?
Although we must wait to find out the answers to these questions, it is nevertheless clear that when children take paper and computer versions of similar tests, the results can differ quite notably.
This article was originally published on The Conversation. Read the original article.

