Financial literacy is not just about maths: why PISA should rethink its test
By Blog Editor, IOE Digital, on 23 July 2014
One mis-selling scandal after another has highlighted how bad adults are at managing their own financial resources. They trust the wrong people and believe in the wrong advice. The ease with which payday lenders found customers shows the difficulty people have with APR (annual percentage rate) figures. Even taking out a current account is fraught with risk. For young people about to stack up a huge amount of debt at university, financial literacy is vital, if only to avoid their parents’ mistakes.
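To see why APR figures trip people up, it helps to work one through. The sketch below is purely illustrative (the fee and loan term are invented for the example, not taken from any lender): a modest-looking 30-day charge compounds into a startling annual rate.

```python
# Illustrative only: convert a short-term payday charge into an effective APR.
# Assumed figures: a fee of 25 per 100 borrowed over a 30-day term.

def apr_from_period_rate(period_rate, periods_per_year):
    """Effective annual rate, assuming the periodic rate compounds each period."""
    return (1 + period_rate) ** periods_per_year - 1

fee_per_100 = 25.0                   # fee of 25 per 100 borrowed
period_rate = fee_per_100 / 100.0    # 25% per 30-day period
periods_per_year = 365 / 30

apr = apr_from_period_rate(period_rate, periods_per_year)
print(f"30-day rate: {period_rate:.0%} -> effective APR: {apr:.0%}")
```

A 25% monthly charge sounds far more manageable than the four-digit APR it implies, which is exactly the gap in understanding that payday lenders exploit.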
The UK Government sought to tackle the issue last year by announcing that financial literacy would become a compulsory part of England's national curriculum – included in both mathematics and citizenship classes. This covers about half the state schools in the country – with academies and free schools allowed to opt out.
Unfortunately, new findings from the OECD suggest that there is virtually no correlation between the amount of time spent teaching financial literacy in classrooms and students’ ability to answer questions on the subject. Nor did it matter much whether financial literacy was taught by teachers of mathematics, economics/business or social sciences.
But could their analysis be misleading?
The OECD’s PISA programme administered a two-hour test to half a million students from 16 countries to try to assess young people’s financial literacy. These countries included the US and China, but not the UK.
The report, published this month, showed a very high correlation between PISA results for mathematics and those for financial literacy. Consequently, China (Shanghai) came top by a large margin, while the USA performed below the 16-country average; Colombia came bottom. According to the PISA findings, the most important factor by far was how well the respondents had been taught mathematics.
But these results were not surprising, as the questions were strongly weighted towards numeracy (such as checking the accuracy of an invoice). An alternative explanation might be that this weighting made the correlation with maths results inevitable. If the questions had focused on a critical approach to financial issues and products, the results might have been different.
The research looked at many variables that might determine student performance at financial literacy. It concluded that the amount of time spent teaching financial literacy had no effect. For example, teachers in the United States spend a particularly long time on the topic – to no discernible benefit on test results. Furthermore, the quantity of teacher training had little impact on the results. On average about half of all teachers of financial literacy had received some training on the subject – but to no avail. (The quality of the teacher training was not assessed.)
Other variables that proved unimportant included country GDP per capita, gender and social background. More important was direct experience: those with a bank account came out with higher scores than those without.
Specifically, in answer to the question "What can be done to enhance financial literacy?", the PISA report says:
- Having a bank account is a better predictor of financial literacy than anything you do in the classroom.
- Teaching financial literacy is a poor predictor of success at financial literacy in these tests.
- Where financial literacy is taught as a separate subject (as in the USA), performance is not strong.
- Volume of exposure to financial education has no correlation with performance on the tests.
From PISA’s viewpoint the findings are clear: teach mathematics in a more conceptual, abstract manner (as in Shanghai) and students will be in a better position to apply their understanding to real contexts. The question remains: is the extremely high correlation between maths and financial literacy simply a result of the construction of the tests? If so, policy makers should take great care about drawing conclusions from PISA’s exercise.
What was the purpose behind such a numerically driven series of questions? There are many other factors in financial literacy, such as awareness of past problems and scandals (Bernard Madoff; Charles Ponzi) and of the pyramid schemes that young adults can often be caught up in. There is also the fundamental problem of asymmetric information, in which the seller of financial products knows so much more than the buyer. These issues were not addressed.
This leaves two possibilities: either teaching financial literacy is a waste of time (the PISA conclusion), or PISA did thorough research based on the wrong questions. In 2015 they will repeat the exercise (and may include Britain this time). Let us hope they think hard about whether they want to repeat the questions.