IOE Blog

Expert opinion from IOE, UCL's Faculty of Education and Society

Financial literacy is not just about maths: why PISA should rethink its test

By Blog Editor, IOE Digital, on 23 July 2014

Ian Marcouse
One mis-selling scandal after another has highlighted how bad adults are at managing their own financial resources. They trust the wrong people and believe in the wrong advice. The ease with which payday lenders found customers shows the difficulty people have with APR (annual percentage rate) figures. Even taking out a current account is fraught with risk. For young people about to stack up a huge amount of debt at university, financial literacy is vital, if only to avoid their parents’ mistakes.
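To see why APR figures trip people up, here is a minimal sketch in Python (using hypothetical figures, not taken from any real lender) of how a modest-sounding monthly fee annualises into the four-digit APRs that payday lenders must quote:

    # Hypothetical illustration: how a small per-term fee becomes a huge APR.
    def compound_apr(fee, principal, term_days):
        """Annualise a per-term charge with compounding
        (the consumer-credit APR convention)."""
        period_rate = fee / principal
        return ((1 + period_rate) ** (365 / term_days) - 1) * 100

    # Borrow 100 for 30 days for a fee of 24: "24 per 100" sounds manageable,
    # but the annualised, compounded rate is roughly 1,270%.
    print(f"APR: {compound_apr(fee=24, principal=100, term_days=30):,.0f}%")

A borrower who anchors on the "24%" fee rather than the compounded annual rate badly underestimates the cost of rolling the loan over.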
The UK Government sought to tackle the issue last year by announcing that financial literacy would become a compulsory part of England’s national curriculum – included in both mathematics and citizenship classes. The requirement covers only about half the state schools in the country, as academies and free schools are allowed to opt out.
Unfortunately, new findings from the OECD suggest that there is virtually no correlation between the amount of time spent teaching financial literacy in classrooms and students’ ability to answer questions on the subject. Nor did it matter much whether financial literacy was taught by teachers of mathematics, economics/business or social sciences.
But could their analysis be misleading?
The OECD’s PISA programme administered a two-hour test to half a million students from 16 countries to try to assess young people’s financial literacy. These countries included the US and China, but not the UK.
The report, published this month, showed a very high correlation between PISA results for mathematics and those for financial literacy. Consequently China (Shanghai) came top by a large margin, while the USA performed below the 16-country average. Bottom came Colombia. According to the PISA findings the most important factor by far was how well the respondents had been taught mathematics.
But these results were not surprising, as the questions were strongly weighted towards numeracy (such as checking the accuracy of an invoice). An alternative explanation might be that this weighting made the correlation with maths results inevitable. If the questions had focused on a critical approach to financial issues and products, the results might have been different.
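That alternative explanation can be made concrete with a toy simulation – a sketch of this post’s argument using entirely made-up numbers, not PISA’s data or methods. If a financial literacy score is largely built out of numeracy items, a strong correlation with maths scores is close to guaranteed by construction:

    # Toy simulation with made-up numbers: when a "financial literacy" score
    # is mostly a numeracy score, a strong correlation with maths is built in.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000
    maths = rng.normal(size=n)         # underlying maths ability
    critical = rng.normal(size=n)      # separate "critical finance" trait

    for w in (0.2, 0.5, 0.9):          # share of the test that is pure numeracy
        fl_score = w * maths + (1 - w) * critical
        r = np.corrcoef(maths, fl_score)[0, 1]
        print(f"numeracy weight {w:.1f} -> correlation with maths {r:.2f}")

With a 0.9 numeracy weighting the simulated correlation approaches 0.99, even though the test measures almost nothing distinctively “financial” at all.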
The research looked at many variables that might determine student performance in financial literacy. It concluded that the amount of time spent teaching financial literacy had no effect. For example, teachers in the United States spend a particularly long time on the topic – to no discernible benefit on test results. Furthermore, the quantity of teacher training had little impact on the results. On average about half of all teachers of financial literacy had received some training on the subject – but to no avail. (The quality of the teacher training was not assessed.)
Other variables that proved unimportant included country GDP per capita, gender and social background. More important was direct experience: those with a bank account came out with higher scores than those without.
Specifically, in answer to the question “What can be done to enhance financial literacy?”, the PISA report finds that:

  • Having a bank account is a better predictor of financial literacy than anything done in the classroom.
  • Classroom teaching of financial literacy is a poor predictor of performance on these tests.
  • Where financial literacy is taught as a separate subject (as in the USA), performance is not strong.
  • The volume of exposure to financial education has no correlation with test performance.

From PISA’s viewpoint the findings are clear: teach mathematics in a more conceptual, abstract manner (as in Shanghai) and students will be in a better position to apply their understanding to real contexts. The question remains: is the extremely high correlation between maths and financial literacy simply a result of the construction of the tests? If so, policy makers should take great care about drawing conclusions from PISA’s exercise.
What was the purpose behind such a numerically driven series of questions? There are many other factors in financial literacy, such as awareness of past problems and scandals (Bernard Madoff; Charles Ponzi) and of the pyramid schemes that often ensnare young adults. There is also the fundamental problem of asymmetric information, in which the seller of financial products knows far more than the buyer. These issues were not addressed.
This leaves two possibilities: either teaching financial literacy is a waste of time (the PISA conclusion) or perhaps PISA did thorough research based on the wrong questions. In 2015 they will repeat the exercise (and may include Britain this time). Let us hope they think hard about whether they want to repeat the questions.
One Response to “Financial literacy is not just about maths: why PISA should rethink its test”

    Tim Mercer wrote on 13 August 2014:

    I see this issue as a parallel to just about everything in teaching that highlights the difference between a theoretical and a practical understanding. We learn at a deeper level when our learning is focused on solving practical problems in which we have some emotional investment. The irony here is that one of the most practically financially literate individuals is likely to be someone who has just come out of a debt counselling plan. Financial literacy is a lifetime’s learning that follows our needs and experiences.
    I spent over 20 years working in the City. In the last couple of years I have decided to manage my pension myself, and I am learning so much more, at a deeper, more practical level.
    I completed my Mountain Leader assessment in July, which involved an enormous amount of map and compass work for micro-navigation. All of it could be taught in a classroom; none of it comes close to the deeper learning gained through practical application on the hill.
    Financial literacy is a competency that is likely to gain more learning traction the closer it is to relevant experiences, e.g. reading the payslip from your first job, or saving nan and grandad’s birthday money.
    Young Enterprise is a great step for accounting because the students have skin in the game and, as simulations go, it is an excellent on-ramp to the real thing.
    Yes, it is possible to use stock market games like Ifs Investor to simulate more complex scenarios, but in reality that learning is a long way from practical application, so it is not likely to be of much long-term value on the personal front. Added to that, these games are more about trading than investment, so it’s akin to training students to read the form on horses – a different form of financial literacy!
    So I think PISA is looking in the wrong place. These comparison tables are about comparing country achievement at an academic rather than a practical level. The more I teach, the more I see the difference between academic and practical understanding, and the more I feel that basic competencies fall into the latter category: they should be delivered at the appropriate time, when emotional investment – and hence interest in learning – is high. They are less appropriate for comparison tables, because individuals develop at their own pace in accordance with their needs and experience. A 16-year-old entrepreneur will probably demonstrate greater real-life financial literacy than a 21-year-old Financial Mathematics graduate, but will probably underperform in a PISA test.