Biometrics in schools – Big Brother technology or an opportunity for human flourishing?
By Blog Editor, IOE Digital, on 11 October 2021
Twenty-first-century schools can be complex places to manage and attend. Schools have grown substantially in size over the last couple of generations, and now need complicated systems of control and regulation. The Biometrics Institute Congress will discuss biometrics, artificial intelligence and privacy on 13 October, and for the first time this will include their impact on education.
One of the primary issues for governing bodies and local education authorities is reconciling the need for bureaucratic efficiency with their duty to act in loco parentis – ensuring that the children in their care are where they should be, engaged in appropriate activities at the right time, and being fed at appropriate intervals. Developers have sought to support schools (and monetise solutions to any number of management problems) through digital products for, among other things, attendance monitoring, assessment, accounting and auditing.
It is within this commercial framework that we find biometrics proliferating, and with it, associated privacy concerns.
Biometrics – the use of individual biological markers, such as fingerprints and iris scans – began to be adopted by mainstream schools around fifteen years ago, mainly in countries such as the US and UK, the Netherlands, Belgium and France. They were used in products for monitoring expenditure on school meals and access to library books, and occasionally for controlling access to buildings.
Early adoption figures for these functions are hard to come by, but it is estimated that by 2014, at least 40% of the UK school population had been fingerprinted for such purposes, or registered for palm vein readers or facial recognition systems.
The use of biometrics in schools has always met with controversy. Some countries and jurisdictions banned it early on, for example US states such as Arizona, Illinois, Iowa, Maryland, Michigan and Florida, as well as countries such as Germany. In other countries and regions there were protests. As early as 2005, the French ‘Group Against Biometrics’ went so far as to smash palm readers in schools.
Privacy concerns are generally twofold. Basic systems used for school meals, library book loans and room access engender concerns about the misuse of personal data, the likelihood of mission creep (data collected for one purpose being used for another), and the danger of databases being illicitly cross-referenced to identify individuals. This is technically possible but difficult, as so few data points are collected for fingerprint or facial recognition purposes (although this has increased in recent years). The limited data points also help explain why biometric systems prove so unreliable in schools. For example, meal credits and debits are frequently applied to the wrong transactions because schools receive little guidance on setting the system’s False Accept Rate and False Reject Rate – the thresholds that govern how often the wrong child is matched, or the right child rejected – in a way that ensures fairness. As a result, the same 10-20 children’s school meals accounts are regularly mixed up (much to their chagrin).
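The trade-off between false accepts and false rejects can be made concrete with a minimal sketch. The function and the match scores below are entirely hypothetical, for illustration only – real systems use vendor-specific scoring – but they show why a poorly chosen threshold routinely matches the wrong child or rejects the right one:

```python
# Illustrative sketch of the False Accept / False Reject trade-off.
# All scores and thresholds here are hypothetical, not drawn from
# any real school biometric system.

def rates(genuine, impostor, threshold):
    """Return (false_accept_rate, false_reject_rate) at a threshold.

    A match is accepted when its similarity score >= threshold.
    False accepts: impostor comparisons that pass anyway.
    False rejects: genuine comparisons that fail anyway.
    """
    far = sum(s >= threshold for s in impostor) / len(impostor)
    frr = sum(s < threshold for s in genuine) / len(genuine)
    return far, frr

# Hypothetical similarity scores (0 = no match, 1 = perfect match)
genuine = [0.91, 0.85, 0.78, 0.95, 0.66, 0.88]   # same child, rescanned
impostor = [0.42, 0.55, 0.61, 0.38, 0.70, 0.49]  # different children

for t in (0.5, 0.65, 0.8):
    far, frr = rates(genuine, impostor, t)
    print(f"threshold={t:.2f}  FAR={far:.2f}  FRR={frr:.2f}")
```

Lowering the threshold reduces false rejects but lets more wrong children through, and vice versa; a school that never tunes this setting inherits whichever failure mode the vendor's default produces.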
There are also privacy concerns about higher-stakes applications, where a failure of a biometric system can have serious personal consequences, usually disciplinary ones. This has been greatly amplified by the introduction of behavioural biometrics to educational settings in more recent years. An example is virtual proctor software for the remote monitoring of pupils sitting exams at home, which came to prominence during the COVID-19 pandemic. These systems track the most minute eye movements, amongst other things, and use a proprietary algorithm to diagnose ‘cheating’. With their opaque analysis and limited training populations, such systems are routinely perceived as oppressive and unfair by many examinees.
Another example of high-stakes biometrics is the introduction of ‘emotional’ biometrics for behaviour management in the classroom, usually in an experimental capacity, as in the case of a system tested in a Chinese middle school by Hikvision Digital Technology. This system was designed to assess whether pupils were paying attention in class by judging whether their facial expressions were happy, sad, angry, surprised or neutral. It met resistance from parents and was quickly withdrawn. In the academic literature, the introduction of such technologies centred on the audit of the physical body has been described by Swauger (2020, Chapter 6) as representing a ‘punitive pedagogy’, with power and control at the centre of the product design, rather than, say, the growth of knowledge, or human flourishing.
The biometrics industry now stands at something of a crossroads. It can continue to develop and test products on what is a captive population in schools, with minimal attention paid to the social consequences of their long-term use. Alternatively, it can involve stakeholders much more closely in the development of products, through collaborative approaches with significantly less commercial secrecy, so that proper scrutiny can take place and products can be revised and adapted as appropriate. ‘Stakeholders’ here should mean pupils and students, teachers and parents, rather than the finance departments or senior management teams who might be involved in high-level procurement. There also need to be more extensive training populations, in order to take into account diverse cultural and racial backgrounds, as well as any special educational needs. This provides for an ethical approach to technological tools that are having increasingly profound social consequences.