The Problem with Facial Recognition in Schools

When used in schools, facial recognition poses risks of serious, potentially unquantifiable damage to students, staff and other data subjects. The technology is new and largely unregulated, and because a digital map of a face can be combined with apps, the internet and other technology, a facial map that is lost or published publicly can be used to track the individual it relates to in many different ways. In data protection law, this impact is known as material and non-material damage, and it leaves the data controller (and in some cases the processor) open to claims for damages from data subjects as well as enforcement action from the data protection regulator.

But why is facial recognition technology a threat? Everyone uses it on their phones.

There’s a distinct difference between how the technology works on your phone, where you choose to register your own face, and the technology used by companies that include a facial recognition feature within their product or service (such as cashless catering or access control). When a company integrates facial recognition into its product or service, it needs hardware and software to make it work. The hardware is often bought off the shelf; the software is bought from a business that writes the code to take an image of a person, create a map of the face, and then apply an algorithm that breaks the link to the original image, producing what is known as a ‘hash’ of the identifier. When marketing or selling ‘facial recognition’ product features, vendors often state that the hash cannot be correlated back to the image, which is strictly true. They also state that the hash is encrypted and therefore impossible to read by anything other than the facial recognition software, which is less true: history has shown time and time again that encryption can eventually be broken.
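To make that ‘strictly true, but still dangerous’ point concrete, here is a minimal sketch in Python. Both functions are hypothetical stand-ins (no real vendor model or product is shown): it illustrates that while a one-way hash cannot be turned back into the photo, it is still a stable identifier for one person, so a leaked hash still enables tracking.

```python
import hashlib
import secrets

# Hypothetical stand-ins only: a real system would use a trained
# face-embedding model and a vetted cryptographic library.

def extract_face_map(image_bytes: bytes) -> list[float]:
    """Stand-in for a vendor's face-mapping step: reduces an image to a
    fixed-length numeric template. Real models are proprietary."""
    digest = hashlib.sha256(image_bytes).digest()
    return [b / 255 for b in digest]  # toy 32-value 'facial map'

def template_hash(face_map: list[float]) -> str:
    """One-way hash of the template. Strictly true: you cannot rebuild
    the photo from this string. Also true: it remains a stable identifier
    for one person, so a leaked hash still enables tracking."""
    raw = ",".join(f"{x:.6f}" for x in face_map).encode()
    return hashlib.sha256(raw).hexdigest()

photo = secrets.token_bytes(1024)              # placeholder for a captured image
print(template_hash(extract_face_map(photo)))  # same face map -> same hash, every time
```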


Technical risks of facial recognition technology

Using cashless catering as an example: there’s a camera to take images and check identity; software to drive the camera and capture an image; once the image has been taken, it is handed to another piece of software that builds the facial map; that map is handed to yet another piece of software that applies an algorithm to turn it into a template; and a final piece of software encrypts the result. It is very rare for each of these pieces of software to be developed by the same company, or to the same standards. In the process of creating the facial map, there are therefore significant ‘air gaps’ between the components and the standards they were developed to. These gaps create security vulnerabilities; security vulnerabilities attract attackers, from ethical hackers to malicious actors (including hostile government security services); and sensitive personal data has value, both monetary value and value for other nefarious activities.
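The chain above can be pictured as a hand-off between four suppliers’ components. The sketch below is hypothetical Python (every function name and body is an illustrative stand-in, not any vendor’s real code); the comments mark the hand-offs, the ‘air gaps’, where the biometric data sits unprotected in memory between components.

```python
import hashlib
import secrets

# Every function below is a hypothetical stand-in for a different
# supplier's component; none is real vendor code.

def camera_capture() -> bytes:
    # Supplier A: camera hardware/firmware. Placeholder bytes for a photo.
    return secrets.token_bytes(1024)

def build_face_map(image: bytes) -> bytes:
    # Supplier B: face-mapping SDK (landmark extraction stand-in).
    return hashlib.sha256(b"landmarks:" + image).digest()

def derive_template(face_map: bytes) -> bytes:
    # Supplier C: algorithm turning the face map into a matchable template.
    return hashlib.sha256(b"template:" + face_map).digest()

def encrypt_template(template: bytes, key: bytes) -> bytes:
    # Supplier D: encryption layer. Toy XOR cipher, NOT real cryptography.
    return bytes(t ^ k for t, k in zip(template, key))

def enrol_student(key: bytes) -> bytes:
    image = camera_capture()                # air gap 1: raw photo in memory
    face_map = build_face_map(image)        # air gap 2: plaintext facial geometry
    template = derive_template(face_map)    # air gap 3: plaintext template
    return encrypt_template(template, key)  # protected only at this final step

stored = enrol_student(secrets.token_bytes(32))
```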


With this in mind, any company offering facial recognition as a feature of its product or service can only provide the necessary evidence to a data controller, such as a school, if it can demonstrate the whole process from the capture of the image, the integrity of the software code, and the risks posed by each of the ‘air gaps’ between the different components of the technology. Given the supply chain and the number of organisations involved, it is highly unlikely that any company, other than those focussed on national security, would be able to provide the evidence necessary to evaluate the security risks associated with facial recognition systems.


We’ve got 1000 students to feed and only 25 minutes to feed them, so we’re installing facial recognition.

That’s not really advisable! In recent UK news, a number of schools chose to implement facial recognition for this very reason, yet it appears they didn’t consider a number of fundamental requirements of privacy law. There are many components to this, so we’ll focus on just a few. The first is the lawful basis. In the example of these schools, the lawful basis chosen was consent, and because facial recognition involves processing biometric data, the schools required explicit consent. The problem here is the imbalance of power between the school and the data subjects (both parents and students). Schools make decisions on everything from uniforms to haircuts, with penalties for not following school policy. Given this, there is a very valid argument that many data subjects would feel pressured into giving consent in order to be fed. The schools in this example have, in effect, announced to the world: ‘we need your face, otherwise you won’t get fed within the allotted time’. That cannot really be classed as freely given consent.

Second is the evaluation of the potential impact of the processing on the rights and freedoms of the data subject. When choosing new technology, many countries require the impact of processing to be evaluated through a Data Protection Impact Assessment (DPIA) or Privacy Impact Assessment (PIA). Many schools use 9ine’s App technology to ensure they cover off all the necessary considerations. The DPIA / PIA requires questions to be answered on the benefits of the processing for the data subjects and the organisation, the risks to information rights, the security risks and various other considerations, enabling the evaluator to determine objectively whether the processing is appropriate. A critical question in this cashless catering example is whether the processing can take place in a way that is less intrusive for the data subjects: for instance, adding another till, extending the lunch service by 5 minutes (20% more serving capacity) or employing more staff. The expectation is that an organisation can evidence this thought process if there is a complaint about the processing risks; a standard, risk-based approach in the form of software that lets you document your processes is one way of managing this.
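As a rough illustration of that ‘less intrusive alternative’ test, the toy calculation below uses this article’s numbers (1000 students, a 25-minute service) plus two assumed figures, the number of tills and the serving rate per till, which are hypothetical. It shows that modest operational changes recover the same time without processing any biometric data.

```python
# Hypothetical throughput check for the cashless-catering example above.
# TILLS and RATE_PER_TILL are assumed figures for illustration only.
STUDENTS = 1000
TILLS = 4           # assumed serving points
RATE_PER_TILL = 10  # assumed students served per till per minute

def minutes_needed(students: int, tills: int, rate: int) -> float:
    """Minutes required to serve everyone at a given capacity."""
    return students / (tills * rate)

print(minutes_needed(STUDENTS, TILLS, RATE_PER_TILL))      # baseline: 25.0
print(minutes_needed(STUDENTS, TILLS + 1, RATE_PER_TILL))  # one extra till: 20.0

# Alternatively, extending service from 25 to 30 minutes adds
# 5 / 25 = 20% capacity, the figure quoted above, with no biometrics at all.
```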

Facial recognition, whilst it has come a long way, has a track record of discriminating against certain skin tones and facial features: it may fail to recognise some people, or be more easily tricked. Environmental characteristics, such as lighting and the positioning of the camera, need to be considered to limit the risk of these issues occurring. Software updates create further risk: given the nature of software, updates will cause glitches, and a future update could introduce new discriminatory behaviour. For a school thinking of using facial recognition, these limits on the potential of the technology need to be at the forefront of any decision on its actual use.

When you combine the technology risks associated with facial recognition with the problems of consent as the lawful basis and the balancing tests evidenced through the DPIA, it is very difficult to see how you could conclude that facial recognition is appropriate.


So how are schools coming to the conclusion that facial recognition is appropriate?

There are few absolute right or wrong answers when it comes to privacy law. There is, however, an expectation that a process is followed to determine the risks created by the processing of personal data. To follow a process, people need to be trained, and the organisation needs the expertise, capability and capacity to know what it should and shouldn’t be doing. Whether it is schools suffering cyber attacks, mishandling information rights requests, or undertaking processing that shouldn’t be happening, the common thread is a lack of training or expertise. Privacy technology, such as that created and delivered by 9ine, and training programmes, such as those created by Educare, go a long way towards building awareness, demonstrating accountability and managing privacy compliance cost-effectively. The silver bullet for tackling these issues, though, is board awareness and support. If a school’s board or governing body does not have someone accountable for data privacy and protection, the school is unlikely to provide the resources and support needed to overcome challenges like facial recognition.


Learn more about the implications of using biometric data in schools by attending 9ine’s Education Privacy Webinar on 11/11/21.

