Jocelyn S Paulley
Partner
Co-leader of Retail & Leisure Sector (UK)
Co-leader of Data Protection and Cyber Security sector (UK)
Biometric data - like facial recognition and fingerprints - is being used more frequently, for example for access controls to buildings and identity verification. But with the UK GDPR and increasing regulatory scrutiny, businesses need to be careful about how they use it.
In this podcast, Co-Leaders of the UK Data Protection and Cyber Security practice Loretta Pugh and Jocelyn Paulley break down the legal risks of using biometric data and what businesses can do to stay compliant. They look at real-world examples, common issues, and useful tips on managing the legal risks.
Subscribe to 'Listen Up' on: Apple Podcasts | Spotify
BIOMETRICS PODCAST
START OF TRANSCRIPT
Speaker One: Loretta Pugh
Speaker Two: Jocelyn Paulley
Loretta Pugh: Hello, welcome to this podcast on the topic of biometrics. My name is Loretta Pugh and I co-lead the data protection and cyber security practice at Gowling WLG.
We are going to discuss the data protection implications of using so-called biometric data for identification purposes.
So, what do we mean by biometric data? Well biometric data is defined in the UK GDPR and it means personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a person which allow or confirm the unique identification of that person. Examples can include the use of facial recognition and fingerprints.
Why did we pick this topic? Well, we are seeing as legal advisors, an increase in the number of our clients interested in the use of biometric data for identification purposes. Use cases include for access controls to buildings or certain areas or rooms within a building. We have also seen the use to verify identity for financial services clients due diligence purposes and also for identity verification to streamline payments.
Coupled with increased interest in the use of biometric data, we have also seen increased regulatory activity, including regulatory guidance and enforcement.
There have been a number of recent cases that have highlighted some of the challenges that businesses face when seeking to use biometric data for identification purposes.
In this podcast we are going to discuss the challenges that can arise when processing biometric data for identification and the practical implications.
But first it is useful to give a high level description of how a biometric recognition system works.
For a biometric recognition system to be able to be used for identification purposes, the system needs to be able to compare the person in front of it with stored reference samples. These reference samples are known as biometric samples and they can include the image of someone's face in a photograph, the recording of someone's voice or a fingerprint.
Key information is extracted from the biometric sample to create a digital record of the sample which is known as a biometric template.
Biometric recognition systems work by comparing the relevant features of a person against one or more stored biometric templates and calculating the probability of the two matching. There will not be a 100% correlation between the two. Understanding the degree of potential error and the implications will be important in designing the use of any operation that involves biometric recognition.
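The matching step Loretta describes can be sketched in code. This is a hypothetical, much-simplified illustration (real systems use trained feature extractors and calibrated match scores); the feature vectors, the `cosine_similarity` helper and the 0.9 threshold are all assumptions for illustration:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Similarity between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(live_features: list[float], stored_template: list[float],
             threshold: float = 0.9) -> bool:
    """Probabilistic match: a score at or above the configured threshold
    counts as a match -- there is never a guaranteed 100% correlation."""
    return cosine_similarity(live_features, stored_template) >= threshold

# A live capture will never reproduce the stored template exactly:
template = [0.9, 0.1, 0.4]
live = [0.88, 0.12, 0.41]       # close to the template, but not identical
print(is_match(live, template))  # -> True (similarity well above 0.9)
```

The key design point the speakers raise sits in that single threshold parameter: the system never answers "same person", only "similar enough", so the chosen threshold determines the error profile.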
I am now going to hand over to Jocelyn Paulley, who is the other co-leader of the data protection and cyber security practice, to explain some of the key challenges when deploying a biometric recognition system.
Jocelyn Paulley: Thanks Loretta. So as Loretta said we are going to look at some of the key data protection compliance challenges when using biometric systems. We will periodically use as our example a facial recognition system for gaining access to a building.
So there is no better place to start than the concept under UK GDPR of data protection by design and that is because businesses must consider data protection issues right at the beginning of the design stage as well as throughout the lifetime of the system.
I really cannot stress enough the advantages of early consideration of the rules and obligations in data protection law. We have seen examples of businesses that consider these things just before their proposed solution is about to go live. In those cases it can be difficult to backtrack with the time and the cost involved to make the required changes and then the legal analysis and implementing those changes ultimately delays implementation of the system.
After data protection by design businesses have to consider how they will implement each of the data protection principles.
So firstly, a business has to identify a lawful basis for the processing of personal data. If that lawful basis is a legitimate interest then the business will need to consider whether it is necessary to use biometric data. Is there actually some alternative that could be used to provide the solution but in a less privacy intrusive way? It is very easy to be sold a technology solution as making a process easier for users or removing friction or enhancing auditability but that type of mere convenience use does not meet the data protection requirement for necessity. Each business would need to look at factors specific to its use case, its business, its sector and its particular challenges.
Separately, businesses will also need to identify an additional condition for the processing of biometric data, which is special category data.
In our experience, this is a particular challenge. That is because explicit consent is likely to be the most appropriate condition in most cases, but obviously with that comes ensuring that the explicit consent is valid. Part of that validity is being freely given and this is challenging in situations where there is an imbalance of power, for example an employer and employee relationship. That does not of course mean to say that biometric data for identification purposes can never be used in an employment context, but particular care would need to be taken when designing the use of the system and mitigations around it.
Secondly, personal data must be accurate. In the context of a biometric recognition system, accuracy is going to be a key consideration, both because of the inherent inexactness of the technology and because of the potential impact on the individual of an inaccurate result. As Loretta stated earlier, a biometric recognition system will not be 100% accurate. The output of a comparison undertaken by the biometric system is based on a percentage probability of a match. When procuring the system it is important to undertake due diligence to ascertain the accuracy of the system and determine how any biases have been mitigated. For example, early facial recognition systems trained primarily on the faces of white people had poor accuracy when used with people from ethnic minority backgrounds.
A business will not be able to completely remove errors in recognition, but it is important, as part of a compliance process, that the level of error is well understood and monitored. The potential frequency and impact of an error will determine the safeguards a business needs to implement. So if the system is configured so that the pass percentage threshold is high, that could lead to an increased likelihood of false negatives, meaning that a person is rejected even though there should be a match.
On the other hand if a system is configured so that the pass percentage threshold is lowered that may lead to an increased likelihood of false positives meaning that then a person is accepted even though they are not the actual person in reality. So businesses will need to assess that risk and work out what mitigations apply and Loretta will talk more about that process later.
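The trade-off Jocelyn describes can be shown with a toy sketch: the same set of match scores produces a different error profile depending on where the threshold is set. The scores and the `classify` helper are invented purely for illustration:

```python
def classify(scores_genuine: list[float], scores_impostor: list[float],
             threshold: float) -> tuple[int, int]:
    """Count errors at a given threshold: genuine users rejected (false
    negatives) and impostors accepted (false positives)."""
    false_negatives = sum(1 for s in scores_genuine if s < threshold)
    false_positives = sum(1 for s in scores_impostor if s >= threshold)
    return false_negatives, false_positives

# Illustrative similarity scores (0-1): genuine users score high, impostors low.
genuine = [0.97, 0.93, 0.88, 0.95, 0.91]
impostor = [0.40, 0.55, 0.86, 0.30]

print(classify(genuine, impostor, threshold=0.95))  # -> (3, 0): strict, more false negatives
print(classify(genuine, impostor, threshold=0.80))  # -> (0, 1): lenient, more false positives
```

Moving the threshold never removes error; it only shifts it between the two failure modes, which is why the DPIA process discussed later has to weigh the impact of each.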
Businesses should also be aware that accuracy can decrease over time. In a facial recognition system, for example, as people age their appearance drifts away from the samples originally collected a number of years earlier, so those samples would need to be refreshed.
Moving on to the transparency element of the fairness and transparency principle. This essentially requires providing people with information about the processing of their personal data, usually in the form of a privacy notice. In whatever form the information is provided, it should be easily accessible. That needs particular thought where a facial recognition system is used in the real world, rather than online or in a paper-based transaction, because the privacy notice still has to be given, for example through appropriate signage in areas where facial recognition is in use.
Fourthly, people have certain rights under data protection law and that will apply to the processing of their biometric data. So businesses deploying biometric systems have to be cognizant of those rights and ensure there are processes in place to recognise and respond to requests which could include asking for copies of information, asking to be forgotten or objections to the use of their data.
And then finally, security is clearly a key consideration. Businesses must have the appropriate technical and organisational security measures required by GDPR in place when processing biometric data.
Businesses should ask questions in relation to security when deciding on a provider of a biometric recognition system as with any other system that processes significant quantities of data.
Businesses should ensure that data is not stored for longer than they need it. For example, if a person is no longer likely to access the building then their sample and template should be removed from the system.
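As an illustration of that retention point, a periodic sweep might remove templates for people who have not used the system within a retention window. The 90-day window and the `purge_stale_templates` helper are purely hypothetical; the appropriate retention period is a matter for each business's own assessment:

```python
from datetime import datetime, timedelta

def purge_stale_templates(enrolments: dict[str, datetime],
                          now: datetime,
                          max_inactive: timedelta = timedelta(days=90)
                          ) -> dict[str, datetime]:
    """Retention sweep: keep only biometric enrolments whose last use falls
    within the retention window; everything else is dropped."""
    return {person: last_seen for person, last_seen in enrolments.items()
            if now - last_seen <= max_inactive}

now = datetime(2024, 6, 1)
enrolments = {
    "alice": datetime(2024, 5, 20),  # recent access -> retained
    "bob": datetime(2023, 11, 1),    # long inactive -> removed
}
print(purge_stale_templates(enrolments, now))  # only "alice" remains
```

In practice both the stored sample and the derived template would need to be deleted, across backups as well as the live system.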
I am now going to hand back to Loretta who is going to talk about data protection impact assessments.
Loretta: Thank you very much Joss.
Data protection impact assessments, or DPIAs for short, are required where the proposed processing is likely to result in a high risk to people's rights and freedoms.
Data protection law also sets out some specific scenarios where a DPIA is required. These include where there is the processing of special category personal data on a large scale or where there is the systematic monitoring of a publicly accessible area.
Now at least one of these is likely to be relevant in the context of a biometric recognition system and the DPIA must be completed before the biometric recognition system is implemented.
So, how should we approach doing a DPIA? Well, the purpose behind the DPIA is to identify and consider risks associated with the proposed processing operation, consider mitigations for the identified risks and assess whether there are any residual risks that still represent a high risk to the rights and freedoms of the relevant individuals.
The DPIA must include certain information. It must contain a description of the envisaged processing operations and the purposes of the processing, an assessment of the necessity and proportionality of the processing operations, an assessment of the risks to the rights and freedoms of the relevant individuals, the measures envisaged to address those risks, the security measures and mechanisms to ensure the protection of personal data and information to demonstrate compliance with the UK data protection law, of course taking into account the rights and legitimate interests of the people concerned.
In the context of a biometric recognition system there are a number of obvious risks to consider. These include risks associated with a false positive or false negative matches, risk of discrimination and risk from personal data breaches.
Taking the first of those: we talked earlier about the possibility of false positive or false negative matches. Understanding the accuracy of the proposed system will be crucial in determining this risk. Once deployed, the system should be monitored on an ongoing basis.
Other processes could be put in place to mitigate against people experiencing adverse effects of false matches. For example, for false negative matches, putting in place an alternative process for ID verification.
The risk of discrimination may arise from bias in the biometric recognition system and is linked to the false positive/false negative accuracy issue. Again, ongoing monitoring of the deployed system should be undertaken with a view to identifying whether bias is occurring.
The risk from personal data breaches increases where appropriate technical and organisational security measures have not been implemented and as mentioned earlier due diligence on the proposed system and the provider of the system should be undertaken.
The UK GDPR requires that, where appropriate, the views of the relevant individuals on the intended processing should be sought. Businesses should engage with those that may be impacted, so for example, in relation to a building in which facial recognition technology is to be deployed, the tenants or other users of the building could be consulted.
If a DPIA indicates that the use of the biometric recognition system would still present a high risk to individuals after any risk mitigations have been implemented, the Information Commissioner's Office must be consulted prior to the system being used. Essentially the ICO is being asked whether the processing can nevertheless go ahead.
That brings us to the end of what we wanted to cover today.
If you enjoyed this podcast be sure to check out our website at gowlingwlg.com for more useful insights and resources.
Thank you for listening, and we hope to speak to you again soon.
END OF TRANSCRIPT
THIS DOES NOT CONSTITUTE LEGAL ADVICE. The information presented on this website, in whatever form, is provided for informational purposes only. It does not constitute legal advice and should not be construed as such. No user should make, or refrain from making, decisions based solely on this information, nor disregard professional legal advice or delay consulting a professional on the basis of something read on this website. Gowling WLG professionals will be pleased to discuss possible options with users regarding specific legal matters.