We have identified four general principles that any organisation wanting to use facial recognition technology should consider in light of this judgment (the context and grounds of appeal being to some extent specific to the use of facial recognition technology by the police):
- Any use of facial recognition technology must undergo a Data Protection Impact Assessment which should assume that the right to privacy under the European Convention on Human Rights is engaged and likely to be infringed by the use of the technology.
- The benefits gained by use of facial recognition technology can outweigh an individual's right to privacy. This will clearly be context-specific but the use by the police in this instance was proportionate. This balancing exercise must be clearly documented (in a Data Protection Impact Assessment) by an organisation to show that the effect on individuals' right to privacy under the Human Rights Act has been considered and risks mitigated where possible.
- Facial recognition technology clearly interferes with an individual's right to privacy, so any use must be "in accordance with the law". Whilst overall the use of the technology was proportionate, giving individuals (in this case, police officers) discretion as to when it is used and whom the technology identifies is unlikely to be "in accordance with the law" unless there is very clear guidance.
- Consider whether the facial recognition technology will result in any discrimination or bias. In this case, specific public sector legislation dealing with equality was engaged. Whilst this will not be the case for private sector organisations, the courts are clearly alive to the potential for discrimination and expect users of these systems to satisfy themselves that there will be no bias.
The case concerned the use of a facial recognition system, called AFR Locate, by the South Wales Police ("SWP") to capture faces and match them to a "watchlist". The watchlist contains details of persons of interest to the police and vulnerable persons in need of protection.
AFR Locate uses a live camera feed to capture digital images of members of the public in real time. Software isolates individual faces and extracts unique features to create a biometric template. AFR Locate compares that template against a digital database of images of those on the watchlist and generates a "similarity score" (a numerical value indicating how closely two faces match). If there is no match, the software automatically deletes the biometric template. If there is a match, an officer reviews the images to verify the match and determine whether to make an intervention.
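The matching stage described above can be sketched in a few lines of code. This is a minimal illustration only: the similarity function (cosine similarity), the threshold value, and all names are assumptions for the sake of the example, not details of the actual AFR Locate system.

```python
import math

SIMILARITY_THRESHOLD = 0.6  # illustrative value; real systems tune this carefully


def cosine_similarity(a, b):
    """Similarity score between two biometric templates (feature vectors)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


def match_against_watchlist(template, watchlist):
    """Return (person_id, score) for the best watchlist match above the
    threshold, or None. A None result means the biometric template should
    be deleted immediately, mirroring the automatic deletion of non-matches
    described in the judgment."""
    best_id, best_score = None, -1.0
    for person_id, reference in watchlist.items():
        score = cosine_similarity(template, reference)
        if score > best_score:
            best_id, best_score = person_id, score
    if best_score >= SIMILARITY_THRESHOLD:
        return best_id, best_score  # flagged for a human officer to verify
    return None  # no match: template is discarded, not retained
```

Note that the final decision on a match is left to a human reviewer; the automated step only flags candidates, which reflects the human-verification step the court described.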
The Appellant, Edward Bridges, brought a claim for judicial review, supported by Liberty. He was present at Queen Street, Cardiff on 21 December 2017 and at the Defence Procurement, Research, Technology and Exportability Exhibition at the Motorpoint Arena on 27 March 2018, when SWP had deployed AFR Locate. He was not included on any watchlist in either deployment, so his image was automatically deleted.
The High Court dismissed the Appellant's claim for judicial review but the Court of Appeal took a more nuanced approach in the appeal.
Grounds for appeal
- Whether the interference with the Appellant's rights under Article 8(1) European Convention on Human Rights ("ECHR") by SWP's deployment of AFR Locate was/is in accordance with the law for the purposes of Article 8(2) ECHR;
- Whether SWP's use of AFR constituted a proportionate interference and whether the cumulative impact on all those whose facial biometrics were captured as part of the deployments was considered;
- Whether SWP's Data Protection Impact Assessment ("DPIA") was adequate under s64 Data Protection Act 2018 ("DPA 2018");
- Whether SWP had in place an "appropriate policy document" under s42 DPA 2018, which is a condition precedent for lawful and fair processing of personal data that constitutes "sensitive processing" under s35(8) DPA 2018; and
- Whether SWP's Equality Impact Assessment complied with its Public Sector Equality Duty under s149 Equality Act 2010.
The Article 8 ECHR claim
The Divisional Court found that the interference with Article 8(1) rights was justified for the maintenance of public order and the prevention and detection of crime. With reference to (1) the provisions of the DPA 2018, (2) the Surveillance Camera Code of Practice and (3) SWP's own policy documents, it held that there was a clear and sufficient legal framework governing whether, when, and how AFR Locate may be used. This provided a level of certainty and foreseeability, so such interference was "in accordance with the law" (Article 8(2) ECHR).
The Court of Appeal found fundamental deficiencies in the legal framework that did not 'afford adequate legal protection against arbitrariness and accordingly indicate with sufficient clarity the scope of discretion conferred on the competent authorities and the manner of its exercise'. The police officers were afforded too much discretion to decide who was placed on the watchlist and where AFR Locate was deployed so the legal framework did not have the necessary quality of law to satisfy the tenets of Article 8(2).
The Divisional Court considered whether the interference with Article 8(1) rights was proportionate. On the facts, AFR Locate was deployed in an "open and transparent way, with significant public engagement and was used for a limited time covering a limited area for a specific and limited purpose with some success. Neither occasion led to a disproportionate interference of Article 8(1) rights and any interference would have been limited to near instantaneous algorithmic processing and discarding of the Claimant's biometric data". The court concluded that a fair balance existed between the rights of the individual and the interests of the community.
The Court of Appeal found that the use of AFR Locate was not lawful, so it was not necessary to consider this ground of appeal. Nevertheless, the Court considered whether the balancing test should take into account not only the actual results of an operation but also its anticipated benefits, and the impact on all individuals whose biometric data was processed by the technology on the relevant occasions. It held that the impact on the wider public, which the Appellant had not detailed, was negligible and could not become weightier simply because others were affected.
The data protection claims
Although none of the deployments took place after the commencement of the DPA 2018, the Divisional Court considered the legality of the deployments as if they had taken place after 25 May 2018.
The Divisional Court concluded that the processing of the biometric data of the public captured by the CCTV entailed "sensitive processing" under s35(8) DPA 2018 as it was "for the purpose of uniquely identifying an individual". Therefore, SWP had to meet the requirements of s35(5) DPA 2018.
The appeal was limited to whether SWP's Data Protection Impact Assessment ("DPIA") complied with s64 DPA 2018. The Appellant and the Information Commissioner argued that it did not contain an assessment of the risks to privacy and personal data and the relevant safeguards, failed to acknowledge that AFR Locate collected data on a blanket and indiscriminate basis, and overlooked the risk of longer retention periods arising from false positive results. The DPIA also failed to assess the risks and the mitigation required, or to address the potential for discrimination.
The Court of Appeal held that some of these criticisms were unjustified. The DPIA acknowledged that AFR Locate did interfere with Article 8(1) ECHR rights but the Divisional Court had erred in its conclusion that the interference was in accordance with the law. Therefore, SWP had failed to properly "assess the rights and freedoms of data subjects or to address the measures envisaged to address the risks arising from the deficiencies as required by s64(3)(b) and (c) DPA 2018".
In its assessment of whether the condition of s35(5) DPA 2018 had been satisfied, the Divisional Court had only to consider whether SWP had an appropriate policy document in place for the purposes of s35(5)(c) DPA 2018, having decided the first two conditions earlier in its judgment. The Court held that it was open to question whether the document met the standard required by s42(2) DPA 2018 and said that ideally it should be more detailed but made no final judgment on this point, as it was for the Information Commissioner to provide specific guidance as to its content.
The Information Commissioner expressed the view that the policy document was sufficient, albeit barely, and SWP had revised the policy in light of guidance published on 4 November 2019. The Court of Appeal rejected this ground on that basis.
The Public-Sector Equality Duty claim
The Divisional Court held that SWP had not failed in its duty to have due regard to the need to eliminate discrimination under s149 of the Equality Act 2010. There was no suggestion, when the AFR Locate trial commenced, that SWP either recognised or ought to have recognised that the software it had licensed might operate in a way that was indirectly discriminatory.
The Court of Appeal disagreed. The duty requires the taking of reasonable steps by the public authority to make enquiries about the potential impact of a proposed decision or policy on people with relevant protected characteristics. Although no evidence was submitted that the software used was discriminatory, SWP had failed to satisfy itself, either directly or by way of independent verification, that the software did not have an unacceptable bias on grounds of race or sex.
The Court went on to say that it hoped all police forces considering using such technology would wish to "satisfy themselves that everything reasonable which could be done had been done in order to make sure that this or similar software does not have a racial or gender bias".
R (Bridges) v Chief Constable of the South Wales Police [2019] EWHC 2341 (Admin)
 Article 8 (1) "Everyone has the right to respect for his private and family life, his home and his correspondence."
 Article 8 (2) "There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic wellbeing of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others."
Rice v Connolly [1966] 2 All ER 649