House of Commons committee recommends national pause on the use of facial recognition technology

31 October 2022

Is your use of facial recognition technology in line with the evolving legal landscape? While official regulation of these technologies is scarce, new guidance and recent developments indicate where the law is likely heading – and you will want to be on trend. It is no secret that the use of facial recognition technologies is under-regulated, and there is uncertainty about the extent to which these technologies can be utilized without infringing on the privacy rights of individuals. However, recent developments and recommendations to Parliament by the House of Commons Standing Committee on Access to Information, Privacy and Ethics ("the Committee") provide some guidance.

In October 2022, the Committee issued 19 recommendations to Parliament in a report (the "Report") addressing facial recognition technologies ("FRT"). The recommendations are an attempt to persuade legislators that the law should evolve to become equipped to handle the unique challenges that come with the widespread use of FRT. The Committee suggested that the government impose a national pause on the use of FRT until there is an appropriate legal framework in place.[1]

The Report recommends that the legal framework for FRT be bolstered through amendments to the Privacy Act and the Canadian Human Rights Act.[2] Additional recommendations include the implementation of an opt-in-only requirement for collection of biometric information by the private sector, and strengthening the ability of the Privacy Commissioner under the Personal Information Protection and Electronic Documents Act ("PIPEDA") to levy meaningful penalties on governments and private entities whose use of FRT violates Canadian privacy law.

In late 2021, FRT were the subject of a high-profile joint investigation by the Office of the Privacy Commissioner of Canada ("OPC") and its provincial counterparts in Alberta, British Columbia and Quebec. The case involved the investigation of an FRT software company and the RCMP. The latter utilized the company's data to conduct hundreds of biometric searches across Canada.[3] The investigation resulted in findings of violations of PIPEDA and the Privacy Act. More than three billion images of faces had been gathered from the internet without users' consent.[4] The investigation concluded that this amounted to mass surveillance and found violations of Section 4 of Canada's Privacy Act.

It is startling to think how prevalent FRT are and how little guidance exists to regulate their use. The average individual interacts with FRT multiple times a day to unlock their smartphones, access banking apps, and apply social media filters. Beyond these more mundane uses, FRT has far-reaching impacts and implications due to the ubiquity of cameras. FRT has also been extremely useful in the delivery of critical services. Transportation companies use FRT to reduce traffic congestion, the healthcare field utilizes the technology to diagnose and monitor patients, and government border services use it to ensure travel security. The list keeps growing. However, the value of these technologies is greatly diminished, and potentially counter-productive, if organizations cannot govern their use under comprehensible and transparent mandates. So how can companies rely on these technologies to deliver key services within the parameters of acceptable use?

First, use the decisions from the OPC and the Committee's Report as guidance for your policies and to determine what is likely acceptable. How are you using the data, and is it for the purpose for which consent was obtained? It is likely prudent to have an opt-in policy, especially for sensitive biometric data. For further guidance, industries that intend to rely on FRT can also look to examples in the United States.

FRT has been more widely employed and legislated in the United States than in Canada. In Illinois, for example, the law requires healthcare collectors of biometric identifiers, such as facial features, to provide notice, obtain consent and maintain a retention schedule.[5] It would not hurt to consider similar policies, especially for highly sensitive biometric data.

Finally, keep an eye out for new developments and constantly consider whether your use of FRT is on trend with the evolution of the law. It does not hurt to get ahead of the trend – but it may cost to lag.


NOT LEGAL ADVICE. Information made available on this website in any form is for information purposes only. It is not, and should not be taken as, legal advice. You should not rely on, or take or fail to take any action based upon this information. Never disregard professional legal advice or delay in seeking legal advice because of something you have read on this website. Gowling WLG professionals will be pleased to discuss resolutions to specific legal concerns you may have.
