Caitlin Schropp, Associate
Recognizing the many sources of uncertainty in the Artificial Intelligence and Data Act ("AIDA"), Innovation, Science and Economic Development Canada recently offered important insight into the Government's intended approach to artificial intelligence (AI) regulation. Much of the uncertainty to date stems from the fact that the specific AI systems subject to the AIDA, along with the measures the legislation will impose, are to be defined through regulation at a later date.
In light of those unknowns, a companion document published on March 13, 2023 aims to "reassure actors in the AI ecosystem in Canada that the aim of the AIDA is not to entrap good faith actors or to chill innovation, but to regulate the most powerful uses of this technology that pose the risk of harm."
Building on this pledge, the document makes clear that "the Government intends to take an agile approach that will not stifle responsible innovation or needlessly single out AI developers, researchers, investors or entrepreneurs." It offers several assurances in support of these objectives:
The companion document also provides further guidance on the systems intended to be the primary target of regulation, and sample measures that may be imposed on various entities involved in the development of AI systems.
Below, we review the companion document's key points of clarification in detail.
The AIDA is intended to serve as gap-filling legislation, ensuring that AI-specific risks do not fall through the cracks of existing consumer protection and human rights legislation. A primary purpose of the legislation is to protect Canadians – particularly vulnerable groups such as children and historically marginalized communities – from collective harms by mitigating the risk of systemic bias in AI systems.
The Government describes the AIDA as a first-step framework for a new regulatory system, and has expressed an intention to build upon this framework through an open and transparent regulatory development process in consultation with stakeholders. Implementation of the initial set of AIDA regulations is expected to take the following path, after Bill C-27 receives Royal Assent:
The Government identified the following as examples of systems that are of interest in terms of their potential impacts:
Further, the Government identified the following as being among the key factors that persons responsible for AI systems must assess in determining whether an AI system is high-impact:
Each regulated activity laid out in the AIDA would be associated with distinct obligations, tailored to the context and risks of that activity within the lifecycle of a high-impact AI system.
The specific measures required by regulation would be developed through extensive consultation and would be based on international standards and best practices. The guiding principles to be used in prescribing such measures are:
Prescribed monitoring obligations would be proportionate to the level of influence that an actor has on the risk associated with the system. For example, as end users of general-purpose systems have limited influence over how such systems function, the developers of those systems would be responsible for ensuring that risks related to bias or harmful content are documented and addressed.
Similarly, businesses involved only in the design or development of a high-impact AI system, with no practical ability to monitor it after development, would have different obligations from those managing its operations. Individual employees would not be expected to bear obligations associated with the business as a whole.
In the initial years after it comes into force, the focus of AIDA enforcement would be on educating different stakeholders, establishing guidelines and helping businesses to come into compliance through voluntary means. The Government intends to allow ample time for the ecosystem to adjust to the new framework before enforcement actions are undertaken.
Further, smaller firms would not be expected to have governance structures, policies, and procedures comparable to those of larger firms with a greater number of employees and a wider range of activities. Small- and medium-sized businesses would also receive particular assistance in adopting the practices needed to meet the requirements.
The Government would also mobilize external expertise in the private sector, academia and civil society to ensure that enforcement activities are conducted appropriately in the context of a rapidly developing environment.
AMPs would be designed in a manner proportionate to the objective of encouraging compliance. For example, AMPs could be applied in cases of clear violation where other attempts to encourage compliance had failed. AMPs would also be tailored to the relative size of the firm involved.
The Artificial Intelligence and Data Act (AIDA) is just one of three pieces of proposed legislation in Bill C-27. Tabled by the Government of Canada on June 16, 2022, Bill C-27 would also introduce the Consumer Privacy Protection Act (CPPA) and the Personal Information and Data Protection Tribunal Act (PIDPTA).
The AIDA would be the first piece of legislation in Canada to regulate AI systems in the private sector. If passed, it would impose regulatory requirements for both AI systems generally and those AI systems specifically referred to as "high-impact systems."
From a policy perspective, the Government of Canada has positioned the AIDA as a regulatory tool to protect Canadians, ensure the development of responsible AI in Canada and position Canadian firms and values prominently in global AI development. To achieve these objectives, the Government has sought to align the AIDA with existing Canadian legal frameworks, as well as with legislation and norms from other jurisdictions.
The specific AI systems subject to the AIDA, along with the required measures it will impose, will be defined by regulation.
For example, the key term "high-impact systems" is not defined in the AIDA itself; rather, it will be defined through criteria set out in regulations. Further, the measures that persons responsible for high-impact systems will be required to implement will also be set out in regulations. An Administrative Monetary Penalty (AMP) scheme may likewise be established by regulation.
Defining the AIDA's requirements via regulation will allow the Government to respond to industry developments and to update the specific systems regulated and the measures to be implemented without legislative amendment. While this approach is efficient and enables swift policy adjustments, it means there is little guidance within the text of the AIDA itself to help industry prepare to comply with the new regulatory system.
The clarifications provided in the companion document, as described above, are therefore key to understanding and anticipating upcoming AI regulation in Canada.
The AIDA companion document provides early-stage insight for those who design, develop, offer or manage AI systems, helping them better understand the requirements of the AIDA. The draft AIDA and its requirements will continue to evolve as Bill C-27 moves through the legislative process and as the industry consultations to which the Government has committed take place.
For more information on the AIDA, we invite you to review our deep dive into the Act's provisions, as well as our one-page high-level summary. In the meantime, if you would like to discuss this topic further, please contact the authors or a member of Gowling WLG's Cyber Security and Data Protection Group.