The AIDA would be the first piece of legislation in Canada to regulate the development and deployment of artificial intelligence (AI) systems in the private sector.
The purpose of the AIDA is twofold:
- To regulate international and interprovincial trade and commerce in AI systems by establishing common requirements applicable across Canada for the design, development and use of those systems; and
- To prohibit certain conduct in relation to AI systems that may result in serious harm to individuals or harm to their interests.
Who does it impact?
Generally, the AIDA would apply to persons carrying out a "regulated activity." A "person" is defined to include a trust, a joint venture, a partnership, an unincorporated association and any other legal entity. A regulated activity includes, in the course of international or interprovincial trade and commerce:
- Processing or making available for use any data relating to human activities for the purpose of designing, developing or using an AI system; or
- Designing, developing or making available for use an AI system or managing its operations.
Specific compliance requirements under the AIDA apply to five categories of persons. Persons may have responsibilities under more than one of these categories:
- Persons who carry on regulated activities
- Persons responsible for AI systems
- Persons responsible for high-impact AI systems
- Persons who make available for use a high-impact system
- Persons who manage the operation of a high-impact system
The AIDA, as drafted, does not define "high-impact system." Although this term and many other key concepts remain to be defined in forthcoming regulations, the Government of Canada, in a companion document, identified the following as examples of systems that are of interest in terms of their potential impacts:
- Screening systems impacting access to services or employment
- Biometric systems used for identification and inference
- Systems that can influence human behaviour at scale, such as AI-powered online content recommendation systems
- Systems critical to health and safety, such as autonomous driving systems and systems making triage decisions in the health sector
What is the impact?
The AIDA sets out different requirements for a variety of persons responsible for AI systems at various stages of their lifecycle:
- Anonymized data
Regulated activities under the AIDA include processing or making available for use data relating to human activities; designing, developing or making available for use an AI system; or managing an AI system's operations.
Persons who carry out these regulated activities would be required to establish measures with respect to the manner in which data is anonymized, as well as the use and management of anonymized data. They would also be required to keep records with respect to the measures adopted.
- Harm and bias from high-impact AI systems
Persons responsible for designing, developing or making available for use an AI system would be required to conduct an impact assessment to determine whether that system is high impact.
Where a system is determined to be high impact, persons responsible for the high-impact AI system would be required to establish (and monitor compliance with) measures to identify, assess and mitigate risks of harm and bias.
- Publication of descriptions
Persons who make available for use a high-impact system would be required to publish on a publicly available website a plain-language description of that system. These descriptions must include explanations of how the system is intended to be used; the types of content that it is intended to generate; the decisions, recommendations or predictions that it is intended to make; and the measures established to mitigate the risks of harm or biased output that could result from the use of the high-impact system.
Persons who manage the operation of a high-impact system have a similar obligation, though the description need only include an explanation of the actual use of the system; the types of content it generates; and the decisions, recommendations or predictions the system makes.
Contraventions of the AIDA may generally result in administrative monetary penalties (AMPs) of up to the greater of three per cent of gross global revenue and $10 million.
Commission of offences under sections 38 or 39 of the AIDA may result in fines of up to the greater of five per cent of gross global revenue and $25 million, or imprisonment of individuals.