
Bill C-27: Major Reform to Canadian Privacy Law and New Law on Artificial Intelligence



The federal government has just tabled a bill that stands to significantly change privacy and data law in Canada. Bill C-27, the Digital Charter Implementation Act, 2022, would do three things if passed:

  1. Replace PIPEDA, Canada’s existing federal private-sector privacy legislation, with the Consumer Privacy Protection Act (CPPA);

  2. Enact the Artificial Intelligence and Data Act; and

  3. Enact the Personal Information and Data Protection Tribunal Act to create an administrative tribunal to review decisions of the federal Privacy Commissioner.

Bill C-27 reflects the government’s second recent attempt to revamp federal privacy laws. In November 2020, the government introduced Bill C-11, which would have replaced PIPEDA with the CPPA but did not contain separate legislation on artificial intelligence. Bill C-11 was not enacted prior to last year’s federal election.

Among the things the CPPA would do, if enacted, are the following:

  • define several key terms that are currently undefined in PIPEDA, such as “anonymize” and “de-identify”;

  • clarify what it means for personal information to be under the “control” of an organization – another point that has caused confusion;

  • require an organization to collect, use, and disclose personal information only in a manner and for purposes that a reasonable person would consider appropriate in the circumstances, whether or not consent is required under the Act; in other words, even where an individual provides consent to a proposed collection, use, or disclosure of their information, the organization will still have to demonstrate that the collection, use, or disclosure is reasonable. While PIPEDA has long imposed this “reasonableness” test, the additional language providing that the reasonableness test applies whether or not consent is required is new;

  • set forth specific criteria for what valid consent involves, whereas PIPEDA speaks only generally about the need for valid consent without offering such criteria;

  • require consent to be in plain language, drawing on guidance from the Office of the Privacy Commissioner of Canada that emphasizes this point;

  • prohibit organizations from obtaining consent through false or misleading information or practices – language that dovetails with the Competition Bureau’s recent moves toward scrutinizing organizations’ representations to consumers about their privacy and security practices;

  • provide that express consent is not required to use personal information for certain business operations, including (1) an organization’s business activities (which are defined in the statute), (2) activities in which the organization has a legitimate interest, (3) the transfer of personal information by the organization to a service provider, and (4) the de-identification of the information. Items (3) and (4) are longstanding points of confusion under PIPEDA, and the CPPA makes clear that they are situations that do not require express consent from the individual;

  • require organizations to dispose of an individual’s personal information in certain situations;

  • impose transparency requirements with respect to “automated decision systems,” which would encompass AI algorithms, machine learning models, and the like. This transparency would extend to giving individuals the right to an explanation of how a decision or recommendation was made by an automated decision system, where the decision could have a significant impact on the individual (a sketch of what such an explanation record might look like follows this list);

  • increase the monetary penalties that can be imposed on organizations that violate the statute’s requirements and enhance the Privacy Commissioner’s powers to make orders; and

  • establish a Personal Information and Data Protection Tribunal, an administrative body that could hear appeals of orders from the Privacy Commissioner.
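
Neither the bill nor current OPC guidance prescribes a format for such an explanation. As a purely hypothetical sketch, an organization might retain a plain-language record of the principal factors behind each automated decision; every name and field below is an illustrative assumption, not anything drawn from the bill:

```python
# Hypothetical sketch only: the CPPA does not prescribe any format for an
# explanation of an automated decision. All names and fields below are
# illustrative assumptions. Requires Python 3.9+.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DecisionExplanation:
    """Record an organization might keep so it can explain how an
    automated decision system reached a decision about an individual."""
    decision_id: str
    made_at: datetime
    system_name: str                  # which automated decision system ran
    outcome: str                      # the decision or recommendation produced
    principal_factors: list[str] = field(default_factory=list)
    plain_language_summary: str = ""  # explanation suitable for the individual

# Example: a declined application, with the main factors recorded so a
# plain-language explanation can later be given to the individual.
explanation = DecisionExplanation(
    decision_id="APP-0417",
    made_at=datetime.now(timezone.utc),
    system_name="credit_prescreen_v2",
    outcome="application declined",
    principal_factors=["income below program threshold", "short credit history"],
    plain_language_summary=(
        "The application was declined mainly because the reported income was "
        "below the program threshold and the credit history was short."
    ),
)
print(explanation.plain_language_summary)
```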

If enacted, the Artificial Intelligence and Data Act (AIDA) would require persons responsible for “high-impact” AI systems to do the following (a sketch of how these obligations might be tracked appears after the list):

  • establish measures to manage anonymized data (s. 6);

  • conduct an impact assessment of the AI system (s. 7);

  • develop a risk mitigation plan (s. 8);

  • monitor the mitigation of the risks (s. 9);

  • keep general records about the AI system (s. 10);

  • publish a description of the AI system (s. 11); and

  • notify users in the event of “material harm” (s. 12).
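
As a hypothetical illustration of how these obligations stack up, the checklist below maps each section to a compliance status. AIDA leaves the actual form of these measures to future regulations, so the structure and names here are assumptions only:

```python
# Hypothetical sketch only: AIDA does not prescribe a record format, and the
# details of these obligations are left to future regulations. This checklist
# structure is an assumption, for illustration. Requires Python 3.9+.

# The seven obligations AIDA would impose on persons responsible for
# high-impact AI systems, keyed by section number.
AIDA_OBLIGATIONS = {
    6: "establish measures to manage anonymized data",
    7: "conduct an impact assessment of the AI system",
    8: "develop a risk mitigation plan",
    9: "monitor the mitigation of the risks",
    10: "keep general records about the AI system",
    11: "publish a description of the AI system",
    12: "notify users in the event of material harm",
}

def outstanding_obligations(completed: set[int]) -> list[str]:
    """List the AIDA obligations (by section) not yet addressed."""
    return [
        f"s. {section}: {text}"
        for section, text in AIDA_OBLIGATIONS.items()
        if section not in completed
    ]

# Example: a system whose impact assessment (s. 7) and mitigation plan (s. 8)
# are done, with everything else still outstanding.
for item in outstanding_obligations({7, 8}):
    print(item)
```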

In addition, the AIDA would authorize the Minister of Industry to designate an AI and Data Commissioner (s. 33), to request information and records about an AI system, to require audits, and to stop the operation of an AI system should the Minister believe that it poses a “serious risk of imminent harm”.

The AIDA would contain the following enforcement provisions (a worked example of the fine ceilings follows the list):

  • Under s. 30, non-compliance with the legal requirements could lead to a fine of no more than the greater of $10 million and 3% of the person’s gross global revenues in the financial year; and

  • Under ss. 38 and 39, the possession or use of data obtained through an offence under federal or provincial law, or the operation of an AI system knowing that it is likely to cause physical or psychological harm or property damage, could lead to a fine of no more than the greater of $25 million and 5% of the gross global revenues in the financial year, as well as up to five years (less a day) of imprisonment.
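
Reading those caps as “the greater of” the fixed amount and the revenue percentage, the ceiling scales with the offender’s size. A minimal sketch of the arithmetic (the revenue figure is invented purely for illustration):

```python
# Worked example of the AIDA fine ceilings, assuming the "greater of" reading
# of the fixed amount and the revenue percentage described above. The $1B
# revenue figure is invented purely for illustration.
def max_fine(gross_global_revenue: float, fixed: float, pct: float) -> float:
    """Ceiling on the fine: the greater of the fixed amount and pct of revenue."""
    return max(fixed, pct * gross_global_revenue)

revenue = 1_000_000_000  # hypothetical $1B in gross global revenues

# s. 30 non-compliance: greater of $10M and 3% of revenue (3% of $1B = $30M).
print(max_fine(revenue, 10_000_000, 0.03))  # 30000000.0

# ss. 38-39 offences: greater of $25M and 5% of revenue (5% of $1B = $50M).
print(max_fine(revenue, 25_000_000, 0.05))  # 50000000.0
```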

AIDA’s approach, which centres on “high-impact” AI systems, seems to reflect Canada’s effort to bridge the European Union’s proposed AI Act, which takes a risk-based approach, and the U.S. proposals, which are built around impact assessments. There is, however, an overarching convergence among these three proposals: assess the risks and/or impacts, mitigate potential harms, document the process for potential audits, and continue to monitor the systems operating in the markets.

A noteworthy element of the AIDA is the empowerment of the Minister of Industry (ISED), combined with hefty fines and imprisonment as possible penalties under this law. The buck stops with the Minister, who would have the power to collect information about AI systems and even stop an AI system from operating if deemed necessary. Further, the Minister’s capacity to share “confidential business information” to comply with court orders or warrants (s. 24), or to share such information with specified government officials (e.g., the Privacy Commissioner or the Canadian Human Rights Commission) (s. 26), might raise concerns in the private sector.

Finally, much of the substance of the law remains undefined. The law does not define what a “high-impact system” is or what is meant by “material harm,” which triggers the notification requirement. These terms would have to be defined by regulation, should the law pass.

For more information, contact Ira Parghi (iparghi@inq.law) or Carole Piovesan (cpiovesan@inq.law).




