It takes a little while after a new piece of legislation is introduced to digest its substance. In the weeks since the Artificial Intelligence and Data Act (AIDA) was tabled as part of Bill C-27’s suite of updates to privacy laws affecting the Canadian private sector, numerous articles have appeared discussing its nuts and bolts. Typically, these pieces describe the AIDA as a new regime for governing artificial intelligence (AI) systems, note its focus on the prevention of harm, and highlight the severe penalties it establishes for contravention. They survey the major provisions relating to the anonymization of data and the specific requirements for high-impact systems, and often pay as much attention to what’s missing as to what’s included.
Drafted by Innovation, Science and Economic Development Canada (ISED), the AIDA has a structure akin to a skeleton that will be fleshed out with regulations to follow. Its twin objectives of regulating commerce and preventing harm echo those of other global initiatives, in particular the European Union’s proposed Artificial Intelligence Act (AIA). But unlike the AIA, which is comprehensive, detailed and prescriptive, Canada’s AIDA remains high-level, with critical elements undefined in the legislation.
Intriguingly, it is the Governor in Council (that is, the Cabinet) who has the authority to make regulations concerning a series of questions that will fundamentally determine the character of future AI-related policy for the private sector in Canada.
These questions concern:
- what constitutes “biased output” from an AI system;
- what defines a “high-impact system”;
- measures relating to anonymization of data, AI system risk and its mitigation;
- the assessment of AI system impact;
- the definition of what constitutes material harm;
- rules for engaging independent auditors; and
- rules around the disclosure of information obtained under the AIDA.
The fact that answers to these industry-defining questions are not articulated in the AIDA is likely to be a point of tension in the coming public discourse. For some observers, the decision to embed so many meaningful definitions in future regulations may take on the appearance of a “punt.” For others, it may be interpreted as a calculated effort to build flexibility into a dynamic and evolving policy area. Either way, so long as the term “high-impact system” remains undefined, there can be no meaningful understanding of what this legislation proposes to regulate. At first reading, then, this aspect of the AIDA may be seen as debatable, with puts and takes on either side, or as something more troubling: a set of legally entrenched rules for the regulation of something undefined.
The most curious aspect of the proposed law is also the most foundational thing about it: the overarching governance arrangement. A single ministry, ISED, is proposed as the de facto regulator for AI, responsible for law and policy making as well as administration and enforcement.
This set-up stands in sharp contrast to the case of privacy law, where ISED drafts legislation and associated policy but there are separate entities for administration and enforcement. For example, under Canada’s current private sector privacy law (the Personal Information Protection and Electronic Documents Act), the privacy commissioner acts as a non-partisan and independent ombuds, reporting to Parliament. The privacy commissioner interprets privacy law and policy and investigates cases. Under Bill C-27, the model for ensuring compliance shifts the privacy commissioner from ombuds to enforcer, with broad order-making powers. It also establishes the Personal Information and Data Protection Tribunal to receive recommendations from the privacy commissioner, make final determinations and enforce policy under the proposed Consumer Privacy Protection Act.
In the case of the AIDA, however, ISED drafts, interprets and enforces the legislation. Further, the AIDA states that “the Minister may designate a senior official of the department over which the Minister presides to be called the Artificial Intelligence and Data Commissioner, whose role is to assist the Minister in the administration and enforcement of this Part.” There is no independence from ISED or separation of roles.
So, what does best-in-class governance for AI look like? In 2014, the Organisation for Economic Co-operation and Development (OECD) published a guide, The Governance of Regulators, which stresses the importance of independent regulatory decision making, conducted at arm’s length from the political process in instances where perception of impartiality drives public confidence and where the decisions of the regulator could have a significant impact on particular interests. Tellingly, in its chapter on role clarity, the OECD guide addresses the case of government departments that are service delivery organizations, arguing there is potential conflict of interest when they also oversee the enforcement of standards: “Combining the functions of service delivery or the funding of external providers with enforcement of regulatory standards can also present conflicts, particularly when the same staff carry out both functions and report to the same decision maker, and therefore should be avoided. These conflicts may arise because rigorous enforcement of regulatory standards can affect supply of a government service or delivery costs.”
This consideration is material to ISED, whose mandate includes a commitment to “continue to support the economic growth and recovery of Canada’s traditionally strong industries, including but not limited to automotive, aerospace, natural resources and agri-food, to increase productivity and innovation, and to strengthen the manufacturing base of Canada.”
ISED is a major government player in programs to support industry. Consider, for example, the Global Innovation Clusters program, whereby the Government of Canada, through ISED, is providing approximately $1 billion over five years, matched by industry, to boost innovation and growth. Two of these clusters, Digital Technology and AI-Powered Supply Chains, relate directly to the technology the AIDA proposes to regulate. Moreover, we can expect that AI will penetrate most industries over time. It’s reasonable to ask how a ministry that collaborates with and funds digital and AI-enabled industries can serve as an impartial enforcer of rules governing the design, development and use of that same technology by its constituent clients.
Canada is to be commended for being among the first countries in the world to introduce AI-related legislation, and for constructing a set of draft requirements relating to the development and use of AI systems. The AIDA is still at first reading and we can expect it will undergo revision as it moves through the legislative process. That’s a good thing. In addition to the search for answers to the inevitable questions around the rules and requirements for AI, there should be a healthy and transparent discussion of the institutional arrangements by which the legislation and ensuing policy are to be governed. Obvious questions include:
- Is ministerial authority over policy, administration and enforcement of AI-related activity in the Canadian private sector appropriate?
- Is there an available legal remedy for individuals and entities who want to challenge decisions made by the minister or a ministerial designee, as there would be with privacy-related matters?
- Why has the government proposed a new AI and data commissioner rather than augmented the mandate of the existing privacy commissioner?
- Why has a different governance structure been proposed for the oversight of AI systems in Canada, as opposed to using the structure for privacy law?
These questions reflect the potentially conflicting functions housed within a single government department under the AIDA. They matter both because of ISED’s crucial role as Canada’s AI industry rapidly expands and because of the need for transparency, accountability and access in decision making. Put simply, the dual role of consumer protection and industry development may present, and certainly appears to present, a tricky balancing act for ISED in this context.
ABOUT THE AUTHOR
Mardi Witzel is an associate with NuEnergy.ai and is focused on ESG (environmental, social and corporate governance) and AI governance, and the special challenges facing high-growth firms.
This article first appeared at www.CIGIonline.org.