
Legal Risks Associated with Automated Hiring Tools in Canada

July 13, 2023 | Employment & Labour Relations Bulletin

On July 5, 2023, New York’s new law regulating automated employment decision tools (“Local Law 144”) came into force. Among other things, Local Law 144 requires employers that use automated employment decision tools to conduct independent bias audits and to notify employees and prospective hires of their use of such tools.[1]

We have received a number of questions about how automated hiring tools are regulated in Canada. This bulletin provides an overview of the attractions and risks of using automated hiring tools, including a summary of two upcoming laws that will have a significant impact on the use of such tools: Québec’s Act 25[2] and Bill C-27[3].

What are Automated Hiring Tools, and What Makes Them Attractive?

For the purposes of this bulletin, we use the term “automated hiring tools” to refer to any tools that assist in hiring decisions, whether or not a human reviewer is involved. There is a great range of automated hiring tools, with varying degrees of human intervention. For example:

  • Targeted job advertisements may use algorithms to determine the best place to advertise job opportunities, which may influence the pool of applicants.
  • Resume screening tools can filter out resumes that do not meet specified requirements, such as minimum qualifications or required keywords (a simplified sketch of this kind of filter appears after this list).
  • Intelligent applicant tracking systems analyze application materials to estimate how a candidate might perform on the job based on keywords, past employee data, or algorithms.
  • AI-powered video interviewing tools advertise the ability to assess candidates based on facial expression analysis.
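
To make the screening category concrete, the following is a minimal, hypothetical sketch of a rules-based resume filter of the kind described above. The field names, keywords and thresholds are invented for illustration and are not drawn from any actual product:

```python
# Hypothetical, simplified sketch of a rules-based resume screen.
# All names, keywords and thresholds below are invented for illustration
# and do not reflect any real screening product.

from dataclasses import dataclass

@dataclass
class Application:
    name: str
    years_experience: int
    resume_text: str

REQUIRED_KEYWORDS = {"python", "sql"}  # illustrative screening criteria
MIN_YEARS_EXPERIENCE = 3

def passes_screen(app: Application) -> bool:
    """Apply hard filters: minimum experience plus required keywords."""
    text = app.resume_text.lower()
    return (app.years_experience >= MIN_YEARS_EXPERIENCE
            and all(kw in text for kw in REQUIRED_KEYWORDS))

applications = [
    Application("Candidate A", 5, "Data analyst with Python and SQL experience."),
    Application("Candidate B", 2, "Junior developer skilled in Python and SQL."),
]

shortlist = [a.name for a in applications if passes_screen(a)]
print(shortlist)  # -> ['Candidate A']; Candidate B is eliminated on experience
```

Even a filter this simple illustrates why the legal questions below matter: the chosen criteria determine who is eliminated without human review, and a poorly calibrated threshold or keyword list can systematically screen out qualified candidates.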

There are two main incentives for adopting automated hiring tools: to increase efficiency and to reduce bias.

Particularly in industries with low barriers to entry, companies may receive hundreds or thousands of job applications—at times too many for the hiring team to review in detail. Some sources have suggested that the volume of job applications will increase as more applicants use free generative AI tools such as ChatGPT to assist in drafting cover letters or resumes.[4] Automated hiring tools can save costs and reduce reliance on arbitrary methods of prioritizing candidates (such as time of application or the first letter of an applicant’s name). Ideally, such tools can rank candidates in a manner that brings the best candidates to the top of the list.

Automated hiring tools also have the potential to reduce bias, and lead to better hiring outcomes, if properly managed. Studies have demonstrated that human hiring decisions are often influenced by unconscious bias.[5] Automated hiring tools may be able to reduce bias significantly (though, as discussed below, without proper management, they may create or reinforce bias as well).

Automated Hiring Tools Come with Risks

Along with their potential benefits, the use of automated hiring tools may give rise to legal risks. For example:

  • Privacy laws may apply to the use of such technology insofar as the systems use personal information.
  • Human rights laws may apply insofar as the use of these systems could lead to discrimination.
  • Requirements related to automated decision tools come into force in Québec on September 22, 2023, under Québec’s Act 25.
  • Additional requirements for automated decision, prediction or recommendation tools are set out in the current draft of the Consumer Privacy Protection Act (“CPPA”), which forms a part of Bill C-27, currently under consideration in Parliament.[6]
  • Finally, Canada’s proposed Artificial Intelligence and Data Act (the “AIDA”), which also forms a part of Bill C-27, may apply to automated hiring tools, insofar as such tools meet the AIDA’s definition of “AI System.” A previous TRC-Sadovod bulletin addressed the timeline and enforcement of the AIDA.

Automated Hiring Tools May Trigger Obligations under Privacy Laws

In Canada, where privacy laws apply to the employment relationship, employers must ensure that their use of automated hiring tools complies with those laws.[7] In particular, employers must obtain valid consent where required, protect personal information with appropriate safeguards, and limit the collection, use and disclosure of personal information to appropriate purposes.

Canadian privacy laws generally require organizations to obtain consent in order to collect, use or disclose personal information. Analyzing an individual’s application materials with an automated hiring tool would likely require consent. Furthermore, Québec’s privacy regulator has previously ruled that the use of an algorithmic prediction system to generate a “score” for an individual constitutes a new collection of personal information, which would also require fresh consent.[8]

There are exceptions to these consent requirements under the Personal Information Protection and Electronic Documents Act (“PIPEDA”), and under the substantially similar privacy laws in British Columbia and Alberta, for any collection, use or disclosure of personal information that is reasonable (under PIPEDA, “necessary”) to establish, manage or terminate the employment relationship. Whether the use of automated hiring tools to assess a candidate would be reasonable or necessary to establish an employment relationship is not clear, and would likely depend on the context. However, the use of an individual’s application materials to train an automated system for future use would likely fall outside these exceptions and require consent.

Automated Hiring Tools May Lead to Exposure Under Human Rights Laws

One significant risk of using an automated hiring tool is the risk of bias and discrimination in the outcome. Bias can lead to worse hiring decisions and to liability under applicable human rights laws, which prohibit discrimination in employment based on protected grounds such as race, ethnic origin, gender identity and age.

Automated hiring tools are only as good as the dataset used to train them. If there is bias in the underlying training data, this bias may be amplified in the results. For example, a system trained on the application materials of past successful candidates may favour the demographic group that is most represented in the workplace. This bias may be difficult to detect, since an automated hiring tool may find other information that can be used as a proxy for demographic group. For example, a machine-learning-based algorithm may prioritize certain candidates based on location (which could act as a proxy for membership in a certain cultural or racial demographic) or language choice (which may indirectly correlate with gender or cultural background).
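
The proxy problem can be illustrated with a short, hypothetical sketch (all data below is invented). A facially neutral feature, here a postal-code prefix, can track a protected attribute so closely that a model trained without the protected attribute still reproduces the disparity:

```python
# Hypothetical illustration of proxy bias. The data is invented; the point
# is that a facially neutral feature (a postal-code prefix) can track a
# protected attribute so closely that excluding the protected attribute
# from training does not prevent a model from learning it indirectly.

import pandas as pd

df = pd.DataFrame({
    "postal_prefix":      ["H2X", "H2X", "H2X", "J4W", "J4W", "J4W"],
    "demographic_group":  ["A",   "A",   "A",   "B",   "B",   "B"],
    "hired_historically": [1,     1,     1,     0,     0,     1],
})

# A near-perfect association between the neutral feature and the protected
# attribute means the feature can act as a proxy for group membership.
print(pd.crosstab(df["postal_prefix"], df["demographic_group"]))

# Historical hire rates split along the proxy value, so a model trained on
# postal_prefix alone would reproduce the disparity between groups.
print(df.groupby("postal_prefix")["hired_historically"].mean())
```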

Where the underlying decision-making formula is too complex or not readily discernible (as with black-box AI systems), it may only be possible to evaluate a system’s biases through an audit performed by professionals. Employers that make adverse hiring and employment decisions based even in small part on protected grounds are unlikely to avoid human rights liability by placing blame on an external automated hiring tool developer. It is therefore essential for employers to ensure their automated hiring tools are carefully and regularly assessed.
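
For context, the independent bias audits required under New York’s Local Law 144 (discussed above) centre on selection rates and impact ratios by demographic category. The following is a minimal sketch of that calculation using invented counts; a real audit would involve far more, including intersectional categories and the treatment of missing data:

```python
# Minimal sketch of one metric a bias audit may compute: selection rates by
# demographic group and impact ratios relative to the highest-rate group
# (the core metrics prescribed for bias audits under NYC's Local Law 144).
# The counts below are invented for illustration.

import pandas as pd

audit = pd.DataFrame({
    "group":    ["A"] * 100 + ["B"] * 100,
    "selected": [1] * 40 + [0] * 60 + [1] * 20 + [0] * 80,
})

selection_rates = audit.groupby("group")["selected"].mean()
impact_ratios = selection_rates / selection_rates.max()

print(selection_rates)  # A: 0.40, B: 0.20
print(impact_ratios)    # A: 1.00, B: 0.50 -- a ratio this low warrants scrutiny
```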

Forthcoming Transparency Laws may Apply to Automated Hiring Tools

The problem of transparency will become more pressing as new privacy laws come into force, particularly transparency requirements under section 12.1 of Québec’s Act respecting the protection of personal information in the private sector (the “Québec Act”) (which comes into force on September 22, 2023), and sections 62 and 63 of the proposed CPPA, which is currently under consideration in Parliament.

Section 12.1 of the Québec Act (September 22, 2023)

When section 12.1 of the Québec Act comes into force, Québec employers that use personal information to render a decision based exclusively on automated processing must inform prospective hires of this fact not later than the time the decision is communicated to them. Upon request, the employer must also inform the employee or applicant of:

  1. the personal information used to render the decision;
  2. the principal factors and parameters that were used to render the decision; and
  3. the right of the individual to have the personal information used to render the decision corrected.[9]

In addition, the employee or applicant must be given an opportunity to submit representations to a person in a position to review the decision.

Importantly, these requirements apply only to decisions rendered exclusively by automated processing. If a human being meaningfully participates in the decision-making process, the requirements will not apply.

One open question is whether this notice requirement will apply where an employer rejects a prospective employee’s job application without notice to the applicant. Based strictly on the wording of the legislation, the requirement would never be triggered if the individual is never informed of the non-hire decision.

Sections 62 and 63 of the Consumer Privacy Protection Act

As a starting point, the CPPA would apply only to a subset of employment relationships. Like PIPEDA (the legislation it would replace), the CPPA would apply only to applicants for work in a federally regulated workplace, or to independent contractors in provinces without substantially similar privacy legislation.[10]

Under the current draft of the CPPA, an “automated decision system” is defined as any technology that assists or replaces the judgment of human decision-makers through the use of a rules-based system, regression analysis, predictive analysis, machine learning, deep learning, a neural network or other technique.[11] If Bill C-27 is passed in its current form, section 62 of the CPPA would require organizations to make readily available information about their use of automated decision systems to make predictions, recommendations or decisions about individuals that could have a significant impact on them.[12]

Furthermore, if an organization uses an automated decision system to make a prediction, recommendation or decision about an individual that could have a significant impact on them, the organization must provide an explanation of the decision upon request by the individual.[13] The explanation must include:

  1. the type of personal information used to make the prediction, recommendation or decision;
  2. the source of the personal information; and
  3. the reasons or principal factors that led to the prediction, recommendation or decision.

The CPPA’s requirements differ significantly from those outlined in section 12.1 of the Québec Act. First, the CPPA has a broader scope, as it covers automated hiring tools that either replace or assist human decision-makers and includes predictions and recommendations in addition to decisions. Thus, it applies to the use of automated hiring tools even when a human is “in-the-loop.”

However, it is narrower than section 12.1 of the Québec Act because it applies only to automated decision systems that could have a significant impact on individuals. While the term “significant impact” is not defined, it is difficult to conceive of a more significant impact than a decision regarding an individual’s employment, so this narrower scope may make little practical difference to how the laws apply to automated hiring tools.

Furthermore, unlike section 12.1 of the Québec Act, which requires individual notifications by employers, the CPPA would require general public disclosure from employers, but would place the responsibility on applicants or employees to request further information about an employer’s use of automated decision systems as it pertains to them. The CPPA would also differ from section 12.1 of the Québec Act in that it would not provide individuals with the right to have the decision (or prediction or recommendation) reviewed.

Automated Hiring Tools Could Be Deemed “High Risk” under the AIDA

If passed in its current form, the AIDA may impose significant obligations on employers using automated hiring tools, if those tools meet the definition of “AI System” under the AIDA.[14]

As we described in our bulletin on the topic, ISED’s companion document to the AIDA specifically identified “screening systems impacting access to services or employment” as an area of interest to the government. Given the severity of the potential impact, the imbalance of economic power between employers and affected individuals, and individuals’ inability to opt out, AI systems used in the hiring process could well constitute “high impact” systems under the AIDA, which would make such systems subject to a host of (yet unspecified) requirements related to human oversight, monitoring, transparency, fairness and equity, safety, accountability, and validity and robustness.[15]

With that said, there are some unresolved questions about the application of the AIDA to employment relationships, given the division of powers between the federal and provincial governments and the wording of the AIDA itself. Most requirements of the AIDA are confined to international or interprovincial trade and commerce. While the AIDA will likely apply to the commercial development and sale of employment-directed AI systems, it is not clear whether it will extend to an employer’s use of such systems in an employment setting.

Key Takeaways

Automated hiring tools present Canadian employers with advantages and risks. These tools offer increased efficiency and the potential to reduce bias in the hiring process. However, they may also give rise to privacy and human rights risks. It is crucial for employers to ensure compliance with privacy laws, obtain necessary consent, and mitigate the risk of discrimination with appropriate audits and legal reviews.

New transparency laws, such as section 12.1 of the Québec Act and sections 62 and 63 of the proposed CPPA, will impose additional requirements on employers using automated hiring tools. Furthermore, the forthcoming AIDA may impose additional obligations on the use of automated hiring tools. Consequences for failing to comply with the above laws could include litigation, regulator investigations, reputational impact, and administrative monetary penalties or fines. Employers should closely monitor legal developments and assess the potential impact of these tools on their hiring practices.

If your business is developing or using automated hiring tools, TRC-Sadovod’s technology and employment law teams are available to help you assess and minimize your legal risks.

[1] Simone R.D. Francis and Zachary V. Zagger, “New York City Adopts Final Rules on Automated Decision-making Tools, AI in Hiring” (April 2023) National Law Review, available here.
[2] An Act to modernize legislative provisions as regards the protection of personal information, SQ 2021, c 25. [Act 25]
[3] Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts, short titled the Digital Charter Implementation Act. [Bill C-27]
[4] See, for example, Morgan Smith, “ChatGPT can help you write a standout CV in seconds, job experts say: It’s ‘the ultimate resume-writing cheat code’” (March 22, 2023), CNBC, online here.
[5] Truc Nguyen, “What’s in a name? We talk to experts about racial bias in hiring and how to work to change it” (September 13, 2018), CBC, online here.
[6] Government Bill (House of Commons), C-27 (44-1) First Reading (June 16, 2022), Part I, the Consumer Privacy Protection Act, available here. [CPPA]
[7] Currently federally regulated employees, independent contractors, and employees in British Columbia, Alberta and Québec are subject to private sector privacy laws, although privacy concepts are applicable to the employment relationship in other provinces through privacy torts (for example, see previous TRC-Sadovod bulletins on Ontario’s false light publicity tort and intrusion upon seclusion tort).
[8] Centre de services scolaire du Val-des-Cerfs (anciennement Commission scolaire du Val-des-Cerfs), Commission d’accès à l’information du Québec, 1020040-S, available here.
[9] Québec Act, s. 12.1, as enacted by Act 25 (in force, September 22, 2023).
[10] CPPA, s. 6.
[11] CPPA, s. 2.
[12] CPPA, s. 62.
[13] CPPA, s. 63(3).
[14] Currently, the definition of AI System is “a technological system that, autonomously or partly autonomously, processes data related to human activities through the use of a genetic algorithm, a neural network, machine learning or another technique in order to generate content or make decisions, recommendations or predictions”: Bill C-27, text as of April 24, 2023, Part 3, s. 2, s.v., “artificial intelligence system”, available here.
[15] Innovation, Science and Economic Development Canada, The Artificial Intelligence and Data Act (AIDA) – Companion document (March 13, 2023), available here.

by Robbie Grant, Ioana Pantis, and David Adjei (Summer Law Student)

A Cautionary Note

The foregoing provides only an overview and does not constitute legal advice. Readers are cautioned against making any decisions based on this material alone. Rather, specific legal advice should be obtained.

© TRC-Sadovod LLP 2023
