More employers are using automated recruiting tools to manage the volume of incoming resumes. These tools sort through resumes, search for keywords, and identify potential candidates for posted jobs, among other tasks. Concerns have been raised, however, that such automated tools may inadvertently discriminate against job applicants in violation of various employment discrimination laws.
In May, the Equal Employment Opportunity Commission (“EEOC”) issued guidance on business use of artificial intelligence (“AI”) tools for recruiting and other employment-related purposes. And this week, a New York City law seeking to reduce potential bias in automated hiring (the New York City Automated Employment Decision Tools (“AEDT”) Law) went into effect. At its core, the law requires employers who use AI tools to assist with employment decisions, including hiring, to audit these tools for potential bias and provide certain public notices that they are being used. Other localities are likely to follow suit and implement similar requirements.
The implementing rules were only recently finalized, and the law is highly technical. In general, however, an AEDT is a program that:
- uses AI-type technology (machine learning, modeling, analytics, or AI)
- to generate simplified output (a score, classification, or recommendation)
- that is used to drive hiring decisions, overriding or outweighing other factors
Examples of AEDTs include resume-screening software, applicant tracking systems, and interview bots (chatbots that conduct initial screening interviews).
There are nuances to this definition, so employers of all sizes may need to ask probing questions of their technology vendor and possibly consult with counsel who can take a closer look at the law and technology to see what is covered under the NYC law.
The NYC law carries potentially large penalties for employers who fail to comply with its requirements. It is therefore vital that employers familiarize themselves with the legal requirements.
Here are four steps companies can take to understand and manage the risks of using AI tools for recruiting and other employment decisions:
- Identify the software tools the company relies on for recruiting or other employment purposes and understand how each is being used
Most companies, even smaller ones, are using recruiting tools that rely on some amount of automation. After all, the whole point of using a recruiting tool is to maximize efficiency for the HR team.
- Determine whether the technology is an “Automated Employment Decision Tool” (AEDT)
A web search isn’t going to tell you whether your recruiting tool is an AEDT. Consult with employment counsel and your technology vendor to determine whether the software falls under the NYC law’s ambit.
- If one or more of the company’s automated tools meet the definition of an AEDT, engage an independent auditor to conduct the required annual “bias audit”
Since there is some nuance associated with defining the scope of the audit and documenting this decision, companies may want to consult with counsel to better understand this technical aspect of the rules.
- Understand the company’s notice obligations and ensure compliance
The law and rules set out detailed notice requirements regarding the company’s use of these automated tools. Consult with counsel to ensure your notices comply with these technical requirements, and review every location where the notices are posted to confirm they appear everywhere applicants will see them.
The NYC law and EEOC guidance make clear that while new technology carries the promise of simplifying and streamlining business operations, there are costs and risks that need to be considered. Employers should understand the technology and applicable legal requirements so they can make informed decisions.