On May 12, 2022, the U.S. Equal Employment Opportunity Commission (EEOC) released a technical assistance document covering how algorithmic hiring tools may disadvantage applicants with disabilities. In brief, algorithmic hiring tools are software-based screening tools, often powered by artificial intelligence, that help employers save time and effort during the application process. Even when labeled bias-free, however, these tools are not perfect and can lead to discrimination. The technical assistance document warns employers and hiring managers of the potential perils of algorithmic hiring tools and offers tips on complying with the Americans with Disabilities Act (ADA). Earlier EEOC guidance educated employers on how the ADA covers individuals undergoing treatment for opioid use disorder.
Background on Algorithmic Hiring Tools
Employers have a variety of modern, computer-based tools available to help them screen applicants, hire workers, and determine pay. These tools allow a faster, more convenient application and hiring process, and vendors often claim they increase objectivity and decrease bias. Algorithmic hiring tools may consist of one or more of the following:
- Software – automatic resume-screening tools, hiring software, chatbots (computer programs that simulate a conversation without the need for a live person), and video-interviewing software.
- Algorithms – sets of computer instructions that process and evaluate data, rate applicants, and screen them based on predetermined parameters.
- Artificial Intelligence (AI) – a machine-based system that can make predictions or recommendations based on the input it receives.
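To make the screening mechanism concrete, here is a purely hypothetical sketch of how such a tool might work. The criteria, weights, and cutoff below are invented for illustration and are not drawn from any real vendor's product; the point is how a rigid, parameter-based rating can exclude an otherwise qualified applicant:

```python
# Hypothetical, simplified resume-screening algorithm for illustration only.
# All criteria, weights, and the cutoff are invented assumptions.

def score_applicant(applicant: dict) -> float:
    """Rate an applicant against fixed, programmable parameters."""
    score = 0.0
    score += min(applicant.get("years_experience", 0), 10) * 3  # capped at 10 years
    score += 20 if applicant.get("degree") else 0
    # A rigid criterion like typing speed can penalize an applicant whose
    # disability affects this metric but not the essential job functions.
    score += min(applicant.get("typing_wpm", 0), 60) * 0.5
    return score

def screen(applicants: list[dict], cutoff: float = 50.0) -> list[dict]:
    """Advance only applicants at or above the cutoff, "screening out" the rest."""
    return [a for a in applicants if score_applicant(a) >= cutoff]

applicants = [
    {"name": "A", "years_experience": 8, "degree": True, "typing_wpm": 70},
    {"name": "B", "years_experience": 8, "degree": True, "typing_wpm": 10},
]
advanced = screen(applicants)
```

In this toy example, applicant B has experience and education identical to applicant A's but falls below the cutoff on typing speed alone, illustrating how a single inflexible parameter can screen out someone who might perform the job's essential functions with a reasonable accommodation.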
Information on Algorithmic Hiring Tools and the ADA
The EEOC’s technical assistance document advises employers who use algorithmic hiring tools on how to ensure that they do not inadvertently discriminate against applicants with qualifying disabilities. According to the document, the most common ways employers who use algorithmic hiring tools may violate the ADA are:
- When the employer does not consider a reasonable accommodation that may be necessary for the algorithm to rate an applicant with a disability fairly and accurately.
- When the employer relies on an algorithmic hiring tool that "screens out" an applicant because of a disability, meaning it lowers the applicant's rating on certain criteria enough to cost them the job. This may occur even if the applicant could have performed the essential functions of the job with a reasonable accommodation.
- Adopting an algorithmic hiring tool that asks applicants illegal disability-related questions or requests medical examinations.
How Employers May Reduce the Chances of ADA Discrimination
Employers can take several actions to reduce the chance that the software inadvertently screens out an applicant with a disability. First, employers can ask the software vendor whether the tool was developed with individuals with disabilities in mind. Second, employers can take steps to ensure the software doesn't screen out applicants with disabilities. These may include:
- indicating to users that reasonable accommodations, like alternative formats and tests, are available;
- including clear instructions for requesting accommodation; and
- being transparent with all applicants about the criteria the software uses to measure and evaluate them, including how a disability may lower assessment results.
In addition to the aforementioned best practices, employers should also train staff on ADA compliance during the hiring process. Hiring staff should recognize and expeditiously process requests for reasonable accommodation. Finally, staff should develop alternative means for rating job applicants whenever the current process is inaccessible or disadvantageous to an applicant with a disability.