The New York City Department of Consumer and Worker Protection (DCWP) recently adopted final rules on New York City’s (NYC’s) new law (Local Law 144) regulating Automated Employment Decision Tools (AEDTs). NYC employers and employment agencies that use artificial intelligence (AI) decision-making tools in hiring or promotion decisions should understand the new law and final rules. As employers throughout the U.S. begin to implement AI tools in their employment decision-making processes, states and local municipalities are racing to regulate the practice. Federal agencies have also begun examining the implications of such technology. In January 2023, the U.S. Equal Employment Opportunity Commission (EEOC) held a public hearing to discuss the implications of AI in employment.

New York City’s Local Law 144

Under Local Law 144, NYC employers may not use an AEDT in hiring or promotion unless they take specific steps to audit the tool and notify those it affects. These steps include:

  • performing an independent bias audit of the tool no more than one year before its use,
  • making the results of the audit open to the public,
  • informing employees and candidates that an AEDT will assist in an assessment or decision, and
  • providing instructions on how to request alternative evaluation methods or a reasonable accommodation.

Final Rules Expand the Definition of AEDTs

Local Law 144, however, did not cover what qualifies as an AEDT, what is required during an independent bias audit, or how employers can fully comply. The DCWP’s final rules answer these questions and offer clarity in implementing the law. Specifically, the final rules expand on Local Law 144’s original definition of AEDTs and provide guidance for independent bias audits. Under the law, an AEDT is a “computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation.” The law covers those tools meant to “substantially assist or replace” discretionary employment decision-making. The final rules, however, expand on the source of the computational process to include mathematical, computer-based techniques that:

  • generate a prediction, expected outcome, or classification based on aptitude; and
  • identify inputs, assign relative importance to those inputs, and use other parameters to improve the accuracy of the prediction or classification.

Clarifying Independent Bias Audits

Furthermore, employers must subject the tool to an independent bias audit to evaluate any disparate impact on protected classes. Under the final rules, independent bias audits must:

  • calculate the selection rate for each demographic category;
  • reveal the impact ratio for each demographic;
  • separately calculate the impact ratio for sex, race/ethnicity, and the intersectional categories of sex, race, and ethnicity;
  • make all calculations for each classification group (e.g., leadership style, core strengths, etc.); and
  • reveal the number of individuals assessed who did not fall into a specific category.

Additionally, an independent bias audit may exclude a demographic category that represents less than 2% of the data used to calculate the impact ratio. The audit’s results must then provide a justification for the exclusion, the number of applicants excluded, and their scoring rate. As part of the DCWP’s final rules, the agency delayed enforcement of Local Law 144 until July 5, 2023.
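
The core audit arithmetic described above can be sketched in a few lines of Python. Everything here is illustrative: the applicant records, category labels, and the way the 2% threshold is flagged are hypothetical stand-ins, not the DCWP’s prescribed audit format.

```python
from collections import Counter

# Hypothetical applicant records: (demographic category, selected?)
applicants = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
    ("group_c", True),
]

totals = Counter(cat for cat, _ in applicants)
selected = Counter(cat for cat, sel in applicants if sel)

# Selection rate: fraction of each category's applicants who were selected.
selection_rates = {cat: selected[cat] / totals[cat] for cat in totals}

# Impact ratio: each category's selection rate divided by the
# highest selection rate among all categories.
best_rate = max(selection_rates.values())
impact_ratios = {cat: rate / best_rate for cat, rate in selection_rates.items()}

# The final rules permit excluding a category that represents less than
# 2% of the data, provided the audit justifies the exclusion and reports
# the number excluded and their scoring rate.
n = len(applicants)
excludable = {cat for cat, count in totals.items() if count / n < 0.02}
```

With the sample data, group_a’s selection rate is 2/4 = 0.5 and group_c’s is 1/1 = 1.0, so group_a’s impact ratio is 0.5; no category falls below the 2% threshold. A real audit would repeat these calculations separately for sex, race/ethnicity, and their intersections, and for each classification group the tool produces.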