Recruiters, TA, and HR pros, here’s what you need to know about the NYC AI Hiring Law.

What is the NYC AI hiring law?

New York City’s Department of Consumer and Worker Protection (DCWP) now requires annual bias audits if an organization uses an “Automated Employment Decision Tool” (AEDT) to “substantially assist or replace discretionary decision making” in hiring decisions.

They define an AEDT as “any computational process derived from machine learning, statistical modeling, data analytics, or artificial intelligence that issues simplified output, including a score, classification, or recommendation.” The full text of the law can be found here.

Who is subject to the NYC AI hiring law?

The law applies only to employers and employment agencies that use an AEDT “in the city”. This means:

  • The job is based, at least part-time, in an office in NYC, OR
  • The job is fully remote, but the location associated with it is an office in NYC, OR
  • The employment agency using the AEDT is located in NYC, or, if the agency is located outside NYC, one of the conditions above is true.

Is Glider AI considered an AEDT?

No, Glider is not an AEDT. This law is aimed at preventing automated hiring decisions, without human intervention, based on potentially biased AI models. No Glider product, module, or functionality makes or enables automated hiring decisions by any method. Glider’s platform complements other screening inputs such as relevant education, work experience, and interview responses, among other potential considerations, which ultimately require a human to interpret and make outcome decisions. Even the Glider platform itself requires humans to make final candidate disposition decisions.

How does Glider AI use artificial intelligence?

Artificial Intelligence in Glider AI is specific to AI Proctoring and automating basic and repetitive tasks like note-taking. Any form of candidate shortlisting, declination, or hiring decisions requires a manual review of assessment results and human intervention to determine candidate disposition.

Glider AI’s advanced AI proctoring detects suspicious actions during a skill assessment, and the interviewing team receives a summary highlighting any concerns. Candidates taking Glider skill assessments get complete transparency about permissible and non-permissible behaviors.

What if NYC requests bias audit results?

Your company may use other enabling technologies that New York City deems applicable to Local Law 144, prompting a request for your bias audit results. We anticipate that there will be some initial imprecision and inconsistency with how AEDT determinations are made in 2023 and 2024 until officials are more experienced in enforcement and more familiar with talent acquisition technologies.

Help with the NYC AI hiring law?

If a New York City official requests audit results specific to the use of Glider, contact Glider’s Vice President of Operations, Ben Walker, at ben.walker@glider.ai. Glider will collaborate with your company to conduct a joint demonstration of the Glider platform for NYC officials to resolve any misinterpretation that Glider is an AEDT or that your use of Glider subjects you to Local Law 144.

What’s the future of AI Recruiting?

With AI so pervasive in business applications and evolving at a pace that’s hard to keep up with, regulations like the NYC AI Hiring Law will continue to emerge and evolve to ensure AI is used fairly and equitably. Glider embraces sound legislation aimed at protecting individuals and groups from potential AI bias and ensuring the ethical application of AI in software. The use of AI recruiting will continue to expand, and with it, unforeseen challenges.

While AI offers many benefits, we must critically evaluate and fully understand all possible unintended consequences, no matter how unlikely. Glider AI will continue providing full transparency to customers and candidates wherever AI is used in our platform, and fulfilling our motto: “Quality first, bias never, integrity always.”
