New artificial intelligence tools have the potential to significantly benefit staffing companies. Among other advantages, they may be able to guide employment decisions, such as hiring employees and optimizing the placement of workers. They might also offer significant economies of scale in managing the large volumes of employees and placement opportunities that smaller employers never have to handle.

Like any promising new technology, however, AI tools are not without legal risk. A wide array of federal, state and local regulators is scrutinizing how staffing companies use these tools to make employment decisions, and momentum is building in favor of increased regulation.

For staffing companies seeking clarity, concrete regulations remain few. But a review of recent activity by federal and state regulators, state legislatures and city councils may help.

The most comprehensive regulation is New York City’s Local Law 144, which the city began enforcing on July 5 and carries monetary penalties for noncompliance. The law regulates the use of “automated employment decision tools,” which are defined as technology “that issues simplified output, including a score, classification or recommendation, that is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons.” Employers using these tools are required to engage an independent auditor to conduct a “bias audit” of the tool, make the results of the audit publicly available, provide notice to each candidate or employee before the tool is used to evaluate or assess the individual, and maintain records about the data collected.

Beyond New York City, many other jurisdictions are regulating or considering regulating the use of AI in employment decisions:

  • Illinois’ Artificial Intelligence Video Interview Act requires employers using AI tools to analyze video interviews to provide notice to and obtain consent from applicants. Applicants may also request that the videos be deleted, and employers’ ability to share the videos is limited.
  • Maryland’s Labor and Employment Code Section 3-717 similarly requires written consent from the applicant if an employer uses a facial recognition service to create a facial template during an applicant’s interview.
  • Legislatures in Massachusetts, California, New York, New Jersey, Vermont and Washington, D.C., are also considering laws regulating the use of AI for employment decisions.

Federal employment agencies are also focusing on how the laws they enforce are impacted by AI-based employment tools. It remains to be seen whether any new federal regulations will impose new burdens on employers or will merely clarify how preexisting legal requirements apply to this new technology.

The Equal Employment Opportunity Commission designated employers’ use of AI as a priority in its latest Strategic Enforcement Plan published in January. In May, the EEOC followed up with a Q&A on AI issues that explains the EEOC’s position that the Uniform Guidelines on Employee Selection Procedures, adopted in 1978, would apply to AI tools used in employment decisions just as they apply to other, more traditional employment screens, such as aptitude or personality tests.

These guidelines can help employers determine if their selection procedures — whether traditional or technology-driven — create an unlawful disparate impact. Similarly, the National Labor Relations Board has explained that employers cannot use AI tools to engage in surveillance of employees that would not be permitted under federal labor law. Employers must understand that they cannot use AI tools to do things in the workplace that would be unlawful when done using more traditional methods.

For ideas on how staffing companies can take action now, please see SIA’s June 2023 article “The AI Quagmire.”

While much of the coming regulation is likely to focus on clarifying how existing legal requirements apply to AI tools, one significant difference between AI tools and older selection tools such as tests and applications is that applicants and employees may have little or no notice that AI tools are being used. This information imbalance may lead to a greater regulatory emphasis on disclosure and transparency, as featured in New York City’s Local Law 144 and Illinois’ Artificial Intelligence Video Interview Act.

As the use of AI evolves, so will regulations. Staffing firms should make it a priority to closely monitor changes at the state and local levels.