As we have previously written, artificial intelligence (AI) and machine learning tools continue to evolve and improve, and organizations are increasingly relying on them for assistance in recruiting, screening, and hiring prospective job candidates in an ever more demanding job market.
While such tools can be invaluable, they can also raise employers’ risk of disparate impact discrimination claims. In fact, in June 2023, the U.S. Equal Employment Opportunity Commission (EEOC) issued guidance on this point, explaining that, “without proper safeguards,” the use of AI in employment decision-making may give rise to claims of civil rights law violations.
In this context, employers should pay close attention to a recent EEOC settlement of a case in which the agency sued a provider of online tutoring services for alleged age discrimination based on the employer’s use of AI tools to make screening or hiring decisions.
Specifically, on August 9, 2023, iTutorGroup, Inc. and the EEOC filed a Joint Notice of Settlement in the U.S. District Court for the Eastern District of New York to resolve age discrimination claims the agency brought against the China-based online tutoring company. Pursuant to the consent decree settling the case, iTutorGroup agreed to pay $365,000 to a class of more than 200 applicants over 55 years old who were allegedly passed over because of their age.
The settlement resolves the claims brought by the EEOC in a May 2022 lawsuit against a group of companies that provide online English-language tutoring services to students in China. The EEOC alleged that iTutorGroup used AI software that screened out female applicants over the age of 55 and male applicants over the age of 60 who applied for online tutoring positions with iTutorGroup. The EEOC further alleged that, in using this software, iTutorGroup automatically rejected more than 200 job applicants in 2020 solely because of their age.
The iTutorGroup lawsuit is part of a broader EEOC push (reflected in the agency's recent guidance on the use of AI) to target and eliminate hiring practices that, among other things, rely on AI tools or machine learning that intentionally exclude or adversely impact protected groups. This is the first settlement the EEOC has reached with a company accused of using AI tools that discriminate against applicants in hiring. As companies around the world increase their use of AI in hiring and other HR-related activities, we expect an increasing number of lawsuits targeting AI hiring bias, whether brought by agencies such as the EEOC or by individual employees through private counsel.
As AI continues to evolve, companies are using these tools in more and more aspects of human resources. According to a recent survey of HR professionals by the Society for Human Resource Management, nearly 80% of companies surveyed use some form of automation or AI in recruitment and hiring. The iTutorGroup lawsuit and settlement should serve as a cautionary tale about the risks of relying on AI tools to make hiring decisions. Though AI tools continue to gain traction in the HR world, it remains incumbent upon organizations to ensure that their hiring practices, including any AI tools used in the hiring process, comply with existing employment laws. This includes conducting disparate impact analyses of employment decisions in which AI plays a role, as illustrated in the sketch below.
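For readers who want a concrete sense of what such an analysis can involve, the sketch below shows one common starting point: comparing selection rates across groups under the "four-fifths" guideline from the EEOC's Uniform Guidelines on Employee Selection Procedures. The group labels, applicant counts, and code structure are illustrative assumptions only, not a prescribed methodology, and any real analysis should be designed with counsel and appropriate statistical expertise.

```python
# Illustrative sketch of a basic adverse (disparate) impact check using the
# "four-fifths" guideline. All group labels and counts below are hypothetical;
# a real analysis may also require significance testing and other methods
# beyond this simple ratio.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants in a group who were selected (e.g., advanced past an AI screen)."""
    return selected / applicants

# Hypothetical screening outcomes by age group (assumed numbers for illustration).
groups = {
    "under_40": {"applicants": 500, "selected": 150},
    "40_and_over": {"applicants": 400, "selected": 60},
}

rates = {name: selection_rate(g["selected"], g["applicants"]) for name, g in groups.items()}
highest_rate = max(rates.values())

for name, rate in rates.items():
    impact_ratio = rate / highest_rate
    flag = "potential adverse impact" if impact_ratio < 0.80 else "within four-fifths guideline"
    print(f"{name}: selection rate {rate:.1%}, impact ratio {impact_ratio:.2f} ({flag})")
```

In this hypothetical, the older group's selection rate falls well below four-fifths of the younger group's rate, which is exactly the kind of result that should prompt closer review of an AI screening tool before it is used to reject applicants.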
We will continue to monitor developments regarding EEOC action on bias in AI-based recruitment, screening, and hiring. In the meantime, please contact your Foley & Lardner Labor & Employment attorney with questions.