AI Hiring Discrimination: The Dangers of Automated Recruiting

AI-Based Hiring Discrimination and Implicit Bias

The Equal Employment Opportunity Commission (“EEOC”) recently reported that approximately 80% of employers are using Artificial Intelligence (“AI”) tools in the recruitment process. These AI tools range from chatbots that contact potential applicants to facial and speech analysis software used during interviews. Companies report that the tools have assisted them in more quickly identifying and hiring suitable candidates. However, using AI in the hiring process has raised legal concerns, particularly where the software relies on algorithmic decision-making that adversely impacts members of a protected class in violation of state and federal anti-discrimination laws.

AI Hiring Tools and Implicit Bias

Although many vendors of AI tools advertise their products as bias-free, the algorithms these systems use are only as good as the programmers who design them and the data used to train them. In other words, if the data fed into the system comes from a company whose hiring practices reflect subconscious biases and preferences, those biases may be reproduced by the AI tool, resulting in hiring discrimination.

For instance, the American Bar Association (“ABA”) reported that Amazon allegedly withdrew an internally developed recruiting tool after discovering that the algorithm disfavored resumes containing the word “women’s” (e.g., an applicant whose resume stated she had participated in a college women’s swim team). Amazon determined that this occurred because the algorithm had been trained on resumes from applicants Amazon had previously hired, and those hires were overwhelmingly male.

AI Hiring Tools May Negatively Impact Disabled Applicants and Employees in Violation of the ADA and FEHA

There is also growing concern that AI tools have a disproportionate negative impact on disabled workers. As such, the EEOC issued guidance on the use of AI tools in the hiring process, identifying three main ways in which AI tools may discriminate against disabled workers in violation of the Americans with Disabilities Act (“ADA”) and California’s Fair Employment and Housing Act (“FEHA”):

1. Reasonable Accommodations: The ADA and FEHA require employers to provide job applicants and employees with reasonable accommodations. A reasonable accommodation is a change in the way things are usually done. For example, a job applicant with limited hand movement due to a disability may have difficulty completing an AI assessment that requires manual input on a keyboard. Under these circumstances, the employer must provide an accessible version of the test as a reasonable accommodation, such as allowing the applicant to respond orally.

2. Screen Out: Screen out occurs when an AI tool gives an applicant a lower score, or rejects the applicant outright, because of their disability. For instance, a chatbot an employer uses to communicate with potential applicants may be programmed to reject anyone with a gap in their employment history, even if the gap resulted from time spent recovering from or receiving treatment for a disability. Similarly, employers are increasingly using video interviewing software that analyzes speech patterns but cannot accurately score an applicant with a speech impediment. In both cases, the chatbot and the interviewing software would screen out individuals because of their disabilities, resulting in hiring discrimination.

3. Disability-Related Inquiries: The ADA and FEHA prohibit employers from making disability-related inquiries (questions designed to elicit information about an individual’s disability) during the hiring process. As such, AI tools should not ask questions such as “Are you on prescription medications?” or “Do you have any physical impairments?”

Recommendations for Workers Who Need Accommodations or Believe AI Tools Have Unfairly Rated Them

If you believe that your medical condition could negatively affect the results of an evaluation performed by an AI hiring tool, you should inform the prospective employer or the software vendor of your condition and request an accommodation. Similarly, if you receive a poor rating that you believe is due to a disability or other protected characteristic, you should contact the employer or vendor and request a reassessment. You may also contact an employment lawyer to evaluate your case and/or file a complaint with the Civil Rights Department.

Conclusion

For more information about hiring discrimination, please visit my practice areas page titled Disability Discrimination.