Warning to recruiters: Your world is becoming a mess. First, some AI-powered digital screening software is being accused of discriminating against candidates based on race, disability, and other factors.
Second, job applicants are using ChatGPT and other software to rewrite their resumes, making resumes look nearly identical. Some candidates use robo-application bots to apply to hundreds of jobs in seconds, along with software that answers automated job interview questions for them. Companies are fighting back with more software and clever bot-tricking questions.
Workday complaint
In February 2024, as reported by Reuters, Workday, a large cloud-based HR software company that has contracts with 26 percent of the Fortune 500, is facing renewed claims that its artificial intelligence tools discriminate against job applicants at many of the major companies it contracts with, in what could be one of the first cases to address the novel legal issues raised by employers’ increasing reliance on AI-powered hiring software.
Derek Mobley, who says he has been turned down for more than 100 jobs he applied for using Workday’s platform, filed an amended complaint in San Francisco federal court on Tuesday after U.S. District Judge Rita Lin dismissed his original lawsuit last month. Mobley, who is Black, over 40 years old, and has anxiety and depression, says in the new complaint that by using Workday’s platform, employers are essentially handing over their authority to make hiring decisions to the company.
“Because there are no guardrails to regulate Workday’s conduct, the algorithmic decision-making tools it utilizes to screen out applicants provide a ready mechanism for discrimination,” Mobley’s lawyers wrote in the complaint.
The company has denied wrongdoing and said when the lawsuit was filed that it engages in an ongoing “risk-based review process” to ensure that its products comply with applicable laws.
ACLU filed complaint against Aon
On May 30, the American Civil Liberties Union filed a complaint with the Federal Trade Commission (FTC) against Aon, a major hiring technology vendor, alleging that its products discriminate and violate consumer protection laws.
The ACLU complaint alleges that two Aon products, a “personality” assessment test and an automated video interviewing tool, measure general personality traits, such as positivity, emotional awareness, and liveliness, that are not always job-related or necessary for most jobs. In doing so, the ACLU says, the tests penalize people who may have Autism Spectrum Disorder or mental health conditions such as depression and anxiety.
The ACLU alleges that the personality test is marketed to employers across industries as cost-effective, efficient, and less discriminatory than traditional methods of assessing workers and applicants across many job roles.
The ACLU alleges that this creates a high risk that qualified workers with these disabilities will be disadvantaged compared to other workers and may be unfairly and illegally screened out. The automated features of these tools exacerbate these fundamental problems, according to the ACLU, particularly as Aon incorporated artificial intelligence elements in its video interviewing tool that are also likely to discriminate based on disability, race, and other protected characteristics.
It will be interesting to see the rulings in these two AI-screening cases. These cases remind me of an earlier round of AI-based bias. In 2018, Amazon famously had to scrap its AI recruiting tool when the company discovered it was biased against women. A 2017 study found that discrimination is so prevalent that minorities often actively “whiten” their resumes and are subsequently more successful in the job market.
The mess with digital technology: bot versus bot
Lindsay Ellis of the Wall Street Journal reports that job seekers frustrated with AI screening software are using AI to rewrite their cover letters and resumes and using new automated bots to robo-apply to hundreds of jobs in seconds. For example, an AI job-hunting tool called Sonara, for $80 a month, finds jobs, answers questions as though it were the candidate, and applies to as many as 370 positions for a job candidate each month. Some job openings now draw about 1,200 applicants in two hours. Another result is that many applications use phrases and sentences that are almost identical to one another.
In response, companies are deploying more AI screening tools and setting up booby traps for bots. One booby trap tricks a bot into answering a question that is not required. The trick begins with this sentence: “If you are reading this, awesome – do not answer this question and hit OK to move on to the rest of the questions.” Bots answer the question; human readers skip it and move on.
The result of all of this, Ellis writes, “is a bot versus bot war that’s leaving both applicants and employers irritated and has made the chances of landing an interview, much less a job, even slimmer than before.”
AI can be helpful in recruiting
Numerous surveys have found that roughly 80 percent of U.S. employers, and virtually all Fortune 500 companies, use AI in the hiring process. That includes using software made by Workday and other firms that can review large numbers of job applications and screen out applicants for a variety of reasons.
Government agencies and worker advocates have expressed concerns that AI tools can discriminate against job applicants when they are built using data that reflects existing biases. The Equal Employment Opportunity Commission, which enforces laws banning workplace bias, has warned employers that they can be held legally liable if they fail to prevent screening software from having a discriminatory impact.
Companies have used applicant tracking systems (ATS) for decades to screen applicants for the skills, applicable tools, education, and experience required for jobs. ATS can also accelerate communications with applicants by automatically generating rejection letters and scheduling interviews. The more advanced ATS can even recommend other jobs that candidates appear to be qualified for.
It is when ATS cross the bridge from screening for objective factors to using personality assessments, as is alleged in the Aon case, that they can get into trouble.
Even without the biases of poorly developed personality assessments and ATS, Blacks face discrimination from recruiters and hiring managers. Economists from the University of California, Berkeley, and the University of Chicago in 2021 released the results of a discrimination audit of 108 companies. The audit revealed that entry-level applications from candidates with a “Black name” get fewer callbacks than similar applications bearing a “white name.” The resumes were identical except for the names. The good news is that the researchers found that employers with centralized human resources handling job applications tend to discriminate less, suggesting that uniform procedures can help reduce racial and gender bias.
How to reduce human and AI-based biases
Below are seven steps recruiters and companies can take to reduce bias and the recruiting mess. The first three address the bot versus bot wars. The last four address biases the interviewer may have.
Number 1. Validated Assessments
Only use validated assessments, meaning the vendor has conducted a statistical analysis showing the assessment does not have biases based on gender, race, ethnicity, disability, or other protected classes. If an assessment vendor cannot provide you with this data in a white paper, move on to another vendor.
Number 2. Run the numbers looking for bias
Periodically, go through your affirmative action plan data to determine whether your screening disproportionately rejects members of protected classes, and then make adjustments.
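One common way to run these numbers is the EEOC's four-fifths (80 percent) rule: if a group's selection rate falls below 80 percent of the highest group's rate, that is generally treated as evidence of adverse impact. Below is a minimal sketch in Python; the group names and applicant counts are made up for illustration.

```python
def selection_rate(selected, applicants):
    """Fraction of a group's applicants who passed the screen."""
    return selected / applicants

def adverse_impact_check(groups, threshold=0.8):
    """Apply the four-fifths rule: flag any group whose selection rate
    is below `threshold` (80%) of the highest group's selection rate.
    `groups` maps a group name to a (selected, applicants) pair."""
    rates = {g: selection_rate(s, a) for g, (s, a) in groups.items()}
    best = max(rates.values())
    return {g: {"rate": round(r, 3),
                "ratio": round(r / best, 3),
                "flag": r / best < threshold}
            for g, r in rates.items()}

# Illustrative (made-up) screening counts per applicant group
results = adverse_impact_check({
    "group_a": (48, 100),   # 48% pass rate (highest)
    "group_b": (30, 100),   # 30% pass rate -> 0.625 of the best, flagged
})
```

If any group is flagged, the screening criteria, and any AI model behind them, should be reviewed and adjusted before the next hiring cycle.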
Number 3. Determine if you need to put in place job applicant booby traps
As noted above, watch for surges in applicant volume per job in your ATS and determine whether you need to plant booby traps to screen out bots that answer questions for job candidates.
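The decoy-question trick described above boils down to a simple check: humans are told to skip the question, so any submission that answers it is suspect. Here is a minimal sketch, assuming application answers arrive as a dictionary; the field name `decoy_question` is purely illustrative.

```python
DECOY_FIELD = "decoy_question"  # hypothetical field; instructions tell humans to skip it

def looks_like_bot(answers: dict) -> bool:
    """Flag a submission as likely bot-generated if the decoy question
    contains any non-blank answer."""
    return bool(answers.get(DECOY_FIELD, "").strip())

# A human leaves the decoy blank; a bot dutifully answers everything.
human_application = {"decoy_question": "", "why_this_role": "I love the mission."}
bot_application = {"decoy_question": "Yes, I am detail-oriented.",
                   "why_this_role": "I love the mission."}
```

Flagged applications can be routed to a manual review queue rather than rejected outright, since a distracted human occasionally answers the decoy too.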
Number 4. Project-based assessments. Michael Li, writing in the Harvard Business Review, recommends that companies deemphasize traditional resume screening in favor of project-based assessments, which he argues are fairer and more accurate.
Smart companies are starting to embrace more objective interviewing techniques, Li observes. Chief among these are project-based assessments. While the exact parameters vary, project-based assessments in AI and data science typically ask a candidate to clean and analyze some real-world data and write a short report of their findings. Some are more directed assessments, Li writes, while others are more open-ended. Some are take-home, while others are administered during an onsite interview. Regardless of their style, they ask candidates to demonstrate their own abilities, rather than just claim them.
I have used project-based assessments in hiring sales and marketing leaders. With the former, we gave candidates a real-world situation of a poor performing sales region, complete with facts and stories, and then asked them for a 90- and 180-day plan to turn around the sales territory. With marketing, we gave candidates data and then asked them to create a marketing presentation and present it to the hiring team, who would play prospective customers.
I have other suggestions for companies wanting to take bias out of their recruiting.
Number 5. Look at your job postings. Your job postings may be your worst nightmare, scaring off applicants with jargon and needless clutter and driving away female applicants. If your job posting is more than 250 words and uses gender-biased wording, it is time for an overhaul. Learn more here.
Number 6. Use structured interviews. Despite 100 years of empirical evidence (yes, 100 is correct), many companies do not put in the time and effort to set up effective interviewing techniques that would significantly improve their ability to hire top talent. Instead, they rely on casual, poorly prepared interviews (academic researchers call them “unstructured interviews”) that are heavily affected by first impressions and unconscious bias. Not a good way to hire.
A structured interview is based on the knowledge, skills, abilities, and competencies needed to perform the job and on alignment with company values. With structured interviews, the company creates questions to ask each candidate to determine whether they have the education, experience, technical skills, social skills, and emotional intelligence to do well on the job. Structured interviews can significantly reduce bias in interviewing. Learn more here.
Number 7. Use blind assessments and auditions. AI technology exists to find job candidates based on objective criteria and to blind candidates’ identities, race, ethnicity, and gender to prevent human bias during resume screening. AI-powered tools, such as ThisWay Global, can find job candidates on the internet using objective criteria and present the top 200 candidates blinded from the recruiter, so the recruiter cannot tell their race, ethnicity, or gender and must choose candidates based on matching job criteria, similar to the orchestra example below. ThisWay Global has trained its AI to remove bias so you can benefit from a diverse talent pool. Learn more here.
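At its core, blinding means stripping identifying fields from a candidate profile before a recruiter sees it. Here is a minimal sketch of that idea, not ThisWay Global's actual method; the field names are illustrative.

```python
# Illustrative identifying fields to remove before recruiter review
BLIND_FIELDS = {"name", "gender", "ethnicity", "photo_url"}

def blind_profile(profile: dict) -> dict:
    """Return a copy of a candidate profile with identifying fields removed,
    leaving only job-relevant criteria for the recruiter to evaluate."""
    return {k: v for k, v in profile.items() if k not in BLIND_FIELDS}

candidate = {
    "name": "A. Example",        # made-up candidate
    "gender": "F",
    "skills": ["SQL", "Python"],
    "years_experience": 6,
}
blinded = blind_profile(candidate)  # keeps only skills and years_experience
```

The recruiter then ranks candidates on the remaining job-relevant fields alone, which is exactly the logic behind the orchestra screens described next.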
An unlikely parallel exists in, of all places, classical music. In the 1970s and 1980s, historically male-dominated orchestras began changing their hiring procedures. Auditions were conducted blind, with a screen placed between the candidate and the judging committee so that the auditioner's identity could not be discerned; only the music was judged. The effects of this change were astounding: Harvard researchers found that women advanced 1.6 times more often in blind auditions than in non-blind ones, and the share of female players in the orchestras increased by 20 to 30 percentage points.
Recruiting has become a mess. Not that it was ever straightforward, bias-free, friendly, and error-free, but AI technology has muddied the waters further. Sadly, companies that want to hire great candidates without bias, and job candidates looking for their ideal employers, must navigate a digital minefield to get to a hopefully unbiased interview.
About Victor
Victor Assad is the CEO of Victor Assad Strategic Human Resources Consulting and Managing Partner of InnovationOne, LLC. He works with organizations to transform HR and recruiting, implement remote work, and develop extraordinary leaders, teams, and innovation cultures. He is the author of the highly acclaimed book, Hack Recruiting: The Best of Empirical Research, Method and Process, and Digitization. He is quoted in business journals such as The Wall Street Journal, Workforce Management, and CEO Magazine. Victor has partnered with The Conference Board on innovation research. Subscribe to his weekly blogs at http://www.VictorHRConsultant.com
