Have you ever carefully crafted a job application for a position you’re sure you’re a perfect fit for, but never received a response? It’s highly likely that no one saw your application, even if you followed the Internet’s advice to copy and paste all the skills from the job description.
Employers, especially large companies, are increasingly using artificial intelligence (AI) tools to rapidly narrow applicant pools down to short lists that inform hiring decisions. One of the most widely used AI tools is the applicant tracking system (ATS), which can filter and rank candidate resumes against an employer’s criteria before a human recruiter ever looks for the best matches.
And systems are getting smarter: Some AI companies claim their platforms can not only identify the most qualified candidate, but also predict which one is most likely to excel in a given role.
“The first thing workers need to understand is: no one is looking at your resume. You have to get past the AI gauntlet before they see you,” says Joseph Fuller, professor of management practice at Harvard Business School.
While AI recruiting tools can save companies time and money when all they want to do is fill a position, experts warn that the platforms can miss qualified candidates and even introduce new biases into recruitment processes if not used carefully.
Meanwhile, job seekers often don’t know exactly which AI tools are being used or how their algorithms work, leading to frustrated searches for advice on how to “beat” recruiting software, much of which only scratches the surface.
AI can ‘hide’ potential workers
Last year, Fuller co-authored research on “hidden workers,” applicants whom companies overlook because of their hiring practices, including the use of AI tools.
The researchers interviewed more than 2,250 executives from the United States, the United Kingdom, and Germany. They found that more than 90 percent of companies used tools like ATS to initially screen and rank candidates.
But often they were not using these tools well. Candidates were sometimes screened against inflated job descriptions filled with unnecessary and inflexible criteria, leaving some qualified applicants “hidden” beneath others the software deemed more suitable.
Depending on how the AI was set up, it could rank or filter candidates based on factors such as a gap in their professional history or lack of a college degree, even when the position did not require a post-secondary education.
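To make that concrete, here is a minimal, purely hypothetical sketch of how a naively configured screening filter could exclude candidates on criteria like employment gaps or degrees, even when a role doesn’t require them. The field names and thresholds are illustrative assumptions, not any real vendor’s system.

```python
# Hypothetical sketch: a naive ATS-style pre-filter. All names,
# fields, and cutoffs here are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    has_degree: bool
    months_gap: int  # longest gap in work history, in months

def naive_ats_filter(candidates, require_degree=True, max_gap_months=6):
    """Drop anyone without a degree or with a long employment gap,
    regardless of whether the role actually needs either criterion."""
    return [
        c for c in candidates
        if (c.has_degree or not require_degree)
        and c.months_gap <= max_gap_months
    ]

pool = [
    Candidate("A", has_degree=True, months_gap=0),
    Candidate("B", has_degree=False, months_gap=2),   # skilled, no degree
    Candidate("C", has_degree=True, months_gap=18),   # e.g. caregiver gap
]

shortlist = naive_ats_filter(pool)
# Only candidate A survives; B and C become "hidden workers"
# before a recruiter ever sees them.
```

Loosening the configuration (for example, `require_degree=False`) changes who surfaces, which is why the researchers stress how these systems are set up.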
“Over 90 percent of companies just admit, ‘We know this process excludes qualified people,'” Fuller told CBC News.
Those overlooked candidates included immigrants, veterans, people with disabilities, caregivers and neurodiverse people, among others, he added.
The researchers urged employers to write new job descriptions and configure their AI to include candidates whose skills and experiences meet the basic requirements of a position, rather than excluding them based on other criteria.
The new rules of AI hiring
The US government has issued guidance for employers on the potential of automated recruitment software to discriminate against candidates with disabilities, even when the software is claimed to be “bias-free.”
And starting in April of this year, employers in New York City will have to inform candidates and employees when they use AI tools in hiring and promotion, and audit those technologies for bias.
While Canada’s federal government has its own policy on the use of AI, there are no rules or guidance for other employers, although legislation currently before Parliament would require creators and users of “high-impact” AI systems to take measures to mitigate harm and bias (details of what counts as “high-impact” AI have not yet been spelled out).
So for now, it’s up to employers and their hiring teams to understand how their AI software works and the potential downsides.
“I advise HR professionals to do their research and have open conversations with their vendors: ‘OK, so what’s in your system? What does the algorithm look like? … What is it tracking? What does it allow me to do?’” said Pamela Lirio, associate professor of international human resource management at the Université de Montréal.
Lirio, who specializes in new technologies, says it’s also important to question who built the AI and what data it was trained on, pointing to the example of Amazon, which in 2018 scrapped its internal recruiting AI tool after finding it was biased against female job applicants.
The system had been trained on the resumes of previous applicants, who were mostly male, so the AI taught itself to rank applicants lower if their resumes mentioned competing in women’s sports leagues or graduating from women’s colleges.
As AI gets smarter and more attuned to the types of candidates an employer likes, based on who it hired in the past, companies risk replicating Amazon’s mistake, says Susie Lindsay, adviser to the Law Commission of Ontario, which has studied the possible regulation of AI in Canada.
“If you’re just going to use a recruiting tool to look at resumes, or even a tool to decide which of your current employees to promote, and you’re looking at who’s been successful to date, you’re not … giving people who don’t exactly fit that model a chance to potentially move forward,” Lindsay said.
Can you really ‘beat’ hiring bots?
Do a web search for “how to beat ATS” and you’ll find thousands of results, including from professional resume writers and online tools offering advice on filling your resume with the right keywords to get past the AI and onto a recruiter’s desk.
But keywords are just one of many data points used by increasingly advanced AI systems. Others include the names of companies you have worked for in the past, how far along you are in your career, and even how far you live from the organization you are applying to.
“With a proper AI system that is able to understand the context of a skill and the relationships between skills, [keyword-stuffing] is just not as fruitful as it used to be,” says Morgan Llewellyn, chief data scientist at recruiting technology company Jobvite.
Instead of trying to fool the algorithm, experts recommend applying for jobs that match the skills, knowledge, and experience you actually have, keeping in mind that a human being always makes the final decision.
“Even if you put in this keyword, OK, well, what have you done? What was the function of your job, the title of the jobs you’ve held in the past?” says Robert Spilak, vice-president of ATS provider TalentNest.
“You must meet the requirements [of the role]. If you don’t meet any of them, then by all means [a] human or some automation will filter you out.”