Why Recruiting is Broken: AI & Automation in the Hiring Process

Damien Filiatrault
Founder & CEO

Recruiting is broken. I don’t say that to be dramatic – I truly believe it. Through traditional job boards, it’s just too easy for candidates to apply for positions they may not be suited for – or even have the qualifications to do. The resulting avalanche of applications and resumes makes it incredibly difficult to separate high-quality candidates from poorly suited ones, and hiring a single candidate in this manner can take days or weeks longer than it should.


AI is now being touted as the solution to the challenge of sifting through mountains of applications. But “modern” hiring approaches leveraging AI and automation technology have fallen disappointingly short.

Automation and AI-based technologies promised to deliver a revolution in hiring processes, simplifying the process of identifying and evaluating candidates. Instead, companies that have become overly reliant on these tools have encountered a new swath of challenges and exacerbated existing, systemic ones. 

Bias in AI models has been clearly demonstrated. ChatGPT writes job postings that are twice as biased as those written by humans, thanks to training data that favors men and little human intervention to correct it.

[Table: bias exhibited by ChatGPT vs. humans when writing job descriptions]

AI models used by ZipRecruiter, CareerBuilder, and LinkedIn have been shown to reinforce gender pay gaps and perpetuate stereotypes.

All of this leaves us wondering: was hiring always so complicated? 

Of course not. And it doesn’t have to be. Hiring doesn’t need to be the arduous process it is with job boards. But neither should we expect it to be a fully automated process, where AI selects, interviews, and makes offers to candidates without an ounce of human intervention. 

Three Reasons AI Won’t Solve Today’s Hiring Challenges

Today’s AI tools aim to fix the problem of sifting through high numbers of resumes. Existing AI and automation technologies have been adapted to this problem in hiring, rather than being built around the foundational elements of hiring itself. Therein lies the problem.

Think about it for a moment: what’s the purpose of reviewing thousands of resumes, whether a human or an AI does it? Is it really necessary in the first place? Once you reach a critical mass of quality candidates, that’s all that matters. You only need one person to fill a position. Might it not be better to have a smaller pool of truly qualified candidates apply in the first place?

In many ways, these technologies exacerbate the “paradox of choice” problem for hiring managers. The abundance of applicants can make it difficult to find the best candidate, and lead to decision paralysis, inflated expectations, uncertainty, resource constraints, and even dissatisfaction with the final choice. Yet AI and automation in hiring have been widely adopted. And the trouble with them goes deeper than an abundance of choice.

[Graphic: AI can exacerbate bias, depersonalize hiring, and limit transparency into decision making]

AI Tools Exacerbate Existing Bias in Hiring

Numerous studies have demonstrated troubling outcomes when AI is relied on for hiring, with bias being a significant problem in AI models. As mentioned above, ChatGPT was shown to exacerbate existing bias just a handful of months after its launch.

This issue is not new: AI models have long been known to be biased, primarily based on the data they are trained on. In 2021, MIT published findings demonstrating that AI interview tools are biased against women and non-white applicants. Many other findings echo this sentiment. 

In 2018, platforms like ZipRecruiter, CareerBuilder, and LinkedIn were found to show lower-paying jobs to female applicants and higher-paying ones to male applicants, reinforcing gender pay gaps and perpetuating stereotypes about job suitability based on gender.
The same year, Reuters reported that Amazon’s AI recruitment tool exhibited gender bias. The tool, which was used to review job applicants’ resumes and “score” each candidate, was trained on resumes predominantly submitted by men, leading it to favor male candidates.

[Graphic: how Amazon's AI recruiting system scored male vs. female resumes differently]

Many more studies have reached consistent findings, with AI tools penalizing candidates based on facial expressions, voice, language patterns, gender, race, and disabilities.

Leaders Lack Transparency Into How AI Tools Make Decisions

AI models often function as a “black box,” in that the technology’s decision-making process is unclear. Many companies fail to invest the time to understand the data these models are trained on or how they assess candidates. It’s even less likely that hiring managers are trained on how the models work and what potential issues to watch for.

This lack of transparency can lead to unintended consequences and undermine the effectiveness of AI-driven recruitment. Leveraging tools that are more than likely to exhibit bias without corrective measures can easily lead to outcomes that don’t align with hiring goals or company values. Moreover, the inability to justify or explain hiring choices to candidates, team members, or stakeholders due to a lack of understanding of the AI’s decision-making process can damage trust and the company’s reputation. At the extreme end, it also raises legal and ethical concerns, as using AI in hiring decisions requires compliance with relevant laws and ethical guidelines. 

Without understanding the AI model’s approach and selection criteria, hiring managers miss the opportunity to adjust and refine it. Over time, this can lead to a far less effective hiring process, and potentially teams that lack diversity of views and opinions.

Depersonalization and Loss of Human Touch

Using AI in recruitment can reduce candidates to mere data points, stripping away the human aspect of the hiring process. As AI algorithms primarily rely on data points to analyze and rank applicants, they often overlook the human aspects that contribute to a candidate’s overall suitability for a role, such as their personality, cultural fit, and soft skills.  

There’s real value in a hiring process being human. It’s through interaction and conversation that we identify candidates who are well suited to our teams. More than a qualitative loss, depersonalization can lead hiring managers to overlook candidates who might be a great fit but didn’t meet the AI’s criteria. It can also create a negative experience for candidates.

How and When to Use AI Tools in the Hiring Process

Leveraging software tools and automation in the hiring process doesn’t have to result in a loss of human touch. There are some areas that software is well suited for. During initial application, for instance, software can be used to filter out candidates who don’t meet the required skill set by gathering structured information. Additionally, software can help rank candidates based on their years of relevant experience, ensuring that the most qualified individuals are prioritized. 
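To make the filtering-and-ranking idea concrete, here is a minimal sketch in Python. The `Candidate` fields, the required-skill check, and the experience-based ordering are all hypothetical stand-ins for whatever structured data your application form actually collects; this is an illustration of the approach, not a prescribed implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Candidate:
    # Hypothetical structured fields gathered from the application form.
    name: str
    skills: set[str] = field(default_factory=set)
    years_relevant_experience: float = 0.0

def shortlist(candidates: list[Candidate],
              required_skills: set[str],
              top_n: int = 5) -> list[Candidate]:
    """Drop applicants missing any required skill, then rank the rest
    by years of relevant experience (most experienced first)."""
    qualified = [c for c in candidates if required_skills <= c.skills]
    ranked = sorted(qualified,
                    key=lambda c: c.years_relevant_experience,
                    reverse=True)
    return ranked[:top_n]

# Example with made-up applicants.
applicants = [
    Candidate("A. Rivera", {"python", "django", "postgresql"}, 6),
    Candidate("B. Chen", {"python", "flask"}, 9),
    Candidate("C. Okafor", {"python", "django"}, 3),
]
for c in shortlist(applicants, required_skills={"python", "django"}):
    print(c.name, c.years_relevant_experience)
```

The value of a simple, inspectable rule like this is that a hiring manager can see exactly why a candidate was included or excluded, which is precisely what the “black box” models discussed above make difficult.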

Interviewing Candidates

In the interview stage, AI can generate lists of questions or even help create a technical test to further assess a candidate’s abilities. Throughout the process, AI can be an invaluable asset for handling administrative tasks, such as scheduling interviews, drafting emails, and managing candidate data. 
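As an illustration of the interview-preparation use case, the sketch below asks a chat model to draft role-specific questions. It assumes the `openai` Python client (v1.x) and a placeholder model name; any LLM API would work, and the prompt wording is only a starting point. Note that the prompt explicitly asks for neutral, inclusive language, a small example of the prompt engineering discussed below.

```python
from openai import OpenAI  # assumes openai>=1.0 is installed and OPENAI_API_KEY is set

client = OpenAI()

def draft_interview_questions(role: str, must_have_skills: list[str]) -> str:
    """Ask the model for draft questions that a human interviewer then
    reviews, edits, and personalizes before the call."""
    prompt = (
        f"Draft eight interview questions for a {role} position.\n"
        f"Focus on these skills: {', '.join(must_have_skills)}.\n"
        "Use neutral, inclusive language and avoid anything related to "
        "age, gender, family status, or other protected characteristics."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: substitute whichever model you use
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

print(draft_interview_questions("senior Django developer",
                                ["Python", "Django", "PostgreSQL"]))
```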

It’s important to remember, though, that integrating technology into your hiring process should always be done thoughtfully. Hiring leaders should be trained on the benefits as well as the limitations. Everyone involved in hiring should have a clear understanding of the risks, how to avoid them through, for example, careful prompt engineering, and what to look for in the AI technology’s output.

To use a practical example, consider assessing a shortlist of candidates generated by a software system. Humans should review the shortlist manually, paying particular attention to the nuanced human qualities that AI may overlook. When hiring managers examine candidate profiles, they gain a deeper understanding of a candidate’s personality, cultural fit, and other intangible qualities that contribute to their overall suitability for a role. This balanced approach ensures that the recruitment process considers both the technical qualifications of candidates and the human elements that play a vital role in creating a cohesive and effective workforce.
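If it helps to see that gate in code, here is a deliberately small, hypothetical sketch: the software’s recommendations carry a score, but nothing advances until a human reviewer has looked at the profile and recorded a decision. The field names and workflow are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class ShortlistEntry:
    # Hypothetical fields: a software-generated recommendation plus
    # the human reviewer's decision and notes.
    candidate_name: str
    ai_score: float
    human_approved: bool = False
    reviewer_notes: str = ""

def candidates_to_interview(entries: list[ShortlistEntry]) -> list[ShortlistEntry]:
    """Only candidates a human has explicitly approved move forward;
    an AI score on its own is never enough."""
    return [e for e in entries if e.human_approved]

shortlist = [
    ShortlistEntry("A. Rivera", ai_score=0.91),
    ShortlistEntry("B. Chen", ai_score=0.88),
]

# A hiring manager reviews each profile and records the call.
shortlist[0].human_approved = True
shortlist[0].reviewer_notes = "Strong portfolio; communicates clearly about trade-offs."

for entry in candidates_to_interview(shortlist):
    print(entry.candidate_name, "-", entry.reviewer_notes)
```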

Working With a Hiring Partner With Skin in the Game

Working with experienced people instead of solely relying on AI means a more personalized and invested approach to the hiring process. Human recruiters have a personal stake in the outcome. They want to find you the best possible candidate. If you aren’t happy, you may dismiss the candidate you hire, and you more than likely won’t work with them again. There’s a clear incentive for them to do the best possible work on your behalf, and as a result, they work diligently to find the right candidate for each position.

Using Scalable Path as an example, every party – you (the hiring company), us, and the person you eventually hire – all have skin in the game. The hiring process involves real people engaging in meaningful conversations. Our role is to first understand your requirements, expectations, and goals, which helps us create a specific and tailored job description that attracts the right candidates. 

Freelancers have skin in the game, too. Those who want to apply for your job must create a detailed profile and answer questions that relate their experience to your position. They can’t just upload their resume and hit an apply button. To apply, they have to be genuinely interested; it takes time to fill out a profile, answer questions thoughtfully, and showcase why they’re the best person for the role.

Unlike Scalable Path, though, many staffing companies have identified a profitable business opportunity in harnessing software to match businesses with candidates. It’s cheaper for them to screen applicants using a completely automated system that spits out a succinct list of options for a company. You can present candidates much faster if there’s no facetime – no video interviews, no personalization of the interview questions, no thoughtful assessment of cultural fit. But the speed and ease with which applicants can be automatically screened leads to many of these companies prioritizing volume over quality, with less regard for the success of individual placements. If they can attract a high number of clients with the promise of a speedy hire, placing an ill-suited candidate in a position isn’t as big of a concern. They can just secure another client, right? 

Final Thoughts

Hiring isn’t easy, and the options at our disposal – job boards or automation – miss the mark. AI is a powerful tool that can enhance various aspects of hiring. But it’s essential to remember that it’s just that—a tool. It shouldn’t drive hiring decisions. Humans still do a better job, especially when they have skin in the game. 

Originally published on May 30, 2023. Last updated on Jun 7, 2023.

Key Takeaways

How is AI used in recruitment?

AI tools aim to fix the problem of sifting through large quantities of resumes and CVs. These tools are being used to solve this hiring problem rather than looking at key foundational issues like how to attract high-quality candidates.

Is AI good for recruitment?

AI can improve the recruiting process by automating tasks like filtering candidates and scheduling interviews, as well as assisting with skill assessment and creating interview questions. However, it's crucial to combine AI with a human review to consider intangible qualities and maintain a balanced and effective recruitment process.

What are the issues with using AI for hiring?

Automation and AI-based technologies promise to revolutionize the hiring process by simplifying how candidates are identified and evaluated. But there are some issues that AI tools are not able to solve:
- AI Tools Exacerbate Existing Bias in Hiring
- Leaders Lack Transparency Into How AI Tools Make Decisions
- Depersonalization and Loss of Human Touch
