Using computer programs to screen job candidates’ resumes is not a new HR practice among large corporations. AI, however, is accelerating the automation of the selection process for companies of all sizes. Beyond resume screening, more companies are now using AI to evaluate candidates’ performance in initial interviews, assessing factors such as word choice and cognitive abilities.
AI speeds up the hiring process
AI can help companies speed up the hiring process because they often receive hundreds or even thousands of applications for a single vacancy. It is unrealistic to rely solely on “human eyes” to screen every resume or to conduct as many live interviews as AI can handle.
Research and industry reports have widely documented HR managers’ biases in hiring practices. Now that AI is helping HR managers make decisions, do companies still need to be mindful of potential biases during the hiring process?
Understanding AI biases in hiring
Firms providing such AI screening services typically claim their algorithms eliminate human biases because machines focus on factors other than candidates’ demographics. Still, AI is neither perfect nor free of bias, including:
- Data bias: Because AI relies on historical data to train its predictive model, it can generate biased recommendations if the training datasets contain biased information, such as a preference for specific schools or genders.
- Algorithmic bias: When AI is trained to favor specific job titles or experiences that are more common among certain demographics, other groups may be screened out even when they are equally qualified.
- Unfair assessment: This can happen to candidates who took time off to care for family members, including those on maternity leave. AI may not understand the context when it spots a gap in employment.
- Feedback bias: If any of the above biases are not corrected promptly, the AI tool will only become more biased in selecting candidates over time, as the sketch after this list illustrates.
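The following is a minimal, hypothetical Python sketch of that feedback loop. Nothing here comes from any vendor’s actual system: the school names, the naive school_score “model,” and the selection rule are invented purely for illustration, but they show how a skew in historical hires can deepen when uncorrected selections are fed back in as training data.

```python
# Hypothetical sketch: a screener "trained" on skewed historical hires feeds its
# own selections back in as new training data, and the skew compounds.
import random

random.seed(42)

# Hypothetical history of past hires, skewed toward "School A" graduates.
history = ["School A"] * 70 + ["School B"] * 30

def school_score(school):
    """Naive 'model': fraction of past hires who came from this school."""
    return history.count(school) / len(history)

for round_no in range(1, 5):
    # Each round, a perfectly balanced pool of 100 candidates applies.
    applicants = ["School A"] * 50 + ["School B"] * 50

    # Rank candidates by school score (plus a little noise) and keep the top 20.
    ranked = sorted(applicants,
                    key=lambda s: school_score(s) + random.gauss(0, 0.05),
                    reverse=True)
    selected = ranked[:20]

    # The uncorrected selections become tomorrow's training data.
    history.extend(selected)

    share_a_sel = selected.count("School A") / len(selected)
    share_a_hist = history.count("School A") / len(history)
    print(f"Round {round_no}: School A share of selections={share_a_sel:.0%}, "
          f"School A share of training data={share_a_hist:.0%}")
```

In this toy setup, a perfectly balanced applicant pool goes in each round, yet nearly every selected candidate comes from “School A,” and the training data grows more skewed with every round that passes unaudited.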
Methods to avoid AI biases
To mitigate AI biases in the hiring process, companies must first use diverse datasets to train their AI tools. They should keep the potential biases above in mind and enforce ethical guidelines to oversee the automation process. Designing AI to be inclusive from the start will help. Lastly, companies must regularly audit their AI tools and hiring processes with real human managers.
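One concrete check a human auditor could run on a screener’s outcomes is a selection-rate comparison in the spirit of the “four-fifths rule” commonly used in U.S. adverse-impact analysis. The Python sketch below uses made-up group labels and counts; it illustrates only the arithmetic, not a compliance tool or any company’s actual auditing procedure.

```python
# Sketch of an adverse-impact check: compare each group's selection rate to the
# highest group's rate and flag ratios below 0.8 (the "four-fifths" rule of thumb).

def adverse_impact_ratios(outcomes):
    """Map each group to its selection rate divided by the top group's rate."""
    rates = {group: selected / screened
             for group, (selected, screened) in outcomes.items()}
    top = max(rates.values())
    return {group: rate / top for group, rate in rates.items()}

# Hypothetical screening outcomes: (candidates advanced, candidates screened).
outcomes = {
    "Group A": (60, 200),   # 30% advance
    "Group B": (30, 180),   # ~17% advance
}

for group, ratio in adverse_impact_ratios(outcomes).items():
    flag = "review for possible adverse impact" if ratio < 0.8 else "ok"
    print(f"{group}: impact ratio {ratio:.2f} -> {flag}")
```

Here “Group B” advances at well under four-fifths of “Group A’s” rate, which is exactly the kind of pattern a regular human-led audit should surface and investigate.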
Get ready for an EEOC audit
It is unclear when or how the EEOC would begin auditing companies’ AI hiring practices. However, maintaining complete records and being transparent about the use of unbiased algorithms can help. Transparency is key to building trust and ensuring the fairness of the hiring process.
What do those AI biases mean to job candidates?
AI has had a game-changing effect on how job seekers secure offers, and it deserves a more thorough discussion. Stay tuned for my next viewpoint on this topic.
Are you more excited or nervous about seeing more companies use AI to screen candidates? What suggestions would you make to companies already using AI in hiring?
In reflecting on the article regarding the role of AI in hiring new employees, I firmly believe that AI should not be solely responsible for this critical process. While I acknowledge the advantages of utilizing AI, such as efficiency and data-driven insights, there are significant aspects of hiring that cannot be quantified. As Mr. Kwok points out, the use of artificial intelligence in recruitment comes with both positives and negatives. AI's decision-making relies heavily on historical data, algorithms, and a level of accountability that, while valuable, often overlooks the emotional intelligence integral to hiring. Hiring is not just about matching skills and qualifications; it also involves understanding a candidate's personality, potential cultural fit, and interpersonal dynamics. I wonder whether there will come a time when AI can evolve to incorporate emotional and personality factors into its assessments. - Michelle Hopkins, Oct. 21, 2024
Upon reading this article, I came in wary of how complex the use of AI would be in screening candidates, and this article solidifies that. AI can be very beneficial when it comes to selecting candidates whose skills align with those needed for open positions. However, the article notes how easily an AI service could develop its own bias from the data alone. It is interesting to consider that even algorithms and solid data can create bias. AI screening services eliminate human errors, such as the similarity or halo effect, but in exchange they lack the human perspective that, in my opinion, rounds out the hiring process. It is also important to note that AI tools must be given specific tasks to get the best results from the data they are fed. If AI is to be ingrained in the hiring process of more companies, they should keep in mind that AI needs upkeep as well, and audits of the software should be frequent. This raises concerns for me: how many companies that have begun to incorporate AI screening into their hiring procedures have gone back to ensure there is no bias? Alongside that, is it possible that the AI screening process will limit the number of early-career opportunities through its skill bias?
- Chelsie Vasquez Salazar, Nov. 11, 2024, 3:06 PM
This article raises a compelling question about the ethical implications of AI in recruitment: it streamlines hiring by processing large numbers of applications and interviews, yet it can also reproduce bias. Companies should critically analyze the algorithms and data behind these decisions, as bias remains a significant problem within hiring practices. Technological advancement must coincide with fairness and ethical responsibility, since inclusion is a vital part of a workplace environment; otherwise AI risks slowly limiting our capabilities and opportunities as it comes to determine more aspects of the workplace. The article offers insight into how organizations should analyze their algorithms and ensure that standards and procedures are applied consistently so that hiring practices remain fair.
- Marco Mena, Nov. 18, 2024, 12:20 PM
After reading the article, I am a little less excited about seeing companies use AI to screen candidates. As mentioned in the article, if an AI is not carefully constructed, its training information can lead to the AI being biased. Unfortunately, if the company is not careful, people may actually agree with this bias and reinforce it, for example if the person doing the audit were to prefer certain experiences or schools. I recommend using a third party to audit the AI, in case of tampering by certain people. It would also be good not to rush or underfund the development of the AI; doing so may leave the AI sloppy or clearly biased, which would only require more resources to polish it and get it working as intended.
I mostly agree with your input on AI taking over the hiring process in job interviews. However, as a candidate myself, I would argue that businesses should limit their use of AI. I can see how efficient the processing could be: for example, if the system were looking for a candidate with specific requirements, such as being bilingual or having a high school diploma, it could make an easy hire because those candidates meet the criteria. However, it would take much more effort to program and build an AI that could consider diversity. I also feel that there are certain interview questions, such as personal-experience questions, that artificial intelligence cannot relate to. Therefore, I would have to disagree with the use of AI.
I agree that AI can assist employers in searching for best-fit candidates, especially since applicant pools have grown larger and more competitive post-pandemic. However, there must be a balance between AI and human oversight of the application process, as the labor market now has larger gaps and labor shortages, posing challenges for unemployed individuals actively seeking work. Applicant tracking systems often hold biases against certain applicants without considering the situations that might have contributed to a "negative" on their resume, because of the trends the AI has detected in its history. As I approach graduation and begin applying for jobs that utilize ATS, I grow concerned about these biases, as they are layered challenges in today's market. Overall, I believe balancing AI and human reasoning is crucial to the application process and to providing access to work.