
Artificial Intelligence In The Hiring Process: Are Women Disadvantaged?

Rena Nigam, CEO & Founder of Meytier

Is technology-introduced bias disproportionately impacting women in the job search process?

From sourcing candidates to screening resumes to interviewing and onboarding, companies around the globe leverage technology to hire. While these technologies may save organizations time, it isn't clear whether some candidates are more impacted than others. Is hiring technology negatively impacting diversity in the workplace?

"Artificial Intelligence and other technology cannot transcend human bias, it will simply recreate it in new ways unless the makers of said technology take measurable steps to change that."

Countless companies and organizations around the world are integrating artificial intelligence into their business in some way. Whether it is candidate sourcing, remote hiring, facial expression analysis, or diversity hiring, AI is alive and well within the hiring process. Many believe wholeheartedly that these systems will fill the gap left by human bias and error and make hiring easier, more efficient, and more equitable. This is not necessarily true. Though they are technology, computers, algorithms, and AI are all man-made. Like anything else, an AI system will only be as sound, as fair, and as unbiased as the people who built it. So what does this mean for candidates looking for a job?


Perhaps the most common use of technology in the hiring process is keyword matching, whereby a computer system matches key words and phrases from a resume against a job description and picks out the candidates with the most overlap. There are countless resources online ranking the most important keywords to include in your resume and explaining how to work them in. This is an obviously flawed approach: it doesn't necessarily pick out the strongest candidates, but rather the ones who have padded their resumes with enough language borrowed from the job description. Many believed that layering machine learning onto these systems could solve the issue. AI systems have been built to "grade" resumes on various parameters and rank, match, and select candidates accordingly. Some went so far as to predict that a computer-driven hiring system would eventually outperform a human one, because it would correct for implicit human biases and develop a "fairer" view of the candidate pool. Let's look at Amazon.
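Before turning to that example, here is a minimal, hypothetical sketch of what overlap-style keyword matching looks like. The tokenization and scoring rules below are assumptions for illustration only, not any vendor's actual product.

```python
# A minimal sketch of keyword matching (illustrative only; the token rules
# and example text here are invented, not any real hiring system).
import re

def tokenize(text: str) -> set:
    """Lowercase the text and split it into a set of word tokens."""
    return set(re.findall(r"[a-z]+", text.lower()))

def keyword_overlap_score(resume: str, job_description: str) -> int:
    """Count how many words from the job description also appear in the resume."""
    return len(tokenize(resume) & tokenize(job_description))

job_description = "Seeking engineer who has executed and led cloud migration projects"
candidates = {
    "A": "Led cloud migration projects; executed platform redesign",
    "B": "Helped deliver infrastructure modernization with a small team",
}

# Rank candidates purely by word overlap with the posting.
ranking = sorted(
    candidates,
    key=lambda name: keyword_overlap_score(candidates[name], job_description),
    reverse=True,
)
print(ranking)  # candidate A ranks first
```

Candidate A "wins" here purely because their wording mirrors the posting, not because they are the stronger applicant, which is exactly the weakness described above.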


Machine learning specialists at Amazon set out to build a hiring model that would filter through their candidate pool and pick out the best applicants. They did this by feeding the machine ten years of past resumes, allowing it to uncover patterns and traits among "strong" candidates. In theory this was a good idea, but in practice it only showed that because technology is a historically male-dominated field, the markers of strength and success the model identified would inevitably be traditionally male traits. According to Reuters, after sifting through resumes that came overwhelmingly from men, Amazon's system taught itself to penalize women applicants. The technology began to favor applicants who used words like "captured" and "executed" (power language traditionally found in men's resumes) and to automatically downgrade applications with female-gendered details (such as women's colleges or "women's basketball team"). Although Amazon maintains that it never relied solely on this program to hire, it stands as an example of the limitations and dangers of over-automating a process intended to hire humans.


Many studies have indicated that women applicants receive fewer responses. With AI driving selection, the results are disproportionately influenced by the fact that women are under-represented in many functions and industries. Algorithms tend to replicate historical biases, as well as those introduced by the preferences of hiring managers.


Artificial Intelligence and other technology cannot transcend human bias; they will simply recreate it in new ways unless the makers of said technology take measurable steps to change that. Men and women tend to represent themselves differently in their resumes: men usually have shorter resumes and use more "powerful" language, with words like "created" and "founded", while women use more collaborative language, like "helped" and "co-founded". Taking a history of resumes from a male-dominated field and scoring them on supposed markers of strength will always put women in a less favorable position, because their resumes will not necessarily follow the same patterns as those of their male peers.
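As a loose illustration of that mechanism, the toy sketch below trains a text classifier on a handful of invented "historical hiring decisions" that skew toward power language, then inspects which words the model learned to reward or penalize. This is an assumption-laden sketch using scikit-learn, not Amazon's system or any production hiring model, and all of the data is made up.

```python
# Toy illustration (not any real hiring model): train a text classifier on
# invented "historical hiring decisions" that skew toward power language,
# then inspect which words the model learned to reward or penalize.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

historical_resumes = [
    "executed product launch and led engineering team",       # hired
    "captured new market and founded internal platform",      # hired
    "led migration and executed cost reduction program",      # hired
    "helped coordinate volunteer outreach for women's club",  # not hired
    "co-founded study group and helped organize events",      # not hired
    "helped support women's basketball team fundraising",     # not hired
]
hired = [1, 1, 1, 0, 0, 0]  # past outcomes the model treats as ground truth

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(historical_resumes)
model = LogisticRegression().fit(X, hired)

# Words with the most negative weights are the ones the model "learned" to
# penalize -- here they are simply the words that co-occurred with rejection.
weights = sorted(zip(model.coef_[0], vectorizer.get_feature_names_out()))
print(weights[:3])   # most penalized terms
print(weights[-3:])  # most rewarded terms
```

On toy data like this, words such as "helped" and "women" that appear only in the rejected group pick up negative weights, while "executed" and "led" pick up positive ones. Nothing about the words themselves matters to the model, only their co-occurrence with past outcomes.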


On top of this, artificial intelligence research is itself deeply male-dominated. One study found that women make up as little as 12% of machine learning researchers. Artificial intelligence that evaluates, judges, and identifies candidates can never be unbiased if its creators are a homogenous group.


Women are already hired at lower rates than men (despite earning more bachelor's degrees), and the gender gap and pay gap widen as a woman progresses through her career. If women's resumes are counted out before a human ever looks at them, this problem will only grow. Companies often struggle to hire women and perpetuate the myth that not enough women are interested in or applying for technology jobs, when it is often their own hiring systems that screen out women's resumes before anyone even sees them.


Companies are made of humans, and they are trying to hire more humans. While AI can be an incredible tool for handling the massive volume of incoming applications and for filtering candidates on unique parameters, it can only be one piece of the puzzle when it comes to evolving the way companies hire.


About The Author

Rena Nigam is the Founder of Meytier. She can be reached at rena.nigam@meytier.com



