A Computer is Reading Your Resume; Get it to Notice You
By Rose Walsh

Meytier's mission is to leverage AI and analytics to expand opportunities for women across industries. Our company was born from the realisation that current models of hiring disproportionately filter out female candidates and perpetuate the notion that there simply isn't a large enough pipeline of female candidates. As we've built and scaled Meytier, we've researched current hiring models extensively and found that hiring in 2020 is more complicated and more technical than ever before. Nearly every company uses some kind of technology to filter candidates and help them sift through resumes. This new wave of hiring technology has been proclaimed as a possible solution to end human bias in the hiring process and allow firms to hire fairly without expending substantial resources on human resources departments. However, research has consistently shown that these hiring software solutions can not only replicate human bias, but often amplify it. This bias has disproportionately affected women for two main reasons: they are not well represented in the data used to build this software, and they are underrepresented on the teams and at the companies producing it.

What does it mean for AI to amplify bias? In 2017, researchers at the University of Virginia studied an image recognition system after noticing potential bias in its training data. The training photos showed people in kitchens, and women appeared far more often than men: in the data, women were 33% more likely to be pictured in kitchens. When the trained system was asked to label photos, it consistently mislabelled men in kitchens as women and more than doubled the bias: the photos were 68% more likely to be labelled as women, even when they showed men cooking. Hiring is a different space, but this shows exactly what is at stake when you feed unbalanced or unrepresentative data into an AI system. The risk is not just recreating the same human bias, but making it much worse.

Most companies use some kind of keyword matching software to analyse candidates and determine a match. These are not necessarily the most sophisticated systems: they simply match the words of the job description to the words of a resume. Our research (and others') has shown that women tend to leave key skills and information out of their resumes, and often write more qualitative resumes rather than quantitative ones, which can be harder for the software to analyse. For example, keyword matching software often won't match "product manager" with "managed product development". These screening tools also often can't read resumes that aren't in a standard format (usually Word or PDF), leaving many resumes unread or read incorrectly.
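To see why exact wording matters, here is a minimal, hypothetical sketch of naive keyword matching in Python. The scoring function, phrase list, and sample resumes are all invented for illustration; real screening products differ in the details, but the exact-phrase failure mode is the same.

```python
# Hypothetical sketch: score a resume by how many job-description phrases
# appear in it verbatim. This is an illustration, not any vendor's product.

def keyword_match_score(job_keywords: list[str], resume_text: str) -> float:
    """Return the fraction of job-description phrases found verbatim in the resume."""
    resume = resume_text.lower()
    hits = [phrase for phrase in job_keywords if phrase.lower() in resume]
    return len(hits) / len(job_keywords)

job_keywords = ["product manager", "python expert", "agile"]

resume_a = "Product Manager and Python expert with agile delivery experience."
resume_b = "Managed product development; extensive Python experience; led agile teams."

print(keyword_match_score(job_keywords, resume_a))  # 1.0  - every phrase matches
print(keyword_match_score(job_keywords, resume_b))  # 0.33 - only "agile" matches,
# even though the second candidate describes the same experience in other words.
```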

A slightly more sophisticated version of keyword matching is also very common: machine learning is integrated into the resume analysis and resumes are "graded" on certain parameters. Often, these systems are trained to analyse language strength, key experience, and more. In 2018, it came to light that Amazon had built one of these programs, hoping that by training it on a database of Amazon's strongest employees' resumes they could create a perfect hiring machine: they would submit resumes, the AI would suggest the top five, and they would hire them. However, because the model was trained on hundreds of mostly men's resumes, the system taught itself to discriminate against women. Remember the photos of women in kitchens. Resumes that mentioned traditionally women's colleges were downgraded; "captain of basketball team" would positively affect one resume's score, while "captain of women's basketball team" would negatively impact another's, because the system had largely been trained on and exposed to men's resumes. While candidates don't necessarily write their gender on a resume, there are proxies for it, like education, names, and language.
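As a rough illustration of how this happens, here is a small, entirely hypothetical sketch in Python using scikit-learn. The tiny synthetic dataset and labels below are invented for this example; this is not Amazon's system or data, just a demonstration of how a model trained on skewed historical outcomes can turn a proxy term like "women's" into a negative signal.

```python
# Hypothetical sketch: a resume "grader" trained on biased historical data
# learns to penalise proxy terms. The data below is synthetic and invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Synthetic "historical" resumes with hire (1) / no-hire (0) labels. Because
# past hires skew male, phrases associated with women appear mostly in the
# negative class.
resumes = [
    "captain of basketball team, led engineering projects",            # hired
    "led software team, executed product launches",                    # hired
    "innovated data pipelines, captain of chess club",                 # hired
    "captain of women's basketball team, led engineering projects",    # not hired
    "women's college graduate, managed software projects",             # not hired
    "helped analyse data, part of a women's coding group",             # not hired
]
labels = [1, 1, 1, 0, 0, 0]

# Bag-of-words features plus a simple linear classifier.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, labels)

# Inspect the learned weight for the token "women": the model has turned a
# proxy for gender into a negative signal, even though gender was never an
# explicit feature.
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
print("learned weight for 'women':", round(weights["women"], 3))  # negative
```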

These systems also bring to light some of the key differences in the ways men and women represent themselves professionally. Women face huge bias when they come across as too confident, ambitious, or overstated, so their resumes tend to be much more subtle. Women often use more collective language, highlighting the teams that surrounded them in a role and speaking about projects they "helped" with or times they "learned". Male applicants say "executed", "led", and "innovated" where women applicants say "helped", "was part of a team", and "learned". If you train an AI system to find common strengths among applicants in a field that has traditionally been overwhelmingly male dominated, it will inevitably identify traditionally male traits.

Most other systems are not much better. Video interviewing systems (which analyse candidates' facial movements, language patterns, and more) are often trained on homogeneous data sets. Research has shown that facial recognition software struggles to properly identify the gender of non-white subjects. In addition, research as recent as this year has shown that there is still a significant data gap in voice recognition software, leading it to misunderstand women at a higher rate than men. Which raises the question: if a piece of software can't accurately understand a candidate's language, how can it properly judge their strength?

The good news is that when it comes to mitigating some of the effects of technology bias in the hiring process, there is plenty you can do. Start with a few basics. Save your resume in a simple format (PDF or Word document) and keep the layout fairly simple; overstuffed resumes can be difficult for computers to read. Keep the section titles of your resume clear and traditional. For example, title your education section just "Education" rather than something more creative but less obvious. If you're ever in a situation where you know your resume will be read first by a human, feel free to switch it back. Use the terminology that the job description uses. If the job description says it's looking for a "Python Expert", don't write that you have "extensive Python experience"; write that you're a "Python expert". And if you've only taken a few coding classes but the job description says "coding experience a big plus", just write in "coding experience". At an interview, or later on in the job search process, you can explain that you've taken a few classes and are still learning; you don't need to explain it immediately in your resume.
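To illustrate why plain section titles help, here is a hypothetical sketch of how a simple parser might split a resume into sections by matching a fixed list of standard headings. The heading list and sample resume are invented for this example; the point is that content under an unrecognised heading can simply vanish from the parsed result.

```python
# Hypothetical sketch: group resume lines under the most recent heading the
# parser recognises. Anything under an unrecognised heading is dropped.
STANDARD_HEADINGS = {"education", "experience", "skills", "certifications"}

def split_into_sections(resume_text: str) -> dict:
    """Group resume lines under the most recent recognised heading."""
    sections = {}
    current = None
    for line in resume_text.splitlines():
        stripped = line.strip()
        if stripped.lower().rstrip(":") in STANDARD_HEADINGS:
            current = stripped.lower().rstrip(":")
            sections[current] = []
        elif current and stripped:
            sections[current].append(stripped)
    return sections

resume = """Where I Studied
MSc Computer Science, 2015

Experience
Product Manager, Acme Corp
"""

print(split_into_sections(resume))
# {'experience': ['Product Manager, Acme Corp']}
# Everything under the creative heading "Where I Studied" is lost, because
# the parser never recognised it as a section.
```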

We consistently find that women leave key parts of their experience off their resumes. They are often uncomfortable claiming skills unless they have mastered them, they leave out extra education experience (courses and seminars), and they tend to highlight their team's work above their own. It can be difficult to claim a skill before you've mastered it, but remember that you can learn skills, and that male applicants who took those same two coding classes you took after work are putting them on their resumes. Look through your resume and think about how you're framing your role and experience. Are you saying that you "worked with your team" on a project? Could you instead say that you led it? Are you saying that you were "part of a team that analysed security systems in financial transactions"? Could you instead say "Analyst for security systems in financial transactions"? It's implied that you worked with others on a project; a hiring manager reading your resume isn't assuming you did each thing in your career alone. Remember that your resume is there to highlight your experience and expertise, so don't get too caught up in perfectly representing the reality of your role.

The reality is that most people still find jobs through some kind of personal connection. Last year we interviewed over 400 professionals, and most of them found their last job that way: 25.8% via a referral from family, a friend, or a coworker; 15% through networking; and 13% through a recruitment or search firm. Ultimately, the best use of your time while looking for a job is reaching out to the people you know. Having a personal connection of some sort will always benefit you in the job search, because it increases the chances that a human reads your resume first. Hiring technology is imperfect yet unfortunately widely used. These days, your resume probably goes through several rounds of elimination before it reaches a person (if it ever does). Keep your resume detailed, simple, and focused, and remember who's looking at it first. And if you're a leader in charge of hiring at a company, look into the processes you use; you may be missing out on good candidates.

Are you looking for a job right now? Feeling stuck? Make a Meytier account and reach out to us. We'd love to help you find your next job.
