Identifying and reducing bias in the technical recruiting process

Setting The Landscape
It’s been a tough couple of years when it comes to bias and diversity issues in the recruiting world. The diversity problem in Silicon Valley, tech’s American hub, has been well-documented for years. (And while we often frame it around gender, it’s a racial issue as well.)

One of the reasons diversity hiring suffers in tech companies is the prevalence of biases that can skew the technical recruiting process. It’s also one of the reasons hiring suffers more broadly, with new hires typically succeeding only 40-60 percent of the time.

The path through these biases toward more effective hiring processes is complicated, because biases are often deeply ingrained in us psychologically and operate below our awareness. It takes a great degree of self-awareness to see these blind spots in our thinking, and that isn’t common.

When we take our blinders off and look at creating a more level playing field for employment, it’s important to understand how privilege impacts hiring. You may have seen videos making this point on social media.

Types of Biases
There are dozens of different biases defined by psychology and social science over the years, but here are some of the more prevalent in technical hiring:

Halo Effect: The assumption that if a candidate is good at “A” (say, a specific type of coding), they will also be good at “B” (another type) and “C” (project management). This happens when a recruiter likes a candidate in one area enough to assume competence and drive in all areas, without doing the background research to prove it.

Similarity attraction effect: Seeking out those similar to you, whether in terms of hobbies, sports played, educational background, hometown, style of dress, etc.

Confirmation bias: This happens when recruiters and hiring managers seek out information that confirms their existing beliefs, and ignore or discredit information that doesn’t. You see this play out often in interview processes, because an ill-defined question set can end up aimed at underscoring what the hiring manager already feels, rather than being standardized across candidates. You end up comparing apples to oranges instead of apples to apples.

Intuition/Trusted Gut: Using “gut feel” over legitimate candidate research and/or data points on those who have been successful in the role with your organization previously.

Conformity Bias: You’ve probably seen the classic experiment: participants are shown an anchor line and three comparison lines. One is clearly the same length as the anchor, and two are very different. When asked which one matches the anchor, the paid participants (who are in on the experiment) all pick one of the obviously wrong lines. In most cases, the unwitting participant goes along with them, probably assuming they’re missing something. This is the power of groupthink, and it comes into play often during hiring decisions. If everyone is adamant about “Candidate A,” recruiters with legitimate arguments for “Candidate B” may simply fall silent and side with “Candidate A.”

Beauty Bias: Similar to the similarity attraction effect, but refers to preferences for what could be described as “conventional beauty,” such as optimal height/weight combinations for men and women. Consider: only one U.S. President in history (Taft) was truly overweight, so this bias plays largely into how we think about leadership.

What happens when biases harm recruiting?
Biases limit your talent pool to those who clear all the pre-existing filters. That’s likely to produce a more limited team in terms of experience and scope, and diverse teams bring any number of benefits, including better communication, better customer relationships, and, well, better financial returns. Consider some numbers from McKinsey on the impact of less-biased, more open and diverse hiring practices:

  • Companies in the top quartile for racial and ethnic diversity are 35 percent more likely to have financial returns above their respective national industry medians.
  • Companies in the top quartile for gender diversity are 15 percent more likely to have financial returns above their respective national industry medians.
  • Companies in the bottom quartile both for gender and for ethnicity and race are statistically less likely to achieve above-average financial returns than the average companies in the data set (that is, bottom-quartile companies are lagging rather than merely not leading).

Biases harm hiring. They harm the diversity of teams, both gender-wise and race-wise. Poor, limited choices can slow tech projects and decimate budgets.
But if biases are psychologically ingrained in us and we’re often unconscious to them, what can be done to reduce bias in tech hiring?

One Approach to Reducing Bias 
Essentially, you’ll need a “blind” approach to submitting candidates to the hiring manager. Setting the stage to largely eliminate bias starts early in the recruiting road map. Here’s how it works:

  • Assign every candidate an ID number rather than circulating their name and resume (names are another huge source of bias; consider how a “James” reads versus a “Trayvon,” for example)
  • Discuss the importance of certain factors with the hiring manager — education, experience in certain roles, skill levels with certain tools, etc.
  • Assign a numerical value to each factor, e.g. 10 points for a local candidate, 7 points for willingness to move, etc.
  • Score the candidates based on pre-established eligibility factors 
  • Present the top-scoring candidates to the hiring manager 
  • Only after the hiring manager makes next-round decisions do they receive the actual resume, name, background information, etc.
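The steps above can be sketched in code. This is a minimal illustration, not a production system: the factor names, weights, and candidate records are hypothetical examples standing in for whatever the recruiter and hiring manager agree on.

```python
# A minimal sketch of the blind scoring workflow described above.
# All factors, weights, and candidate data are hypothetical examples.

candidates = [
    {"id": "C-001", "local": True, "willing_to_move": False,
     "years_experience": 6, "knows_required_tools": True},
    {"id": "C-002", "local": False, "willing_to_move": True,
     "years_experience": 9, "knows_required_tools": True},
    {"id": "C-003", "local": False, "willing_to_move": False,
     "years_experience": 3, "knows_required_tools": False},
]

def score(candidate):
    """Score one candidate against pre-agreed, weighted factors."""
    points = 0
    if candidate["local"]:
        points += 10          # e.g. 10 points for a local candidate
    elif candidate["willing_to_move"]:
        points += 7           # e.g. 7 points for willingness to relocate
    points += min(candidate["years_experience"], 10)  # cap experience credit
    if candidate["knows_required_tools"]:
        points += 5
    return points

# Rank by score and present only anonymous IDs to the hiring manager.
ranked = sorted(candidates, key=score, reverse=True)
top_two = [(c["id"], score(c)) for c in ranked[:2]]
print(top_two)  # IDs and scores only; no names or resumes yet
```

The key design point is that the hiring manager only ever sees the ID-and-score pairs at this stage; names and resumes are looked up from the ID mapping only after next-round decisions are made.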

If you execute this technical recruiting process consistently, you should begin reducing the biases inherent in hiring and get the best people in front of your hiring managers.

You can also use work sample tests, i.e. tasks relevant to what the candidate would actually do in the full-time job. Work sample tests can also be made blind, with performance scores given to the hiring manager without name or background context. This is part of how we work at eTeki in assessing technical skill level for hiring managers; we’re concerned with both quality performance and reduction in bias. We want to give you the best people. Shouldn’t that be the whole goal?

Now, an important caveat: some managers want their team to be very similar to them in terms of race, gender, background, etc. This preference is called “homophily,” and you need to push back on it. Homophily is not a positive for organizations. While it can make a team feel “like family” because of the similarities, it’s deadly to new viewpoints, the ability to pivot when market conditions shift, and many other crucial business factors. Almost every major piece of research on team diversity in the last 15 years shows that more diverse teams perform better financially. If the bottom line is your goal (and it is in most for-profit companies), you absolutely need diversity. If a specific hiring manager is trying to stack their team with similar folks, that requires a discussion.

What other ways have you seen or used to reduce bias in the technical recruiting process?

Amanda Cole

Amanda Cole, Vice President at eTeki
She has more than 15 years’ experience developing innovative programs staffed by non-traditional workforces, including freelancers, paid and unpaid interns, boards of directors, and skill-based volunteers; the largest of these generated an $18 million annual impact from contingent labor.