Gender. Race. Caste. You name it and people will find a way to be biased about it. Bias is part of the human psyche. It comes inbuilt, ingrained, and imprinted within the human brain. Instead of furiously denying that you are biased, start recognizing the fact that you are, indeed, biased, whether consciously or unconsciously.
Why am I ranting about bias, one may wonder. To tackle the concept of bias in recruitment in the tech space, I need to first get it out of my system. Get out the fact that I cannot abide people who put a negative spin on everything. Get it out that I am biased about coffee. Get it out that I am biased, period.
Now I can take a deep breath and aim to answer the age-old question of ‘Recruiter vs bias: Who’s winning this war?’
Saying it out loud that bias exists is the first step. Not all biases are bad. We need to identify the biases that are unrelated to the task at hand, and give them a good kick in the shin. With that out of the way, we can get on with our job.
Staying completely neutral is either a myth or the work of a robot. To some extent.
What is bias in recruitment?
Recruiters and hiring managers come across several candidates in a day during the recruitment process. It’s not hard to imagine the various ways in which their choices can be influenced when hiring an employee.
Be it the gender of the person, a fancy pedigree on a resume, or a varied ethnic culture, you are constantly making decisions based on first impressions.
In simpler terms, bias affects your ability to make decisions, either positively or negatively. When you start looking at criteria that are unrelated to the performance of an employee, you might make the wrong choice.
Why is all the talk about unconscious bias?
The year 2020 saw a wave of incidents rooted in bias: the killing of George Floyd, the Black Lives Matter protests that followed, and the charged US election results and subsequent protests. While systemic bias has been ingrained in society for decades, these incidents served as a catalyst for every sector to sit up and take note of its own unconscious biases.
Unconscious bias can be more harmful than overt bias around gender, ethnicity, or race, because most people don't realize they are harboring prejudices at all.
We would like to believe that we are ethical, unbiased, good decision makers, but sadly, that is not the case. Stereotypes and biases simmer away under the surface, and the sooner we bring them to light, the better.
The representation of minorities in the tech industry has always been low. Facebook has an African-American workforce of 3.8%. Google stated that 5.9% of its employees are Latino and 3.7% are African-American. These numbers hold steady or dip even lower across the big tech companies.
It's the easiest thing for recruiters to relate more warmly to a candidate from their alma mater than to one who is not. Psychologically, this is known as affinity bias in recruitment. If a candidate shares values and traits with their predecessor in the role, they are more likely to get it; this is an example of similarity bias. A Yale University study showed that both male and female recruiters preferred male applicants and were willing to pay a male candidate $4,000 more than an equally qualified female candidate.
How to avoid the murky waters of bias in recruitment?
When stirred, the murky waters of prejudice and stereotypes only get murkier. It is highly encouraging for me to see that many tech recruiters don’t mind getting their feet muddied. Tech hiring is undergoing a radical shift with recruiters realizing that the academic pedigree of a developer or where one works from doesn’t matter.
We at HackerEarth have put out a report on developer recruitment, which showed that recruiters are prioritizing hiring irrespective of geography. They are also moving towards blind hiring, as shown by the dramatic increase in the use of our technical interview solution, FaceCode.
FaceCode enables you to conduct structured interviews in which every candidate is assessed against a pre-determined scoring system. You can define evaluation parameters that act as a scorecard, letting interviewers assess candidate skills consistently during the interview. This helps reduce discrimination and bias during hiring.
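The structured-interview idea above can be sketched in a few lines of code. Note that the parameter names and the 1-5 scale below are illustrative assumptions for this example, not FaceCode's actual API: the point is simply that every candidate is rated on the same fixed rubric.

```python
from dataclasses import dataclass, field

# Pre-determined evaluation parameters, shared by every interview.
# (Names are invented for illustration.)
PARAMETERS = ["problem_solving", "code_quality", "communication"]

@dataclass
class Scorecard:
    candidate: str
    scores: dict = field(default_factory=dict)

    def rate(self, parameter: str, score: int) -> None:
        # Only the agreed-upon parameters may be scored, on a fixed scale,
        # so every candidate is compared on identical criteria.
        if parameter not in PARAMETERS:
            raise ValueError(f"Unknown parameter: {parameter}")
        if not 1 <= score <= 5:
            raise ValueError("Scores must be on the fixed 1-5 scale")
        self.scores[parameter] = score

    def total(self) -> int:
        # A candidate is only comparable once every parameter is rated.
        missing = [p for p in PARAMETERS if p not in self.scores]
        if missing:
            raise ValueError(f"Unrated parameters: {missing}")
        return sum(self.scores.values())

card = Scorecard("candidate_42")
card.rate("problem_solving", 4)
card.rate("code_quality", 3)
card.rate("communication", 5)
print(card.total())  # -> 12
```

Because the rubric is fixed before the interview starts, an interviewer cannot quietly add a criterion like "culture fit" mid-conversation, which is where a lot of unconscious bias sneaks in.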
Recommended read: State of Developer Recruitment 2020
We should all take a leaf out of the Boston Symphony Orchestra's book. I came across this concept while reading Claire Cain Miller's article in The New York Times Magazine. In the 1970s, at a time when orchestras were dominated by white men, the Boston Symphony Orchestra decided to test the outcome of blind auditions. Musicians, both male and female, auditioned behind screens so the judges had no idea who was playing. Once anonymity was embraced, the results were a lot fairer than before: blind auditions increased the likelihood that a woman would be hired by 25-46%.
Adopt a blind resume evaluation process where contact and personal information about the candidate is removed. By making use of our application screening software, it is possible to keep the hiring process blind for a large part.
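A blind resume evaluation can be sketched as a simple redaction step that runs before any reviewer sees the document. The field names and regular expressions below are assumptions for illustration, not the schema of any real screening product:

```python
import re

# Fields that can trigger affinity or similarity bias and should never
# reach a reviewer. (Field names are invented for this sketch.)
IDENTIFYING_FIELDS = {"name", "email", "phone", "photo_url", "school"}

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def blind(resume: dict) -> dict:
    """Return a copy of the resume with identifying fields dropped and
    contact details scrubbed from free-text sections."""
    redacted = {k: v for k, v in resume.items() if k not in IDENTIFYING_FIELDS}
    for key, value in redacted.items():
        if isinstance(value, str):
            value = EMAIL.sub("[redacted]", value)
            value = PHONE.sub("[redacted]", value)
            redacted[key] = value
    return redacted

resume = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "school": "Ivy U",
    "summary": "Backend developer. Reach me at jane@example.com.",
    "experience": "5 years building payment systems in Go.",
}
anonymous = blind(resume)
```

The reviewer sees only the skills and experience sections, with contact details scrubbed even where they leak into free text.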
Also, implement a collaborative hiring process. People with different backgrounds and perspectives interview candidates together. This helps make the recruitment process as unbiased as possible.
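The collaborative step above can be reduced to a small aggregation rule: each panelist scores the candidate independently, and the decision uses the aggregate rather than any single interviewer's impression. The scores and names below are made up for illustration.

```python
from statistics import mean

# Independent scores from a diverse panel, each on the same 1-5 scale.
# (Interviewer names and values are hypothetical.)
panel_scores = {"interviewer_a": 4, "interviewer_b": 3, "interviewer_c": 5}

# The hiring decision looks at the panel aggregate, so one interviewer's
# affinity (or aversion) cannot dominate the outcome.
decision_score = mean(panel_scores.values())
print(decision_score)  # -> 4
```

Averaging is the simplest choice; a panel could equally use a median or require every score to clear a minimum bar, but the principle is the same: no single perspective decides alone.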
The scales are tipping in favor of…
Recruiters attempting to stop bias in its tracks. The lengthy, heated discussions around hiring bias and its consequences lead me to believe that we are taking a step in the right direction. Instead of shying away from uncomfortable conversations about bias, people are trying to build a system devoid of it.
At HackerEarth, we believe skills trump everything else, leaving no room for bias. The stats in our study and the growing number of teams using our remote interviewing software, FaceCode, reinforce my faith in people. They are willing to commit to change, bringing us closer than ever to winning the war against bias.
Here's what you can do next
Check out FaceCode:
an intelligent coding interview tool