
Candidate Discrimination: Talent Acquisition in the Age of Algorithms

I speak with individuals born in the U.S. and around the globe, many of whom come from cultures different from the mainstream, or what is typically viewed as the perceived norm, here in the U.S. Some speak multiple languages, but as often as not, English is not their native tongue. When I read their resumes, translated from their native languages into English, key words and phrases are sometimes expressed differently because of the grammar and idioms of the original language. When the translation doesn’t quite work in English, key words or phrases can be misinterpreted and judged undesirable by an ATS algorithm, drastically reducing the candidate’s odds of being invited to interview for a company’s open positions.
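To make that failure mode concrete, here is a minimal sketch in Python. The keyword list and resume excerpts are hypothetical, invented only to show how exact-match screening can penalize translated phrasing:

```python
# A minimal sketch of exact-match keyword screening. The keyword list and
# resume excerpts below are hypothetical, not from any real ATS.

REQUIRED_KEYWORDS = {"project management", "stakeholder engagement"}

def keyword_score(resume_text: str) -> float:
    """Return the fraction of required keywords found verbatim in the resume."""
    text = resume_text.lower()
    hits = sum(1 for kw in REQUIRED_KEYWORDS if kw in text)
    return hits / len(REQUIRED_KEYWORDS)

# A native speaker's phrasing happens to match the list exactly.
native = "Led project management and stakeholder engagement for a 12-person team."

# A literal translation expresses the same experience in different words.
translated = "Directed the administration of projects and the involvement of interested parties."

print(keyword_score(native))      # 1.0 -- passes the screen
print(keyword_score(translated))  # 0.0 -- same experience, scored as a non-match
```

Both sentences describe the same work, but only the phrasing that happens to mirror the keyword list survives the screen.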

Algorithms used by Applicant Tracking Systems (ATS) can unintentionally discriminate in candidate recruitment. I’m not a mathematician, but even I know that algorithms are not innocent. They can be written and manipulated in a myriad of ways to achieve a desired outcome. Written by humans, algorithms are always attached to belief systems, carrying all the shortcomings, opinions, morality, fears and insecurities of their creators. Need an example? Look no further than the unexpected results of the 2012 presidential election, which various polls, each built on different algorithms, failed to predict.

Algorithm architects, also known as data scientists, write algorithms for all sorts of tasks and activities. Here’s an example that might hit home: in 2009, Netflix awarded a one-million-dollar prize in an algorithm optimization contest to the team “BellKor’s Pragmatic Chaos,” whose members included research engineers from AT&T. Netflix sponsored the contest to improve the company’s movie recommendation algorithms. Wondering why the same movies keep surfacing in your queue? Thank, or blame, that winning algorithm for curating what Netflix recommends for your viewing pleasure.

Here’s how Applicant Tracking Systems discriminate against prospective candidates: the algorithms embedded in an ATS search for key words and written language patterns and assign each resume a desirability score, comparing the candidate’s language against a list of key words and phrases that the coders who populated the system perceived as desirable. Score low enough, and a qualified candidate is mistakenly knocked out of contention for a job.
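Here is a minimal sketch of that score-and-cutoff mechanic. The phrases, weights and cutoff value are hypothetical stand-ins for whatever a vendor’s coders choose when populating the system:

```python
# A minimal sketch of ATS score-and-cutoff screening. The weighted phrases
# and the cutoff are hypothetical values chosen for illustration.

WEIGHTED_PHRASES = {
    "agile": 2.0,
    "leadership": 1.5,
    "python": 3.0,
}
CUTOFF = 3.0  # resumes scoring below this are never read by a human

def ats_score(resume_text: str) -> float:
    """Sum the weights of every listed phrase that appears verbatim."""
    text = resume_text.lower()
    return sum(w for phrase, w in WEIGHTED_PHRASES.items() if phrase in text)

def shortlist(resumes: list[str]) -> list[str]:
    """Keep only resumes at or above the cutoff; the rest are knocked out."""
    return [r for r in resumes if ats_score(r) >= CUTOFF]
```

A resume describing the same skills in words that are not on the list never reaches the shortlist, no matter how qualified the candidate.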

Candidate discrimination may rear its ugly head if a large majority of the coders and data scientists come from the same age group, ethnicity or gender, without adequate representation from other age groups, ethnicities or genders. Substitute “key words” for “language” in the words of Archbishop Desmond Tutu, who said it best: “Language is very powerful. It does not just describe reality; language creates the reality it describes.”

How do we ensure that these diverse, qualified candidates are not discriminated against, even unintentionally? I have two suggestions. First, make it a goal of your organization to hire or train data scientists and coders from diverse age groups, ethnicities, genders and nationalities to write the code and design the algorithms your ATS relies on. Second, stop relying solely on Applicant Tracking System scores to assess prospective candidates; treat the score as one signal among several, as sketched below.
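As a sketch of that second suggestion, assuming a hypothetical 0-to-5 ATS score, a hiring pipeline could route borderline candidates to a human reviewer instead of auto-rejecting them:

```python
# A sketch of using the ATS score as one signal rather than a verdict.
# The thresholds are hypothetical; the point is that only clear non-matches
# are filtered automatically, and borderline resumes reach a human.

def route(score: float) -> str:
    """Route a candidate given an ATS score on an assumed 0-to-5 scale."""
    if score >= 4.0:
        return "advance"        # strong match goes straight to a recruiter
    if score >= 1.0:
        return "human review"   # borderline: a person decides, not the algorithm
    return "reject"
```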