Bias-Free Selection by a Computer Algorithm?
19 Jan, 2023
For years now, artificial intelligence has been hailed as a great solution for efficient pre-screening of large numbers of resumes and candidates. Because AI is such a broad concept, there is some confusion about what it can and cannot do. But when it comes to the war for talent, AI plays a very specific role: giving more accurate and more efficient predictions of a candidate’s work-related behaviors and performance potential. In this way, it can also help remove bias from the first steps of the selection process. It is therefore important to use a clear definition of AI in selection: finding patterns that people often don’t see by analyzing large amounts of data. These patterns are converted into algorithms that are then used to make decisions, making the selection process more objective (read: less biased) and faster.
Unlike traditional recruitment methods, AI can find patterns invisible to the human eye without being distracted by irrelevant background information. Tools based on AI bring many advantages in recruitment.
But watch out: it may now seem that there are only benefits, while some of these advantages deserve a caveat. AI and algorithms can take human (un)conscious judgment out of the selection process, which is positive. But you have to stay alert here: AI models are usually a product of the data you put in (the training data set). In practice, these are often data on high performers, or on other uniform groups of employees in the organization who share certain characteristics.
Based on this, a ‘model candidate’ is constructed against which job applicants are compared, yielding a probabilistic estimate of the match between the candidate and the job. In theory, this sounds very promising. But if there is bias in your training data and the algorithm is not corrected for it, AI will only exacerbate the problem of bias in selection.
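The mechanism above can be sketched in a few lines. This is a minimal, hypothetical illustration (the feature names and numbers are made up, not any vendor's actual method): a "model candidate" is built by averaging the profiles of current high performers, and applicants are scored by their similarity to that profile. The sketch makes the risk concrete: any trait the training group happens to share gets baked into the profile.

```python
import math

# Hypothetical feature vectors for current high performers
# (years_experience, test_score, tenure) -- illustrative numbers only.
high_performers = [
    [8.0, 85.0, 5.0],
    [9.0, 90.0, 6.0],
    [7.0, 88.0, 4.0],
]

# The "model candidate" is simply the average high-performer profile.
model_candidate = [sum(col) / len(col) for col in zip(*high_performers)]

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def match_score(applicant):
    # Estimate of fit: similarity to the model candidate. Any trait the
    # training group happens to share (age band, background, tenure) is
    # baked into this profile and silently penalizes atypical applicants.
    return cosine_similarity(applicant, model_candidate)

print(match_score([8.0, 87.0, 5.0]))  # close to the profile: scores high
print(match_score([1.0, 95.0, 0.5]))  # strong test score, atypical profile
```

Note that the second applicant has the best test score of anyone, yet scores lower simply for not resembling the incumbent group. That is exactly the bias-amplification risk the text describes.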
If you try to predict football performance among 15-year-olds, you will find that the oldest boys in the selection (born in January, February, or March) are often the best players. This is because 15-year-old boys are still growing, and the slightly older boys are physically superior and therefore appear to be better players. Malcolm Gladwell wrote a beautiful book about this phenomenon: “Outliers: The Story of Success”. Bias suddenly takes the form of a birth-month effect. It is therefore important to prevent the dataset used for your algorithm from being too uniform. Moreover, it is not always transparent how an algorithm has been developed, which raises questions of social acceptance. Candidates (rightly) wonder whether the criteria against which they are measured are correct. Who can reassure them that the algorithm does not select by age, for example?
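One common way to check for exactly this kind of effect is a selection-rate comparison across groups, such as the "four-fifths rule" used in US adverse-impact analysis. The sketch below uses invented screening outcomes, grouped by birth quarter, that mimic the relative-age effect described above; the numbers are illustrative, not real data.

```python
# Hypothetical screening outcomes: (birth_quarter, selected) pairs for a
# youth cohort -- made-up counts that mimic the relative-age effect.
outcomes = (
    [("Q1", True)] * 30 + [("Q1", False)] * 10 +
    [("Q2", True)] * 22 + [("Q2", False)] * 18 +
    [("Q3", True)] * 15 + [("Q3", False)] * 25 +
    [("Q4", True)] * 10 + [("Q4", False)] * 30
)

def selection_rates(outcomes):
    # Fraction of candidates selected, per group.
    counts = {}
    for group, selected in outcomes:
        total, hits = counts.get(group, (0, 0))
        counts[group] = (total + 1, hits + int(selected))
    return {g: hits / total for g, (total, hits) in counts.items()}

def passes_four_fifths(rates):
    # Four-fifths rule: the lowest group selection rate should be at least
    # 80% of the highest; otherwise the process shows adverse impact.
    return min(rates.values()) >= 0.8 * max(rates.values())

rates = selection_rates(outcomes)
print(rates)                       # Q1 is favored: 0.75 vs Q4's 0.25
print(passes_four_fifths(rates))   # False: birth quarter drives selection
```

The same check works for any group attribute (age band, gender, school), which is one concrete answer to the candidate's question: publish such audits instead of asking for blind trust.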
What can you do to use AI in your selection process as fairly and objectively as possible?
Avoiding bias in the selection process is difficult; even if you leave it to a computer, you have to stay alert. By mapping as many unconscious biases as possible in advance and continuing to correct for them, you reduce the chance of excluding a candidate based on (un)conscious bias. It also remains important to continuously refresh your knowledge of new technology and AI in order to make the best choices for your recruitment policy.
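"Continuing to correct" in practice means periodically auditing the model's output against attributes it should not be using. One simple sketch of such an audit, with invented numbers: compute the correlation between candidates' ages and the scores the algorithm assigns them. A strong correlation flags age (or a proxy for it) as a de facto selection criterion that needs correcting.

```python
import math
import statistics

# Hypothetical audit sample: each applicant's age and the algorithm's score.
# In this made-up data, older applicants systematically score lower.
ages   = [22, 25, 28, 31, 34, 38, 42, 47, 51, 55]
scores = [0.91, 0.88, 0.84, 0.80, 0.74, 0.69, 0.61, 0.55, 0.48, 0.40]

def pearson(xs, ys):
    # Pearson correlation between a candidate attribute and model output.
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(ages, scores)
# A correlation near -1 means age (or a proxy for it) is effectively
# driving the scores and the model should be corrected.
print(round(r, 2))
```

Running such a check on every retraining cycle, rather than once at launch, is the kind of ongoing alertness the paragraph above argues for.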