Bias-free selection by a computer algorithm?
19 Jan, 2023
In a series of 3 blog posts, we delve deeper into the phenomenon of bias in selection processes. Bias, and thus discrimination against applicants, is often unintentional but has negative effects. Selecting completely without bias is very difficult, but there are a number of measures you can take.
Artificial intelligence has been cited for years as a solution for making a good, efficient screening of large numbers of CVs. There is some confusion about what AI can and cannot do, since it is such a broad concept. But when it comes down to the war for talent, AI plays a very specific role: to give more accurate and more efficient predictions of a candidate's work-related behaviors and performance potential. In this way it can also help to remove bias from the first steps of the selection process. It is important to use a clear definition of AI in selection. Here, AI means finding patterns that people often don't see by analyzing large amounts of data. These patterns are converted into algorithms that are then used to make decisions, making the selection process more objective (read: with less bias) and faster.
Unlike traditional recruitment methods, AI is able to find patterns unseen by the human eye without being distracted by irrelevant background information.
The benefits of AI in selection processes
The use of tools based on AI brings many advantages in recruitment:
But beware: it may now seem that there are only benefits, while there is still something to be said about the last two points. AI and its algorithms can take the human (un)conscious assessment out of the selection process, which is positive. But you have to stay alert here, because AI models are usually a product of the data you put in (the training data set). We often see that these are data models of high performers or of other uniform groups of employees in the organisation who share certain characteristics.
Based on this, a 'model candidate' is constructed, against which job applicants are compared. This gives a probabilistic estimate of the match between the candidate and the job. In theory, this sounds very promising. But if there is bias in your data or your training data set and the algorithms are not corrected for it, AI will only exacerbate the problem of bias in selection.
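To make the mechanism concrete, here is a minimal sketch of such a 'model candidate' comparison. All features, numbers and the similarity measure are hypothetical, purely for illustration: the profile is simply the average of the current high performers, which is exactly why a homogeneous group produces a homogeneous model.

```python
# A toy sketch (hypothetical features and scores) of a "model candidate"
# comparison: applicants are scored by similarity to a profile averaged
# from existing high performers.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Feature vectors (e.g. years of experience, test scores, ...) of current
# high performers -- if this group is uniform, the model will be too.
high_performers = [[5.0, 8.0, 7.5], [6.0, 7.5, 8.0], [5.5, 8.5, 7.0]]

# The "model candidate" is simply the average profile.
model_candidate = [sum(col) / len(col) for col in zip(*high_performers)]

# Each applicant gets a match score against that profile.
applicant = [4.0, 9.0, 6.5]
score = cosine_similarity(applicant, model_candidate)
print(round(score, 3))
```

The key point is in the third line of data: whatever bias shaped the group of high performers is baked into the average, and every applicant is measured against it.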
Subtle and unintended bias from AI
If you are going to predict football performance among 15-year-olds, you will see that the oldest boys in the selection – born in January, February or March – are often the best football players. This is because boys of 15 are still growing, and the slightly older boys are physically superior and therefore seem to be better players. A beautiful book has been written about this phenomenon by Malcolm Gladwell: "Outliers: The Story of Success". Bias suddenly takes the form of a birth-month effect. It is therefore important to avoid a dataset for your algorithm that is too uniform. It is also not always transparent how an algorithm has been developed, which puts its social acceptance up for discussion. Candidates (rightly) wonder whether the criteria against which they have been assessed are correct. Who can reassure them that the algorithm does not select by age, for example?
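The birth-month effect can be reproduced with a small simulation. The data below is entirely synthetic and the numbers are assumptions for illustration: actual skill is independent of birth month, but relatively older players within the cohort get a physical edge, so their measured performance is higher and a naive model trained on this data would learn birth month as a predictor.

```python
# Synthetic simulation (illustrative only) of the birth-month effect:
# skill is month-independent, but relative age inflates measured
# performance for players born early in the year.
import random

random.seed(42)

players = []
for _ in range(1000):
    month = random.randint(1, 12)             # 1 = January ... 12 = December
    relative_age = (12 - month) / 12.0        # born earlier = older in cohort
    skill = random.gauss(0, 1)                # true ability, month-independent
    performance = skill + 2.0 * relative_age  # physical edge inflates scores
    players.append((month, performance))

# Compare average measured performance: born Jan-Mar vs Oct-Dec.
early = [p for m, p in players if m <= 3]
late = [p for m, p in players if m >= 10]
avg_early = sum(early) / len(early)
avg_late = sum(late) / len(late)
print(avg_early > avg_late)  # True: the early-born group scores higher
```

Any algorithm trained on such measured performance will reward January births over December births, even though nothing about true skill differs between the groups.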
What can you do to use AI in your selection process as effectively and objectively as possible?
Conclusion: AI is no panacea
Avoiding bias in the selection process is difficult; even if you leave it to a computer, you have to stay alert. By mapping as many unconscious biases as possible in advance and continuing to correct for them, you reduce the chance of excluding a candidate based on (un)conscious bias. It remains important to continuously refresh your knowledge of new technology and AI in order to make the best choices for your recruitment policy.