How algorithms are now ruling our lives

Finding work used to be largely a question of whom you knew. For decades, that was how people got a foot in the door. Candidates then usually faced an interview, where a manager would try to get a feel for them. All too often this translated into a single basic judgment: is this person like me (or others I get along with)? The result was a lack of opportunity for job seekers, especially if they came from a different race, ethnic group, or religion. Women also found themselves excluded by this insider game.

Today, human resources managers rely on data-driven algorithms to help with hiring decisions and to navigate a vast pool of potential candidates. These software systems can be so efficient at screening CVs and ‘evaluating’ people that, in some cases, over 70 per cent of applications are weeded out before a human ever sees them. But this level of efficiency has drawbacks: man-made algorithms are fallible and may inadvertently reinforce discrimination in hiring.
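
To make this concrete, here is a minimal sketch, in Python, of the kind of keyword filter an applicant-tracking system might run before any human review. The keywords, field names, and cut-off score are invented for illustration; real systems are far more elaborate, but the effect of a hard cut-off is the same.

```python
# Illustrative sketch only: a toy CV screen of the kind an applicant-tracking
# system might run. Keywords, field names, and the cut-off are invented.

REQUIRED_KEYWORDS = {"python", "sql", "project management"}
MIN_SCORE = 2  # hypothetical threshold below which a CV is never shown to a human

def score_cv(cv: dict) -> int:
    """Count how many required keywords appear in the CV text."""
    text = cv["text"].lower()
    return sum(1 for kw in REQUIRED_KEYWORDS if kw in text)

def screen(cvs: list[dict]) -> list[dict]:
    """Return only the CVs that clear the keyword threshold, best first."""
    passed = [cv for cv in cvs if score_cv(cv) >= MIN_SCORE]
    return sorted(passed, key=score_cv, reverse=True)

applicants = [
    {"name": "A", "text": "Python and SQL developer with project management experience"},
    {"name": "B", "text": "Retrained via a coding bootcamp; strong Python portfolio"},
    {"name": "C", "text": "Warehouse team lead, excellent references"},
]
for cv in screen(applicants):
    print(cv["name"], score_cv(cv))  # only applicant A survives the screen
```

Applicants B and C never reach a recruiter, whatever else their CVs contain: anything that does not match the chosen keywords, or that the parser cannot read, simply disappears, which is one way such a large share of CVs can be weeded out with no human involved.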

Any HR manager using such a system needs to be aware of its limitations and have a plan for dealing with them, because algorithms are, in effect, our opinions embedded in code. They reflect human biases and prejudices, which lead to mistakes and misinterpretations in machine learning.

This bias shows up in numerous aspects of our lives, including algorithms used for electronic discovery, teacher evaluations, car insurance, credit score rankings, and university admissions. At their core, algorithms mimic human decision making. They are typically trained to learn from past successes, which may embed existing bias.
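
The mechanism is easy to see in miniature. The sketch below is a deliberately simplistic illustration in Python, with entirely fabricated data: it "learns" which CV features were associated with past hires and then scores new applicants with those weights. Because past managers favoured graduates of one (hypothetical) university, the learned rule reproduces that preference without ever being told to.

```python
# Toy illustration with fabricated data: a "model" that learns which CV features
# were associated with past hires, then scores new applicants with those weights.

from collections import defaultdict

# Historical outcomes. Past managers mostly hired from one university, so the
# university name becomes a proxy for the old preference.
history = [
    ({"university": "Northfield", "internship": True},  True),
    ({"university": "Northfield", "internship": False}, True),
    ({"university": "Northfield", "internship": True},  True),
    ({"university": "Southgate",  "internship": True},  False),
    ({"university": "Southgate",  "internship": True},  False),
]

def learn_feature_weights(history):
    """Weight = share of past applicants with this feature value who were hired."""
    counts = defaultdict(lambda: [0, 0])  # (feature, value) -> [hired, total]
    for features, hired in history:
        for key, value in features.items():
            counts[(key, value)][0] += int(hired)
            counts[(key, value)][1] += 1
    return {fv: hired / total for fv, (hired, total) in counts.items()}

def score(applicant, weights):
    """Average the learned weights over the applicant's feature values."""
    values = [weights.get((key, value), 0.5) for key, value in applicant.items()]
    return sum(values) / len(values)

weights = learn_feature_weights(history)
print(score({"university": "Northfield", "internship": True}, weights))  # 0.75
print(score({"university": "Southgate",  "internship": True}, weights))  # 0.25
# Identical internship experience, different scores: the past preference is now the rule.
```

No one wrote "prefer Northfield" anywhere; the preference arrived with the training data, which is exactly how learning from past successes can embed existing bias.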

However, these systems were created with the best of intentions. Having computer programs sift through thousands of CVs or loan applications in a second or two and put the most promising candidates at the top not only saved time; it was also marketed as a fair and objective way of doing things. After all, it didn’t involve prejudiced humans digging through reams of paper, just machines processing cold numbers.

As this trend quickly became the norm, mathematics was asserting itself as never before in human affairs, and the public largely welcomed it. The goal was to replace subjective judgments with objective measurements in any number of fields – whether locating the worst-performing teachers in a school or estimating the chances of a person staying in their job for more than a year.

But although their popularity rests on the notion that they are objective, the algorithms that power the data economy are based on choices made by fallible human beings. And while most of those choices are made with good intentions, the algorithms encode human prejudice, misunderstanding and bias into automated systems that increasingly manage our lives. What is more, these mathematical models are opaque, their workings invisible to all but the high priests of their domain: mathematicians and computer scientists. Their verdicts, even when wrong or harmful, are beyond dispute or appeal.

As a result, there are enormous opportunities for manipulation in big data, and there is something to be said for remaining a little skeptical and vigilant about the process. Biases and prejudices will most likely continue to play a role in recruitment, whether the methods used are algorithmic or human. But as big data, machine-learning algorithms, and people analytics take on a larger and more influential role in recruiting, it is reasonable to ask to what degree we can really rely on the technology.

It may be possible to predict which personal attributes are required for success in a role, but can we really write an algorithm that determines someone’s potential to succeed in a future we don’t yet understand? Should we instead continue to rely on the human approach, riddled with bias and a poor record of decision-making? Perhaps the answer sits somewhere in the middle.

Written by Alex Kostin at Bet-bonuscodes.co.uk
