Algorithms of oppression

Coded Bias, directed by Shalini Kantayya, reviewed by Joh Foster.

Can you imagine the surprise of Joy Buolamwini, a PhD candidate at the Massachusetts Institute of Technology (MIT) Media Lab, upon discovering that facial recognition software failed to detect her darker-skinned face? Coded Bias is the story of what Joy does next – how her discovery was far from an isolated incident, and how she ultimately forces change in US law.

This documentary, from director and activist Shalini Kantayya, asks the important questions of ‘What does it mean when artificial intelligence (AI) increasingly governs our liberties?’ and ‘What are the consequences for the people AI is biased against?’ It investigates the widespread bias inherent in algorithms, showing that these decision-making tools are far from neutral, yet impact society at large.

What we discover from watching this piece is that many of our ideas about intelligence come from a relatively small, homogenous group of people – think Zuckerberg, Bezos, Jobs and Gates – and this is exemplified by the very birth of the AI field in 1956 at Dartmouth College. The mathematics department, consisting solely of white men, essentially defined the meaning of AI and the datasets that would be used. The result is that bias is embedded in the very technology that governs our lives: deciding who gets a mortgage, who is accepted for credit, even who gets a job.

While the film starts in the US, the viewer is introduced to Silkie Carlo, director of the UK-based civil liberties group Big Brother Watch. One of their campaigns protests against the silent rollout of facial recognition surveillance cameras, arguing that faces are as sensitive a piece of biometric data as fingerprints. The software is also inaccurate, with some 98 per cent of matches incorrectly identifying innocent people as being 'wanted'. In one example, we see a young schoolboy stopped by the Metropolitan Police because his face, captured by the software, is flagged as a match for a suspected criminal. The lack of oversight and regulation is alarming, to say the least.

We are also taken to China, where this kind of AI-driven tracking is overt, and access to services is approved or denied based on individuals' behavioural data. A social credit score is earned in an authoritarian yet effective use of what is essentially algorithmic obedience training, and it even drives social interactions in the population. One citizen claims she can find suitable friends more quickly because she uses their score as a factor in her social interactions.

The problem with AI is that the programming itself propagates bias, and there needs to be wholesale acknowledgement that our blind faith in big data is misplaced and potentially destructive. Humans have essentially created machines in their own image, encoding outdated values of racism, sexism and ableism. These prejudices become mechanised, automated, uncontrolled. A frightening example in the film is Tay, a learning chatbot on the social media platform Twitter. Tay learned from the Twitter ecosystem – i.e. humans – to be misogynistic, xenophobic and antisemitic, resulting in the account being closed within 16 hours of being set up.

In the same way that The Social Dilemma explores the damage social media continues to do to society, Coded Bias is an urgent call to arms to ensure that AI is fair and works for all of society. After all, intelligence without ethics is not intelligence at all. In the closing moments, Zeynep Tufekci, author of Twitter and Tear Gas, says: 'Being fully efficient, always doing what you're told, always doing what you're programmed, is not always the most human thing. Sometimes it's disobeying, sometimes it's saying no, I'm not gonna do this, right? And if you automate everything so it always does what it's supposed to do, sometimes that can lead to very inhuman things.' I can't help but think she's probably right.

Reviewed by Joh Foster, Organisational Psychologist and Co-founder of Centre for Psychology at Work
Twitter: @TheWiseFoster

Coded Bias is available on Netflix.

www.codedbias.com
