"Coded Bias" Review: When the Bots Are Racist


While working on a project involving facial recognition software, Joy Buolamwini, a researcher at the M.I.T. Media Lab, found that the algorithm couldn't recognize her face – until she put on a white mask. As she relates in the documentary "Coded Bias," Buolamwini soon discovered that most of these artificial intelligence programs are trained to identify patterns using data sets that skew fair-skinned and male.

"When you think of A.I., it is forward-looking," she says. "But A.I. is based on data, and data reflects our history."

Directed by Shalini Kantayya, "Coded Bias" examines how machine learning algorithms – now ubiquitous in advertising, hiring, financial services, policing, and many other fields – can perpetuate society's existing racial, class, and gender inequalities.

"Coded Bias" joins recent documentaries on the dangers of Big Tech ("The Great Hack," "The Social Dilemma") and addresses its broad subject by focusing sensitively on the human cost. Kantayya uses Buolamwini's journey – from her research to a congressional hearing on facial recognition technology – as a through line, connecting a range of local and international stories with an eye for emotional detail. A teacher in Houston reports that despite years of experience and awards, he received an arbitrarily poor algorithmic rating. In London, a watchdog group challenges the police's use of A.I.-driven surveillance cameras, which often misidentify pedestrians and profile them along racial lines.

The film skilfully moves between pragmatic and broader political critiques, arguing that it's not just the technology that is flawed: even if it were perfect, it would dangerously violate people's freedoms. One segment describes China's efforts to create a "social credit" program that uses facial scans to track citizens' lives and generate scores that control their access to various services.

America is not so different, warns the futurist and writer Amy Webb, one of the film's experts (refreshingly, mostly women). She says that in the United States, social media companies, other businesses, and law enforcement agencies monitor people and influence their information and opportunities in similar ways. They just aren't as open about it.

Such statements are dystopian enough that the music and graphics Kantayya overlays, inspired by "2001: A Space Odyssey," can feel cheesy. Still, they add an aptly heroic note to the movie's activists – especially Buolamwini, whose efforts have yielded tangible legislative results. For a documentary about automated technology, "Coded Bias" stays firmly focused on people: their mistakes, their vulnerabilities, and their power to do good.

Coded Bias
Not rated. Running time: 1 hour 30 minutes. In virtual cinemas.