Coded Bias, Shalini Kantayya

LOS ANGELES: Facial recognition systems from large tech companies often misclassify black women as male, including the likes of Michelle Obama, Serena Williams and Sojourner Truth. That’s according to Joy Buolamwini, whose research drew wide attention in 2018 with “AI, Ain’t I a Woman?”, a spoken-word piece based on her findings at the MIT Media Lab.

The video, along with the accompanying “Gender Shades” research paper written with Timnit Gebru of Microsoft Research, prompted many tech companies to reassess how well their facial recognition data sets and algorithms perform on darker-skinned and female faces.

“Coded Bias,” a documentary directed by Shalini Kantayya that premiered at the Sundance Film Festival earlier this year, interweaves Buolamwini’s journey of creating the Algorithmic Justice League, an advocacy organization, with other examples of facial recognition software being rolled out around the world: on the streets of London, in housing projects in Brooklyn and broadly across China.

“As sci-fi writers have inspired the imagination of AI developers, Coded Bias draws from science-fiction stylistic elements to visualize concepts in this new era of big data,” says Kantayya. “Coded Bias aims to inspire communities to spark new conversations about bias in the algorithms that impact civil liberties and democracy.”

Artificial intelligence (AI) technology is becoming more prevalent in areas that can profoundly impact people’s lives, such as law enforcement and human resources. The film focuses on the work of three female mathematicians and data scientists, Joy Buolamwini, Deborah Raji and Timnit Gebru, who research the implications of racial and gender bias in cutting-edge AI and machine-learning technologies. Are the biases that exist in society being unconsciously replicated in the technology? Or worse, are opaque systems being used to shield deceptive practices?