A still from "Coded Bias."

On Monday, IBM made a monumental announcement: the company is getting out of the facial recognition business, citing racial justice concerns and the need for legal oversight.

Racial and gender bias is at the heart of Shalini Kantayya’s documentary “Coded Bias,” which investigates the corporate and societal implications of machine-learning systems when left unchecked. The film focuses on the work of several female mathematicians and data scientists — “outsiders,” by way of gender, race or sexuality. IBM’s decision was influenced by research presented by Joy Buolamwini and co-researchers Deborah Raji and Timnit Gebru, who are all featured in the film.

Buolamwini, a Black Ph.D. student in MIT’s Media Lab, is the documentary’s central character; her research focuses on bias in facial recognition and artificial intelligence. She founded the Algorithmic Justice League, which pushes for greater oversight of large companies using data to create AI systems with outsized influence.

“The proof of this technology being racist was done by Black women, and done by the Black women in my film. I’m really proud of them for making real change,” Kantayya says. “I’m cheering on IBM, and I’m challenging Amazon to do the same. Amazon’s made statements about standing against racism, while they continue to market and sell technology that is proven to be racist. [Late Wednesday, Amazon announced that it will ban police use of its facial recognition software for one year.] Until we know that this technology is unbiased, until we know that it’s fair, until there’s some rules in place to protect this technology from abuse, we need to press pause on facial recognition.”

Joy Buolamwini in a still from “Coded Bias.”  Courtesy

The documentary premiered at Sundance and will screen this week online as part of the Human Rights Watch Film Festival and next week with the AFI Docs Film Festival. Kantayya teases a forthcoming announcement about wider distribution. “Coded Bias” is the Brooklyn-based filmmaker’s second feature documentary; her first, “Catching the Sun,” which was executive produced by Leonardo DiCaprio, looked at the clean energy movement.

Kantayya’s documentary got a boost from its Sundance premiere, where it was well received by film and tech insiders. “I had someone who worked at Google say to me, ‘We’ve been having this conversation among ourselves, and you made a film that is a conversation that we can have with everyone.’ And I thought that was a really strong support of the film,” she says.

Director Shalini Kantayya

Shalini Kantayya  Courtesy

“Coded Bias” makes the argument that math is being used as a shield for deceptive practices in machine learning, and questions blind faith in big data. At the same time, many of the data sets used to feed machine learning systems contain biases that exist in society. The issues highlighted in the film underscore the need for transparency and checks for accuracy and bias. Because the systems are opaque, it is difficult to discern whether someone has been the victim of systemic discrimination via algorithmic bias.

“We’re living in an age where if you say something, and a computer says something, you’re obviously wrong,” Kantayya says. “It’s this blind faith in big technology and the invisible automated decision-makers that are making massive decisions about who gets opportunities in life, that is deeply troubling.”

She presents algorithmic justice as a human rights issue, and ground zero for the battle for civil rights and democracy in the 21st century. At the same time, there’s a lack of public understanding. The implications of machine learning range from what products and services are targeted to whom — aligned with standard marketing practices — to more troubling applications, such as using biometric data to create unique, individual profiles.

A still from “Coded Bias.” 

Left unchecked, systems created from data have the potential to automatically eliminate certain demographics from job searches or college admissions, determine health coverage, and make troubling criminal justice recommendations. In one example, Kantayya highlights how one AI recruiting tool used by Amazon was biased against women, penalizing graduates of women’s colleges and résumés that mentioned female-centric clubs. (The tool has reportedly since been scrapped.)

“It’s going to roll back all of the civil rights advances that we made over 50 years,” she says. “I came to see how algorithmic justice dovetails with basic freedoms we’ve been guaranteed in the Constitution: the right to assembly, the freedom to associate. And so if we want to hold on to our civil rights and our democracy, we really have to empower ourselves around these issues.”

A still from “Coded Bias.” A housing complex in Brooklyn introduced facial recognition to gain access to the building. 

The film prominently features “Weapons of Math Destruction” author and data scientist Cathy O’Neil and Big Brother Watch director Silkie Carlo, whose organization monitors how facial recognition is being tested by police on the streets of London — despite high rates of inaccuracy — to identify suspects. In one scene, Buolamwini captures the aftermath of a 14-year-old Black boy in a school uniform being wrongfully stopped by the police. The boy, one of several thousand people mistakenly stopped because of the technology, is visibly confused as a Big Brother Watch activist attempts to explain why he was stopped by five plainclothes police officers.

“If you live in New York, you know what the impact of stop and frisk has been on communities of color,” Kantayya says.

The film also presents China’s widespread use of biometric facial recognition and individual “social scores” in everyday life. The technology can be used to identify and jail protesters; in late 2019, anti-government protesters in Hong Kong donned masks and destroyed CCTV cameras. With large numbers of protesters taking to the streets Stateside this month to speak out against police brutality, conversations around how facial recognition is used — and by whom — are more urgent than ever.

A still from “Coded Bias.” A Chinese citizen’s social profile.  Courtesy

More than 117 million American faces are already included in police databases, which are being used to develop machine learning systems for use by the police, the FBI and ICE — so far, with no government oversight.

Later in the film, Buolamwini and her team present this information along with their research during a congressional hearing, in front of politicians including Rep. Alexandria Ocasio-Cortez, D-N.Y., and Rep. Jim Jordan, a leading figure in the Republican Party. Jordan expressed particular concern about the fact that American faces from police databases are being used with no government oversight.

“He was as terrified as any Democrat, which was crazy,” Kantayya says. “We need legislators who understand these systems so we can govern them. Just a few companies are having an outsized amount of power in a society — that’s not democratic.”

As to what she hopes viewers will take away from watching her film, Kantayya identifies an easy starting point: questioning the technology we use every day.

“I hope that people will start to question this blind faith we have in technological systems,” she says. “And peel away that magic and see that technology is only as good as the human in it.”

“Weapons of Math Destruction” author and data scientist Cathy O’Neil, in a still from “Coded Bias.”