Is Bias Coded Into AI? A New Documentary Screams Yes And Wants Legislation To Change It.


Joy Buolamwini was a researcher at the MIT Media Lab when she noticed that the AI-driven facial recognition software she was using on a project could not detect her face. Buolamwini is African-American, and she experimented with wearing a white mask to see whether that helped the software identify her facial features. “When I put on the white mask, detected. When I take off the white mask, not so much,” Buolamwini says in the documentary ‘Coded Bias’ that was released in the U.S. this week. 

Billed as an exploration of the fallout from that startling discovery, Coded Bias chronicles Buolamwini’s journey in seeking U.S. legislation to guard against bias in the algorithms that have become so ubiquitous in our lives and will only become more so in the future. That journey led Buolamwini to examine the inputs fed into these systems, which have resulted in AI failing to recognize darker-skinned faces and those of women. “I started looking at the datasets themselves and what I discovered is many of these datasets contain majority men, and majority lighter-skinned individuals. So the systems weren’t as familiar with faces like mine.” 

The author of Artificial Unintelligence, Meredith Broussard, agrees. Broussard has also looked into the roots of AI to understand who the first researchers were, and how their demographics influenced the early days of its development. “The people who were at the Dartmouth math department in 1956 got to decide what AI was,” Broussard states in Coded Bias. She concludes that “everyone has unconscious bias and people embed their own biases into technology.” 

Buolamwini’s investigation into algorithms reached similar conclusions. “If you have largely skewed datasets that are being used to train these systems you can also have skewed results,” she states. “When you think of AI it is forward-looking, but AI is a reflection of history. So the past dwells within our algorithms. This data is showing us the inequalities that have been here.” 
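The mechanism Buolamwini describes can be illustrated with a toy simulation. The sketch below is purely hypothetical (it is not the method used in her research or in the documentary): a simple nearest-prototype “detector” is trained on a synthetic dataset in which one group makes up 95% of the examples, and its detection rate is then measured separately for each group.

```python
# Hypothetical illustration only: a toy "detector" trained on skewed
# synthetic data, showing how under-representation in the training set
# can translate into a lower detection rate for the minority group.
import random

random.seed(0)

def make_samples(center, n):
    """Generate n noisy 2-D feature vectors around a group's center."""
    return [(center[0] + random.gauss(0, 1.0),
             center[1] + random.gauss(0, 1.0)) for _ in range(n)]

# Two entirely synthetic groups whose faces occupy different feature regions.
GROUP_A_CENTER = (0.0, 0.0)   # heavily represented in training data
GROUP_B_CENTER = (4.0, 4.0)   # under-represented in training data

# Skewed training set: 95% group A, 5% group B -- all genuine "faces".
train = make_samples(GROUP_A_CENTER, 950) + make_samples(GROUP_B_CENTER, 50)

# "Training" here is just averaging: the prototype lands near group A.
prototype = (sum(x for x, _ in train) / len(train),
             sum(y for _, y in train) / len(train))

def detects(sample, threshold=3.0):
    """Declare a face detected if the sample is close to the prototype."""
    dx, dy = sample[0] - prototype[0], sample[1] - prototype[1]
    return (dx * dx + dy * dy) ** 0.5 < threshold

test_a = make_samples(GROUP_A_CENTER, 200)
test_b = make_samples(GROUP_B_CENTER, 200)
rate_a = sum(detects(s) for s in test_a) / len(test_a)
rate_b = sum(detects(s) for s in test_b) / len(test_b)
print(f"detection rate, group A: {rate_a:.0%}")
print(f"detection rate, group B: {rate_b:.0%}")
```

Because the learned prototype sits almost on top of the majority group, group A is detected reliably while group B, despite being equally real, is largely missed. The skew in the output is entirely inherited from the skew in the input.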

Coded Bias extrapolates from this premise to ask big questions about the impact AI has on our lives. “Modern society sits at the intersection of two crucial questions: What does it mean when artificial intelligence (AI) increasingly governs our liberties? And what are the consequences for the people AI is biased against?” the documentary synopsis reads. “As it turns out, artificial intelligence is not neutral, and women are leading the charge to ensure our civil rights are protected.” 

Cathy O’Neil, a former Wall Street quant, is another woman dissecting how algorithms can be biased. In her book ‘Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy,’ O’Neil posits that mathematical models are pervading modern life and ripping apart our social fabric. “We have all of these algorithms in the world that can be increasingly influential and they are all being touted as objective truth. I started realizing that mathematics is being used as a shield for corrupt practices,” O’Neil says in Coded Bias. “Before we had the algorithm, we had humans. And we all know that humans can be unfair. We all know that humans can issue racist or sexist or ableist discriminations.” O’Neil says she is very worried about the ‘big blindfold’ that exists in big data, and advises that we should be monitoring every process for bias. 

O’Neil, Broussard, and Buolamwini are banding together with other experts to form a ‘justice league’ of sorts, with a mission to propagate more humane and ethical AI and shape the future in a more egalitarian fashion. It is a big job, and this documentary plays an important part in raising public awareness of the conscious and unconscious bias being built into our technological infrastructure. As a Hollywood Reporter review of the documentary astutely summarizes, “this fast-moving, dynamically assembled film — which makes sharp use of classic sci-fi visual concepts and screen graphics to echo its themes — is here to remind us that the struggle between humans and machines over decision-making is only going to get messier.” 

Artificial Intelligence is being implemented in industries all over the world and is a central theme of the research undertaken at UCIPT. Our work in the HOPE study is using data to assess and shift behavioral outcomes among HIV and other populations.
