Friday Night Film Series Presents: Coded Bias

Sarah Speegle, Maroon-News Staff

It is no surprise that the world is becoming increasingly reliant on technology. From smartphones to cars, technology is an integral part of our everyday lives. While it is widely assumed that technology is beneficial and essential to the modern world, filmmaker Shalini Kantayya challenges this assumption in her recent documentary, Coded Bias. By examining the ways in which facial recognition technologies fail to identify darker-skinned faces and the faces of women, Kantayya exposes the biases encoded in artificial intelligence and argues that these algorithmic systems often do more harm than good.

On Friday, Nov. 13, Colgate’s Friday Night Film Series hosted a live Q&A with Kantayya for students, faculty and community members to learn about the documentary and engage with the filmmaker face-to-face. Facilitated by Assistant Professor of Film and Media Studies Neta Alexander, the Q&A allowed Kantayya to share her reflections on the film and highlight a few key points. Kantayya began by explaining how Coded Bias focuses on the communities most hurt by algorithmic systems. Through the process of filmmaking, she discovered that marginalized groups are disproportionately affected by artificial intelligence bias, and that correcting these technological failings is essential for future innovations in the field. She believes the best thing to do right now is to press pause on the use of facial recognition and artificial intelligence technology until more laws and regulations are put in place.

Kantayya also raised the issue of data rights, arguing that people need to start viewing them as basic human rights. Even though many people have heard of artificial intelligence and algorithmic systems, Kantayya encourages individuals to ask more in-depth questions about how these technologies actually work, given the impact they have on the entire population. By learning more about these systems, people can become aware of the problems associated with them and can initiate change. Through discussing these issues, Kantayya made the audience reflect on the documentary and their understanding of artificial intelligence as a whole.

“I was not previously aware that these types of technologies existed, but learning about them was very alarming and kind of freaky. It was especially interesting to hear how the filmmaker didn’t intentionally use women of color in her documentary, but they ended up being the main protagonists of the film just due to the fact that these technologies have racial and gender biases. It is kind of scary to think that the technologies that are supposed to progress us forward are actually holding us back through these racial and gender biases. While it is refreshing to see that some cities have started to press pause on using artificial intelligence, it is hard to know what the future will look like,” junior Ashley Thomas said.

As the event came to a close, Kantayya emphasized that she hopes Coded Bias becomes a conversation piece, and that people will use the documentary as inspiration to have open dialogues about artificial intelligence and how these algorithmic systems impact our civil liberties.