How Coded Bias Makes a Powerful Case for Algorithmic Justice


What do you do when your computer can’t recognize your face? In a previous blog, we explored the potential applications for emotional AI. At the heart of that technology is the ability to recognize faces, and facial recognition is gaining widespread attention for its hidden dangers. This Coded Bias short review summarizes the story of female researchers who opened the black box of major applications that use facial recognition (FR). What they found is a warning to all of us, making Coded Bias a bold call for algorithmic justice.


Official Trailer

Coded Bias Short Review: Exposing the Inaccuracies of Facial Recognition

The secret is out: FR algorithms are far better at recognizing white male faces than those of any other group. The difference is not trivial. Joy Buolamwini, the MIT researcher at the center of the film, found that dark-skinned women were misclassified up to 35% of the time, compared to less than 1% for white male faces! Error rates of this magnitude can have life-altering consequences when the technology is used in policing, judicial decisions, or surveillance applications.

Screen Capture

It all started when Joy was looking for facial recognition software to recognize her face for an art project. She had to put on a white mask in order to be detected by the camera. This initial experience led her down a new path of research: if she was experiencing this problem, who else was, and how was it impacting others who looked like her? Eventually, she stumbled upon Cathy O’Neil’s Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, discovering a world of algorithmic activism already underway.

The documentary weaves in multiple cases where FR misclassification is having a devastating impact on people’s lives. Unfortunately, the burden falls mostly on the poor and people of color. From an apartment complex in Brooklyn to the streets of London to a school district in Houston, local activists are mobilizing political energy to expose the downsides of FR. In doing so, Netflix’s Coded Bias not only shows the problem but also sheds light on the growing movement that arose to correct it. In that, we can find hope.

If this wasn’t clear before, here it is: watch the documentary Coded Bias multiple times. This one is worth your time.

The Call for Algorithmic Justice

The fight for equality in the 21st century will be centered on algorithmic justice. What does that mean? Algorithms are fast becoming embedded in growing areas of decision-making. From movie recommendations to hiring, cute apps to judicial decisions, self-driving cars to who gets to rent a house, algorithms are influencing and dictating decisions.

Yet, algorithms are only as good as the data used to train them. If that data reflects existing inequities or is biased toward ruling majorities, the resulting models will inevitably have a disproportionate impact on minorities. Hence, the fight for algorithmic justice starts with regulating and monitoring their results. The current lack of transparency in the process is no longer acceptable. While some corporations may not have intended to discriminate, their neglect of oversight makes them culpable.
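Monitoring results is the concrete first step, and it can be surprisingly simple. The sketch below, in plain Python, shows what a basic algorithmic audit looks like: compare a model's error rate across demographic groups. The group names and numbers here are illustrative only, loosely echoing the disparity the film reports; they are not the film's actual dataset.

```python
# Minimal sketch of an algorithmic audit: compute a model's
# misclassification rate separately for each demographic group.
from collections import defaultdict

def error_rates_by_group(records):
    """records: iterable of (group, true_label, predicted_label) tuples.
    Returns a dict mapping each group to its fraction of errors."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for group, truth, prediction in records:
        totals[group] += 1
        if truth != prediction:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Toy data: 100 predictions per group. "group_a" is misclassified
# once; "group_b" is misclassified 35 times (hypothetical numbers).
records = (
    [("group_a", 1, 1)] * 99 + [("group_a", 1, 0)] * 1 +
    [("group_b", 1, 1)] * 65 + [("group_b", 1, 0)] * 35
)

rates = error_rates_by_group(records)
print(rates)  # {'group_a': 0.01, 'group_b': 0.35}
```

An audit like this requires access to the model's predictions and ground-truth labels per group, which is exactly why the lack of transparency matters: without those outputs, outsiders cannot measure the disparity at all.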

Because of its ubiquitous impact, the struggle for algorithmic justice is not just the domain of data scientists and lawmakers. Instead, this is a fight that belongs to all of us. In the next blog, I’ll be going over recent efforts to regulate facial recognition. This marks the next step in Coded Bias’s call for algorithmic justice.

Stay tuned.
