Facial Recognition Technology falsely identifies women more than men; find out why.

MsBirgith

Artificial Intelligence (AI) has yet to minimize human biases and often perpetuates existing racial and sexist biases on a much larger scale. This article discusses how much more likely women are to be falsely identified by current facial recognition algorithms (FRAs), why this keeps happening, and potential solutions.

Understanding Sexism in FR Tech and Why It Is a Problem

WatchGuard Technologies conducted research to analyze gender bias in Facial Recognition Technology (FRT) and found that women were misidentified 18% more often than men. The team studied two FR software packages: Amazon Rekognition and Dlib.

[Image: Facts about Women in Color (original template)]

The research found that Amazon’s software recognized white men with 99.06% accuracy and white women with 92.9% accuracy, while Women of Colour (WOC) were recognized with only 68.9% accuracy.

[Image: Women of color (photo: Clarke Sanders)]

What does this mean? According to WatchGuard Technologies, it “essentially means a female face not found in the database is more likely to provide a false match. Also, because of the lower similarity in female faces, our team was confident that we’d see more errors in identifying female faces over males if given enough images with faces”. So the more data collected, the more errors would occur in accurately identifying women.
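To make the quote concrete, here is a minimal sketch of how threshold-based face matching produces false positives. The embeddings, names, and threshold below are illustrative assumptions, not WatchGuard's actual setup; the mechanism, though, is the same: if faces from one group sit closer together in embedding space, more unenrolled probes from that group will clear the threshold against someone else's identity.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_match(probe, database, threshold=0.8):
    """Return the identity of the best database match that clears the
    similarity threshold, or None if nothing does. A probe face that is
    NOT enrolled but resembles someone who is produces a false match."""
    best_id, best_sim = None, threshold
    for identity, embedding in database.items():
        sim = cosine_similarity(probe, embedding)
        if sim > best_sim:
            best_id, best_sim = identity, sim
    return best_id

# Toy usage: "eve" is not enrolled, but her (hypothetical) embedding
# sits close to "carol", so the matcher wrongly returns carol's identity.
db = {"alice": np.array([1.0, 0.0]), "carol": np.array([0.6, 0.8])}
probe_eve = np.array([0.55, 0.83])
print(find_match(probe_eve, db))  # -> "carol" (a false match)
```

Under this mechanism, a fixed threshold that works well for one demographic can be far too loose for another, which is exactly the failure mode the WatchGuard team describes for female faces.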

While this may not be a significant problem when matching faces to tag people in Facebook or Instagram photos, it is more damaging when government agencies and law enforcement use this software.

Not only are there disparities and inaccuracies in gender identification, but also in racial identification. An article by FR company Facedapter titled "Racial Bias in FR Technology" details the real-life effects of this in modern-day society.

Further emphasizing the existence of sexism in FRT, research from the University of Washington “revealed that a facial recognition tool was 68% more likely to predict that an image of a person cooking in the kitchen is that of a woman”.


Some might say that these results do not necessarily mean FRT is sexist but that it simply reflects society, and that is correct: AI did not invent sexism or racism. But it collects and learns from data that, if left uncorrected, will perpetuate this already dangerous discrimination, especially as the technology is rolled out to the public.

FRAs today rely on what is called machine learning (ML). ML is “a method of data analysis that automates analytical model building. It is a branch of artificial intelligence based on the idea that systems can learn from data, identify patterns and make decisions with minimal human intervention”.
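As a toy illustration of that definition, the following sketch "learns" to tell two identities apart purely from labeled examples, with no hand-written rules. The scikit-learn classifier and the made-up two-number "embeddings" are assumptions for illustration, not any FR vendor's actual pipeline.

```python
from sklearn.neighbors import KNeighborsClassifier

# Made-up 2-D "face embeddings". The model is never given a rule;
# it infers the pattern from the labeled examples alone.
X_train = [[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]]
y_train = ["person_a", "person_a", "person_b", "person_b"]

model = KNeighborsClassifier(n_neighbors=1).fit(X_train, y_train)
print(model.predict([[0.15, 0.15], [0.85, 0.85]]))
# -> ['person_a' 'person_b']
```

The catch, as the next paragraphs explain, is that the model can only learn the patterns present in its training data.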

This process requires large amounts of training data and, unfortunately, any biases reflected in that data can end up baked into the AI and perpetuated by it. Those biases arise when the training data does not accurately represent the demographics the FR scanner is programmed to identify.

For example, deploying in Nigeria an FRA trained on data collected in Poland is bound to run into more than a few pitfalls: the ML model will not be able to identify the local population accurately.
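One way to catch this kind of mismatch before deployment is a per-demographic accuracy audit, sketched below. The evaluation log, column names, and pandas-based approach are illustrative assumptions, not any vendor's real pipeline; the point is that a single overall accuracy number hides exactly the gaps the WatchGuard study found.

```python
import pandas as pd

# Hypothetical evaluation log: one row per probe image, recording the
# demographic group, the true identity, and the model's prediction.
results = pd.DataFrame({
    "group":     ["white_men", "white_men", "white_women", "woc", "woc"],
    "true_id":   ["a", "b", "c", "d", "e"],
    "predicted": ["a", "b", "c", "x", "e"],
})

# Accuracy broken out per group: a large gap between groups signals
# that the training data under-represents one of them.
per_group = (results.assign(correct=results["true_id"] == results["predicted"])
                    .groupby("group")["correct"].mean())
print(per_group)
```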

Bearing that in mind, the public and FR service providers must understand that until the day that AI works to eliminate human biases, its results cannot be considered any more trustworthy than a human’s. So what is the solution?

The Solution

Contrary to popular belief, there are several solutions, the first being diversifying the data. Data scientists at the MIT Media Lab have found that where the data used to train AI systems has been diversified, the resulting models are more accurate and less discriminatory. That makes diversification an important starting point for reversing and preventing the perpetuation of biases rooted in misogyny and racism.
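In practice, "diversifying the data" can be as simple as rebalancing the training set so every demographic group is equally represented before the model sees it. The sketch below shows one common approach (oversampling under-represented groups); it is an illustrative assumption, not the MIT Media Lab's specific method.

```python
import pandas as pd

def rebalance(df: pd.DataFrame, group_col: str = "group",
              seed: int = 0) -> pd.DataFrame:
    """Oversample each demographic group up to the size of the largest
    one, so the training set is demographically balanced."""
    target = df[group_col].value_counts().max()
    parts = [grp.sample(n=target, replace=True, random_state=seed)
             for _, grp in df.groupby(group_col)]
    return pd.concat(parts, ignore_index=True)
```

Oversampling is the cheapest fix; collecting genuinely new images from under-represented groups is better still, since duplicated samples add no new variety for the model to learn from.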

The second solution is transparency. Companies that place profit over people are quite common, and these companies are less likely to be transparent with the public where flaws in their software are concerned. Transparency would help FR service providers address those flaws and would allow the public to hold them accountable.

Startup Facedapter is an example of an FR company looking to build digital trust one face at a time. “Our goal is to be the simplest, fastest, and most cost-effective multimodal facial recognition software on the market; Being that is what enables us to, also, be the most demographically sensitive software on the market.”


A company that understands the “-isms” that heavily influence society (sexism, racism, etc.) and is looking to eliminate these biases is a company that sees the value in integrity and equal treatment.

If Facedapter succeeds in becoming the most demographically sensitive software on the market, it will be an FR service that eliminates the inaccuracies behind gender misidentification and racial discrimination.

That, in turn, would make it the best option for public spaces and for government and law enforcement agencies.
