Even The Best Facial Recognition Algorithms Mismatch Black Women’s Faces More Than White Men’s

The conversation about facial recognition has become a staple in legal, academic, and tech spaces, and one topic that is always raised is the technology’s ability, or lack thereof, to identify a person accurately, let alone a dark-skinned individual.

A recent study commissioned by the US government has revealed that even the most advanced facial recognition systems have markedly lower accuracy rates when identifying black men and women than when identifying white men and women.

Research by the National Institute of Standards and Technology (NIST) indicated that two of Idemia’s latest algorithms were significantly more likely to mix up black women’s faces than those of white women, or of black or white men.

Idemia is a French facial recognition company whose technology is used by law enforcement agencies in the US, Australia, and France. The company provides the facial recognition system the US uses to check faces against Customs and Border Protection records.

US officials have praised the system. In 2017, a top FBI official said that Idemia’s system, which scans more than 30 million mugshots a year, helps “safeguard the American people.”

However, the NIST test exposed a flaw with substantial implications. When asked to verify that two photos showed the same person, the software falsely matched different white women’s faces at a rate of about one in 10,000. When used on black women’s faces, it produced false matches at a rate of about one in 1,000, ten times as frequently.
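
To put those error rates in context, a false match rate, or FMR, is the share of impostor comparisons, meaning pairs of photos showing two different people, that an algorithm wrongly accepts as the same person. The short Python sketch below, which uses purely illustrative scores and thresholds rather than NIST’s or Idemia’s actual data, shows how an FMR is measured and what a tenfold gap implies at the scale of the mugshot scanning cited above.

# A minimal, illustrative sketch (not NIST's or Idemia's code) of how a
# false match rate is measured and what a tenfold gap implies at scale.

def false_match_rate(impostor_scores, threshold):
    # Each impostor comparison pairs photos of two different people;
    # any similarity score at or above the threshold is a false match.
    false_matches = sum(1 for score in impostor_scores if score >= threshold)
    return false_matches / len(impostor_scores)

# Toy demonstration: 10 impostor comparisons against a 0.8 threshold.
scores = [0.12, 0.35, 0.81, 0.44, 0.07, 0.91, 0.28, 0.50, 0.66, 0.30]
print(false_match_rate(scores, threshold=0.8))  # 0.2, i.e. 2 of 10 falsely matched

# The rates reported above, applied at the scale of 30 million scans a year:
fmr_white_women = 1 / 10_000   # one false match per 10,000 comparisons
fmr_black_women = 1 / 1_000    # one per 1,000, ten times as frequent
scans_per_year = 30_000_000
print(scans_per_year * fmr_white_women)  # about 3,000 expected false matches
print(scans_per_year * fmr_black_women)  # about 30,000 expected false matches

A rate that looks negligible in isolation therefore compounds into tens of thousands of potential misidentifications once a system operates at law enforcement scale.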

But Idemia defends itself, saying that the algorithms NIST tested are not yet commercially available and are still in the final phases of development. Donnie Scott, who leads the US public security division at Idemia, previously known as Morpho, says the company checks for demographic differences during its product development phases. He claims the most likely explanation for the NIST results is that engineers pushed the technology to achieve the best overall accuracy rates.

“There are physical differences in people, and the algorithms are going to improve on different people at different rates,” he says.

This is not the first study to reveal racial bias in facial recognition technologies. In March of this year, researchers from the Georgia Institute of Technology found that the state-of-the-art image recognition systems used by smart cars yield lower accuracy rates when detecting pedestrians with dark skin tones.

The results showed that the accuracy of the image recognition systems tested was five percent lower for pedestrians with darker skin tones than for those with lighter skin. The result held even when controlling for time of day and obstructed views.

Further back, in January, a study by researchers from MIT and the University of Toronto found that Amazon’s Rekognition software, the company’s bid for the facial recognition market, mistook dark-skinned women for men. The results showed that Amazon’s facial analysis misclassified 31% of black women as men, compared with 7% of white women, while misidentifying none of the male subjects.

Nonetheless, it is NIST that has given governments the scientific justification for using facial recognition systems, especially in law enforcement. Last year, NIST said the best algorithms got 25 times better at matching people against a large database between 2010 and 2018, failing to find a match just 0.2 percent of the time.

But as has been established, NIST’s tests, along with other independent and academic studies, have repeatedly demonstrated the inaccuracy of facial recognition systems when used on people with dark skin. The agency’s July test showed the same tenfold accuracy gap among the top-performing systems from the 50 companies tested. A similar trend appears when results for men and women are compared.

Many facial recognition algorithms are more likely to mix up black faces than white faces. Each chart represents a different algorithm tested by the National Institute of Standards and Technology. Those with a solid red line uppermost incorrectly match black women’s faces more than other groups. Photo: NIST

“White males … is the demographic that usually gives the lowest FMR,” or false match rate, the report states. “Black females … is the demographic that usually gives the highest FMR.”

Furthermore, the results highlight the risk of law enforcement agencies relying on facial recognition technology that has previously proven to be highly inaccurate. Many privacy advocates have called for regulation of the technology, citing its implications for people’s privacy and security, and some cities and governments have responded by banning facial recognition outright, as San Francisco and other US cities have done.

About the Author

Al Restar
A consumer tech and cybersecurity journalist who does content marketing while daydreaming about having unlimited coffee for life and getting a pet llama. I also own a cybersecurity blog called Zero Day.
