Facial recognition software used by police in France, Australia, and the US relies on algorithms produced by the French company Idemia.
This software scans faces by the millions, which is why Congress was told in 2017 that it would ‘safeguard the American people.’ Unfortunately, there’s a problem with it.
It Scours 30 Million Mug Shots but Doesn’t Always See Race Clearly
In July, testing revealed that this supposed savior of US police departments was far more likely to mix up the faces of Black women than White women. When the algorithm was put to the test at new sensitivity settings, it falsely matched White women at a rate of 1 in 10,000 but falsely matched Black women at a rate of 1 in 1,000.
For those of you keeping track, that means the algorithm produced false matches for Black women ten times more frequently!
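To make those rates concrete, here is a short sketch. The two rates come from the test described above; everything else (treating each search as a comparison against the full 30-million-record mug shot database) is a simplifying assumption for illustration:

```python
# Illustrative arithmetic only. A false match rate (FMR) is the fraction of
# comparisons between two *different* people that the algorithm wrongly
# declares to be a match.

fmr_white_women = 1 / 10_000  # 1 false match per 10,000 impostor comparisons
fmr_black_women = 1 / 1_000   # 1 false match per 1,000 impostor comparisons

ratio = fmr_black_women / fmr_white_women
print(f"Black women falsely matched {ratio:.0f}x more often")  # 10x

# Assumption for scale: one probe face compared against every record in a
# database the size of the 30 million mug shots mentioned above.
database_size = 30_000_000
print(f"Expected false matches per search (White women): {fmr_white_women * database_size:,.0f}")
print(f"Expected false matches per search (Black women): {fmr_black_women * database_size:,.0f}")
```

At that scale, even a "small" rate difference turns into tens of thousands of extra wrong matches per search for one group.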
So Is The Algorithm Racist?
Well, tests conducted on this facial recognition software have found that the algorithm has trouble recognizing people with darker skin. What’s more, the algorithm also worked less accurately on women than on men – a gap currently being attributed to the daily use of makeup.
“White males … is the demographic that usually gives the lowest FMR,” or false match rate, the report states. “Black females … is the demographic that usually gives the highest FMR.”
These Concerns Are Apparently Far From New
Apparently, these concerns were first raised in 2012, in a research paper by the FBI’s leading facial recognition expert. His paper indicated that facial recognition systems are almost always less accurate for Black men and women.
Similarly, researchers showed in 2018 that IBM’s and Microsoft’s facial recognition services struggled to classify gender across skin tones. This was rarely an issue for men with pale skin, but it was a massive problem for Black women. Amazon even went as far as calling such facial recognition results ‘misleading’ in a very aggressive blog post.
Experts working to iron out the flaws in this cutting-edge technology believe that different algorithm sensitivities will have to be used for different demographic groups. Doing so would reduce the risk of discrimination and of lawsuits over racial profiling. Without a fix, however, facial recognition output shouldn’t be treated as solid evidence in any scenario.
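One way to picture the fix the experts describe: a face matcher returns a similarity score, and a match is declared only when the score clears a threshold. Calibrating a separate threshold per group can equalize false match rates. The sketch below is hypothetical – the group names and threshold values are invented for illustration, not taken from Idemia’s system:

```python
# Hypothetical sketch of per-group sensitivity thresholds. The idea: set a
# stricter match threshold for a group whose baseline false match rate is
# higher, so every group ends up with roughly the same FMR.

# Invented calibration values -- a real system would derive these from
# measured score distributions, not hard-code them.
THRESHOLDS = {
    "group_a": 0.80,  # lower baseline FMR: a looser threshold suffices
    "group_b": 0.90,  # higher baseline FMR: a stricter threshold is needed
}
DEFAULT_THRESHOLD = 0.85  # fallback when no calibration exists for a group

def is_match(similarity_score: float, group: str) -> bool:
    """Declare a match only if the score clears the group's calibrated threshold."""
    return similarity_score >= THRESHOLDS.get(group, DEFAULT_THRESHOLD)

# The same raw score can yield different decisions under calibrated thresholds:
print(is_match(0.85, "group_a"))  # True  (0.85 >= 0.80)
print(is_match(0.85, "group_b"))  # False (0.85 <  0.90)
```

The design trade-off is that a stricter threshold also raises the chance of missing a true match for that group, so calibration has to balance both error types rather than just suppress false matches.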