In every reported case where police mistakenly arrested someone using facial recognition, that person has been Black
Facial recognition software has always had trouble telling Black people apart, yet police departments are still using it.
to be fair that happened a lot before AI existed
This isn’t new. It’s been a known problem for a long time, because facial recognition software is trained mostly on white people. So it gets really good at differentiating between white faces. But with Black people making up a tiny fraction of the sample data, it basically just learns to differentiate them with broad strokes. It’s good at telling them apart from white people, but not much else.
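For illustration, here’s a minimal sketch (entirely synthetic numbers, nothing from any real system) of how that imbalance shows up when you audit a matcher: you measure the false match rate per demographic group rather than one aggregate number, which is roughly what audits like NIST’s FRVT do with labelled image pairs.

```python
# Sketch: per-group false-match-rate check for a face matcher.
# All embeddings here are synthetic and only illustrate the failure mode.
import numpy as np

rng = np.random.default_rng(0)

def false_match_rate(embeddings, threshold):
    """Fraction of different-person pairs whose cosine similarity
    still clears the match threshold (i.e. wrongly flagged as a match)."""
    hits, total = 0, 0
    for i in range(len(embeddings)):
        for j in range(i + 1, len(embeddings)):
            a, b = embeddings[i], embeddings[j]
            sim = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
            hits += sim >= threshold
            total += 1
    return hits / total

# Pretend the model spreads white faces widely in embedding space (well
# separated) but squeezes Black faces into one small region (under-trained).
white = rng.normal(0.0, 1.0, size=(200, 128))
black = rng.normal(0.0, 1.0, size=(200, 128)) * 0.2 + 1.0

for name, group in [("white", white), ("Black", black)]:
    print(name, false_match_rate(group, threshold=0.8))
```

With embeddings crammed together like that, different people look alike to the matcher, so the false match rate for the under-represented group shoots up while the aggregate number can still look fine.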
It’s not just a training issue. Lighter tones reflect; dark tones absorb. There have been lots of cases of cameras, or even just sensors, struggling with dark skin tones because of the lower reflectivity and contrast. 3D scanners, even current models, have similar trouble with objects that have black parts, for similar reasons. Training on more diverse data can help, but there’s still an underlying technical difficulty to overcome as well (which is also a good reason that using facial recognition in this manner is just bullshit, period).
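A rough back-of-the-envelope sketch of that contrast problem, assuming an 8-bit sensor and made-up reflectance numbers: the same relative feature contrast lands on far fewer digital levels when the baseline reflectance is low, so noise and quantization eat proportionally more of it.

```python
# Sketch: how low reflectance shrinks the usable signal on an 8-bit sensor.
# Numbers are illustrative, not measurements from any real camera.
import numpy as np

def recorded_contrast(base_reflectance, feature_contrast=0.10,
                      noise_levels=2.0, full_scale=255):
    """Digital levels separating a facial feature from surrounding skin,
    and how that separation compares to the sensor's noise floor."""
    skin = base_reflectance * full_scale
    feature = skin * (1.0 - feature_contrast)     # feature slightly darker than skin
    step = np.round(skin) - np.round(feature)     # digital levels of separation
    return step, step / noise_levels

for reflectance in (0.60, 0.30, 0.10):            # lighter -> darker skin tone
    levels, snr = recorded_contrast(reflectance)
    print(f"reflectance {reflectance:.2f}: {levels:.0f} levels, {snr:.1f}x noise")
```

Same 10% feature contrast, but at low reflectance it’s only a couple of levels above the noise, which is the part more training data can’t fully fix.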
As a technological problem it could have a partial technological solution: the darker the skin, the higher the threshold to declare a match. That would also mean more false negatives (real matches the software doesn’t catch), but there’s not much to be done about that.
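A minimal sketch of that idea, assuming you already have a similarity score and some skin-tone estimate for the probe image (both hypothetical upstream inputs; the threshold values are made up too):

```python
# Sketch: tone-dependent match threshold, as proposed above.
# skin_tone_index and similarity are assumed inputs from upstream models;
# the specific threshold values are illustrative only.

def match_threshold(skin_tone_index: float) -> float:
    """Map a 0 (lightest) .. 1 (darkest) tone estimate to a stricter
    threshold for darker tones, where the matcher is less reliable."""
    base, extra = 0.80, 0.15
    return base + extra * skin_tone_index

def is_match(similarity: float, skin_tone_index: float) -> bool:
    return similarity >= match_threshold(skin_tone_index)

# Same similarity score, different decisions:
print(is_match(0.88, skin_tone_index=0.1))   # True  -> flagged as a match
print(is_match(0.88, skin_tone_index=0.9))   # False -> held back (possible false negative)
```

The trade-off is exactly the one named above: the stricter threshold suppresses false matches for darker-skinned people at the cost of letting more real matches through uncaught.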
I’m interested in what dataset they’re using, because simply adding more Black people to the training set seems like a pretty straightforward fix.
It seems like past mugshots would be an ideal part of the training set. Are they not using those?
Personally, I’m leaning towards “It’s not the image recognition program that’s the problem here.”
Isn’t face recognition just going to be inherently less reliable on darker-skinned people? Their features would certainly have less contrast on darker skin, no?
Doesn’t the fact that a technology is fundamentally discriminatory mean we should question the use of that technology? Not just shrug our shoulders and say too bad?
It shouldn’t be used at all, but the tech isn’t intentionally malicious; the bias is just a fact of how the tech works.
Exposure is a dial, not a technology. People are choosing to use it this way.
yes
Why not expose pictures longer to get better features for darker-skinned people and less accurate ones for lighter-skinned people, leading to more false arrests of lighter-skinned people?
And remove the convenient excuse to harass black people? These are cops we’re talking about.
Who could possibly ever have foreseen that.
Fuck the police!
If Black Then arrest
If white, politely ask
So it’s acting like real cops and lying about black people “fitting the description.”