
sanspoint_ t1_j18xq6e wrote

Yup. People tend not to believe it, but a lot of facial recognition algorithms are trained on biased data sets, mostly the faces of the young white men who do most of the development of those algorithms. Because of this, they'll often match people of color with the wrong face in the database. People of color have been hauled in by the police under suspicion of having committed a crime because a facial recognition algorithm decided they look too much like a wanted criminal who also happens to be a PoC. It's fucked up.


bradbikes t1_j19apxj wrote

Yep. You have to be extremely careful in curating the training sets for AI algorithms, and companies often aren't, because careful curation costs more. And there's very little transparency about it.

For example, one company basically just fed labeled pictures scraped from the Internet into a predictive AI meant to help find criminals. The problem being: guess which group of people is massively overrepresented in online pictures labeled 'criminal'.
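A toy sketch of how that plays out (hypothetical data, everything here is made up for illustration): if one group is overrepresented among 'criminal'-labeled images, even a trivial model that just learns group/label correlations ends up flagging that entire group.

```python
from collections import Counter

# Hypothetical scraped training set: (group, label) pairs.
# Group "A" is heavily overrepresented among "criminal"-labeled images,
# even though nothing about individuals in group A justifies it.
scraped = (
    [("A", "criminal")] * 80 + [("A", "not")] * 20 +
    [("B", "criminal")] * 20 + [("B", "not")] * 80
)

def naive_predictor(train):
    """Predict the majority label seen for each group -- a toy stand-in
    for any model that picks up group/label correlations from its data."""
    counts = {}
    for group, label in train:
        counts.setdefault(group, Counter())[label] += 1
    return {g: c.most_common(1)[0][0] for g, c in counts.items()}

model = naive_predictor(scraped)
print(model)  # {'A': 'criminal', 'B': 'not'}
```

The model never looks at any individual, only at the skew in the labels, and it still "predicts" that everyone in group A is a criminal. Real systems are subtler, but the same correlation leaks in the same way.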


sanspoint_ t1_j19s03a wrote

> guess which group of people is massively overrepresented with online pictures labeled 'criminal'.

I'm gonna take a wild guess and say... black men?


bradbikes t1_j1aci5p wrote

Bingo. There were other systems that had near-100% accuracy on white faces but couldn't reliably differentiate between Black faces, leading to multiple false arrests.

And all of this is assuming a perfect facial recognition system would even be a good idea in a free society, which is something a lot of people would dispute, I think.
