2025-01-14
Moreover, researchers have found that people using AI tools can succumb to “automation bias,” a tendency to blindly trust decisions made by powerful software while remaining ignorant of its risks and limitations. https://www.washingtonpost.com/ ...
Washington Post
An investigation finds 15 police departments across 12 US states have arrested suspects identified through facial recognition without having any other evidence
“law enforcement agencies across the nation are using the [AI] tools in a way they were never intended to be used: as a shortcut to finding and arresting suspects without other evi...
2023-11-15
In 2019, the National Institute of Standards and Technology published a study revealing that many facial-recognition systems falsely identified Black and Asian faces between ten and a hundred times more frequently than Caucasian ones. | The New Yorker https://www.newyorker.com/...
New Yorker
A look at wrongful US arrests due to false positive facial recognition matches, and how “automation bias” can lead the police to ignore contradictory evidence
Eyal Press / New Yorker: