this post was submitted on 08 Aug 2023
132 points (99.3% liked)

Privacy

According to The New York Times, this incident is the sixth recent reported case where an individual was falsely accused as a result of facial recognition technology used by police, and the third to take place in Detroit.

all 7 comments
[–] brainrein@feddit.de 48 points 1 year ago

"Someone always looks like someone else."

That’s so true. In Berlin, Germany, a boar made breaking news and prompted a public warning because it looked like a lion!

[–] teegus@sh.itjust.works 48 points 1 year ago (1 children)

Yeeahhh if you don't do anything wrong you have nothing to fear from mass surveillance, right? Right?

[–] autotldr@lemmings.world 16 points 1 year ago

This is the best summary I could come up with:


According to The New York Times, this incident is the sixth recent reported case where an individual was falsely accused as a result of facial recognition technology used by police, and the third to take place in Detroit.

Advocacy groups, including the American Civil Liberties Union of Michigan, are calling for more evidence collection in cases involving automated face searches, as well as an end to practices that have led to false arrests.

A 2020 post on the Harvard University website by Alex Najibi details the pervasive racial discrimination within facial recognition technology, highlighting research that demonstrates significant problems with accurately identifying Black individuals.

Further, a statement from Georgetown on its 2022 report said that as a biometric investigative tool, face recognition "may be particularly prone to errors arising from subjective human judgment, cognitive bias, low-quality or manipulated evidence, and under-performing technology" and that it "doesn’t work well enough to reliably serve the purposes for which law enforcement agencies themselves want to use it."

The low accuracy of face recognition technology comes from multiple sources, including unproven algorithms, bias in training datasets, different photo angles, and low-quality images used to identify suspects.

Reuters reported in 2022, however, that some cities are beginning to rethink bans on face recognition as a crime-fighting tool amid "a surge in crime and increased lobbying from developers."


I'm a bot and I'm open source!
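
The summary’s point about low accuracy is partly a matter of base-rate arithmetic: when one probe photo is searched against a gallery of millions, even a tiny per-comparison false-match rate yields many wrong candidates, so investigators are always picking among look-alikes. Here is a minimal Python sketch of that arithmetic; the gallery size and false-match rate are assumed for illustration, not figures from the article:

```python
# Back-of-the-envelope sketch of a one-to-many face search (illustrative
# assumptions only): a tiny per-comparison false-match rate multiplied
# by a huge gallery still flags many innocent look-alikes per search.

def expected_false_matches(gallery_size: int, false_match_rate: float) -> float:
    """Expected number of non-matching people incorrectly flagged in one search."""
    return gallery_size * false_match_rate

# Assumed, hypothetical values: an 8-million-face mugshot/license gallery
# and a per-comparison false-match rate of 1 in 100,000.
gallery = 8_000_000
fmr = 1e-5

print(expected_false_matches(gallery, fmr))  # about 80 wrong candidates per search
```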

[–] inspxtr@lemmy.world 5 points 1 year ago

Reuters reported in 2022, however, that some cities are beginning to rethink bans on face recognition as a crime-fighting tool amid "a surge in crime and increased lobbying from developers."

Sounds to me like there’s a deeper issue in these cities (and probably in society in general) that needs to be tackled at its root: why are people turning to crime? Is it because they can’t find reliable jobs to support themselves and their families? Is it related to drug abuse issues that might be rooted in or coupled with mental health issues?

If any of that holds true, then cities need to look at those problems much more closely and invest in the people and organizations that can help solve them, rather than over-investing in technologies and enforcement that not only fail to solve them, but may even further exacerbate these systemic issues.

[–] uriel238@lemmy.blahaj.zone 4 points 1 year ago

US police departments continue to use the tech despite low accuracy and obvious mismatches.

This is super common. US law enforcement loves, loves, loves $2 drug tests that react to pretty much anything (including glazed-donut sugar and human ashes from an urn). They serve as a common way to establish probable cause and do an end-run around the Fourth Amendment to the Constitution of the United States.

So yeah, inaccurate facial recognition gives them grounds to harass innocent Americans and search them for whatever crimes they may have committed, or may be pressured into committing (e.g. resisting arrest).