this post was submitted on 18 Jun 2024
245 points (100.0% liked)
Privacy
This is what I guessed the other day, when a post here didn't clarify what the censorship actually meant.
While I'm not a fan of this stupid regulation, it doesn't sound like the armageddon that turns e2ee to ashes.
(Given that Signal doesn't like it, I might be wrong though.)
As long as we trust, say, Signal, it could plausibly do the scan on-device without sending a good chunk of the image data the user is sending. URLs can be hashed before they're sent to the scanner.
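Client-side URL hashing along those lines could look something like this. To be clear, this is a hypothetical sketch, not Signal's actual design, and the normalization step is my own assumption:

```python
import hashlib

def hash_url(url: str) -> str:
    # Hypothetical client-side step: normalize the URL, then hash it,
    # so only the digest ever leaves the device.
    normalized = url.strip().lower().rstrip("/")
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# The scanner would compare this digest against a list of digests of
# known-bad URLs, without ever seeing the plaintext URL itself.
digest = hash_url("https://example.com/some/page")
print(digest)
```

The catch is that whoever holds the blocklist can still test any URL they like against it, so this hides the user's URLs only from passive observers, not from the scanner's operator.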
The remaining piece for privacy is to use open-source clients and to guarantee that the shipped binaries haven't been modified from the original source (i.e. reproducible builds). This problem has always existed in the Apple ecosystem, btw.
What about the false positives? Do you want your name permanently associated with child porn because someone fucked up and ruined your life? https://www.eff.org/deeplinks/2022/08/googles-scans-private-photos-led-false-accusations-child-abuse
The whole system is so flawed that it has something like a 20-25% success rate.
Or how about this system being adopted for anything else? Guns? Abortion? LGBT-related issues? Once something gets implemented, it's there forever, and expansion is inevitable. Each subsequent government will use it for its own agenda.
They say the images are merely matched against pre-determined images found on the web. You're talking about a different scenario, where an AI detects inappropriate content in an image.
Matched using perceptual hash algorithms that have an accuracy between 20% and 40%.
Is there a source stating that they're going to require these?
Unfortunately, I couldn't find a source stating it would be required. AFAIK it's been assumed that they would use perceptual hashes, since that's what various companies have been suggesting/presenting, like Apple's NeuralHash, which was reverse-engineered. It's also the only somewhat practical solution, since exact matching would be easily circumvented by changing one pixel or mirroring the image.
Patrick Breyer's page on Chat Control has a lot of general information about the EU's proposal.
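To illustrate why exact matching is so brittle and why perceptual hashes are used instead, here's a toy average-hash (aHash) sketch. Real systems like NeuralHash are neural-network-based and far more sophisticated; this only shows the general idea:

```python
import hashlib

def average_hash(pixels):
    # Toy perceptual hash (aHash): one bit per pixel, set when the
    # pixel is brighter than the image's mean brightness.
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def exact_hash(pixels):
    # Cryptographic hash over the raw pixel bytes: any change at all
    # produces a completely different digest.
    return hashlib.sha256(bytes(p for row in pixels for p in row)).hexdigest()

img   = [[10, 200], [200, 10]]   # tiny 2x2 grayscale "image"
tweak = [[11, 200], [200, 10]]   # one pixel changed by one level

assert exact_hash(img) != exact_hash(tweak)      # exact match broken
assert average_hash(img) == average_hash(tweak)  # perceptual hash survives
```

The flip side of that robustness is exactly the false-positive problem discussed above: unrelated images can land on the same perceptual hash.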
Stupid regulation, honestly. Exact matching is implementable, but anything beyond that... aren't they basically banning e2ee at this point?
Now I see why Signal says it would leave the EU.