The solution is to use existing software that already solves this, like Cloudflare's CSAM scanning tool. I know people on the fediverse hate big companies, but companies like these have already solved this problem many times over. They're among the few allowed access to CSAM hash databases; Lemmy devs and platforms will never get access to the hashes (for good reason).
Wait… why is no access to CSAM hashes a good thing? Wouldn't it be easier to detect matches if the hashes were public?! I feel like I'm missing something here…
Giving public access to CSAM hashes means anyone wanting to avoid detection just has to check what they're about to upload against the database. If it matches, they modify the image until it doesn't, using the database itself as an oracle to confirm the evasion worked. That's why the hash lists are kept closed: publishing them is practically guaranteed to make the problem worse, not better.
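To make the oracle problem concrete, here's a minimal sketch in Python. The `blocked_hashes` set is a hypothetical stand-in for a leaked hash database, and an exact SHA-256 hash stands in for whatever fingerprint the list uses (real systems use perceptual hashes like PhotoDNA, which tolerate small edits but are still weakened when an attacker can query them as an oracle):

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Exact cryptographic hash of the raw bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical public blocklist of known-bad hashes (illustration only).
blocked_hashes = {sha256_hex(b"known bad image bytes")}

def evade(image: bytes) -> bytes:
    """While the image's hash is on the public list, perturb one
    low-order byte and retry. With an exact hash, a single-bit
    change is always enough, so the loop runs at most once."""
    candidate = bytearray(image)
    while sha256_hex(bytes(candidate)) in blocked_hashes:
        candidate[-1] ^= 0x01  # flip the least significant bit of the last byte
    return bytes(candidate)

original = b"known bad image bytes"
modified = evade(original)
print(sha256_hex(original) in blocked_hashes)  # True  -> would be caught
print(sha256_hex(modified) in blocked_hashes)  # False -> slips through undetected
```

Perceptual hashes raise the bar, since the attacker has to change the image enough to move its fingerprint rather than a single bit, but the same query-and-tweak loop still applies whenever the hash list is public.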