this post was submitted on 19 Sep 2023
66 points (90.2% liked)
Lemmy
Everything about Lemmy; bugs, gripes, praises, and advocacy.
For discussion about the lemmy.ml instance, go to !meta@lemmy.ml.
founded 4 years ago
AI will make this redundant
It will also make it a battle of attrition, because now AI isn't only being used to block CSAM; trolls are using AI to generate it.
The issue is that these tools typically work by hashing the image (or a specific section of the image) and checking it against a database of known CSAM. That way you never actually need to view the file to compare it to the list. But with AI image generation, that list of known CSAM is essentially useless because trolls can just generate new images.
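The matching scheme described above can be sketched in a few lines. This is a toy illustration, not any real system: production tools use robust perceptual hashes over image regions, while the `average_hash` here (hashing a tiny grayscale pixel grid against its mean brightness) and the `max_distance` threshold are simplified stand-ins. The point it demonstrates is the one made above: small edits to a known image still match, but a freshly generated image produces a hash that isn't in the database at all.

```python
# Toy sketch of hash-and-compare image matching. Assumed/simplified:
# images are 2D lists of 0-255 grayscale values, and average_hash is a
# stand-in for the robust perceptual hashes real tools use.

def average_hash(pixels):
    """One bit per pixel: 1 if the pixel is above the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

def matches_known(pixels, known_hashes, max_distance=2):
    """Flag an image whose hash is near any hash in the known database."""
    h = average_hash(pixels)
    return any(hamming(h, k) <= max_distance for k in known_hashes)

known = [[10, 200], [220, 30]]
database = {average_hash(known)}

# A lightly edited copy of a known image still matches...
altered = [[12, 198], [219, 33]]
print(matches_known(altered, database))  # True

# ...but a newly generated image has a hash the database has never seen.
fresh = [[250, 5], [15, 240]]
print(matches_known(fresh, database))  # False
```

Note the file itself is never viewed during the check, only its hash, which is exactly why an endless stream of novel images defeats the approach.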
Bingo, that's the issue. With an endless supply of fresh content, hash checking is dead.
On the other hand, if the people who want those images can satisfy their urges using AI fakes, that could mean less spreading of images of actual abuse. It might even mean less abuse happening.
However, because they're terrible people, I have to suspect that's not the case.
People who create the content are insane monsters, but a LOT of actual pedos (vs. predators looking for a power play) are disgusted by their preference. I know a ton of them already look to cartoons for stimulation, so I think AI content could draw more people away from actual material. Hopefully, if demand drops, less new real content will be created as the potential profits fall relative to the risk.