this post was submitted on 23 Dec 2023
25 points (82.1% liked)

Canada

[–] swordgeek@lemmy.ca 15 points 11 months ago (1 children)

The database started out empty; they added all of the content. The filtering should have been part of the intake process, not after the fact. Image recognition has been used to detect CP for many years now.

They could have and should have stopped these images from getting into the dataset at all, but they didn't. As a consequence, people who were victimized as children are having exploitative images of them used to generate new (synthetic) child porn.
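The intake-time filtering described above is typically done by comparing each incoming image against hashes of known abuse material (production systems use robust perceptual hashes such as PhotoDNA against hash lists maintained by child-safety organizations). A minimal sketch, with a toy average-hash standing in for the real algorithm and all names hypothetical:

```python
def average_hash(pixels: list[list[int]]) -> str:
    """Toy perceptual hash of a grayscale pixel grid (0-255 values):
    each bit records whether a pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    return "".join("1" if p > avg else "0" for p in flat)

def hamming(a: str, b: str) -> int:
    """Number of differing bits between two equal-length hash strings."""
    return sum(x != y for x, y in zip(a, b))

def should_ingest(pixels: list[list[int]],
                  known_bad_hashes: set[str],
                  threshold: int = 0) -> bool:
    """Reject an image at intake if its hash is within `threshold` bits
    of any known-bad hash. Real hashes are 64+ bits, so small nonzero
    thresholds catch re-encoded or resized copies as well."""
    h = average_hash(pixels)
    return all(hamming(h, bad) > threshold for bad in known_bad_hashes)
```

The point of the design is that the check runs once per image at ingestion, so flagged material never enters the dataset at all, rather than having to be hunted down after release.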

[–] Grimy@lemmy.world 3 points 11 months ago

They did run filters. The group that found the new ones built a completely new, stronger filter that is better at detecting this material. You can't blame them for not using technology that simply wasn't available at the time. They also pulled the whole dataset the moment the group alerted them, and removed the flagged images.