this post was submitted on 25 Oct 2023
76 points (82.2% liked)
Technology
you are viewing a single comment's thread
I think the biggest worry for me at this point is what the AI was trained on in order to depict these images. It's not victimless if it needs victims of child abuse to train on.
Edit: really fucking weird I'm getting downvoted for being against AI training on child porn. I'm willing to go down with that ship.
It knows what naked people look like, and it knows what children look like. It doesn't need naked children to fill in those gaps.
Also, these models are trained on images scraped from the clearnet. Somebody would have had to manually add CSAM to the training data, which would be easily traced back to them if they did. The likelihood of actual CSAM being included in any mainstream AI's training material is slim to none.
Defending AI generated child porn is a weird take, and the support you're receiving is even more concerning
I'm not defending it, dipshit. I'm explaining how generative AI training works.
The fact that you can't see that is what's really concerning.