this post was submitted on 09 Oct 2023
221 points (87.5% liked)


A nightmare scenario previously only imagined by AI researchers, where AI image generators accidentally spit out non-consensual pornography of real people, is now reality.

kibiz0r@midwest.social 35 points 1 year ago

Yet another reason that we cannot allow ML companies to set a precedent that "it's fine to use non-consensual training data, because the model only 'learns' from it and never reproduces an exact replica of any single input".

Also, this was not surprising:

> Dillon said that DreamGF has a team of between 20-25 developers, mostly in Bulgaria, and that they previously worked at an NFT company.

lloram239@feddit.de 1 point 1 year ago

The cat is out of the bag. Everything those services do is child's play compared to what you can do at home on your own PC. This ain't ever going to stop; it's just going to get more powerful.
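
To give a concrete sense of what "at home on your own PC" can mean, here is a minimal sketch of local text-to-image generation using the Hugging Face diffusers library. The specific model checkpoint, prompt, and hardware assumptions are illustrative only, not something named above.

```python
# Minimal local text-to-image sketch with Hugging Face diffusers.
# The model id and prompt below are assumptions for illustration.
import torch
from diffusers import StableDiffusionPipeline

# Download a Stable Diffusion checkpoint and load it in half precision.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # a consumer GPU with roughly 6-8 GB VRAM is typically enough

# Generate one image from a text prompt and save it to disk.
image = pipe("a watercolor painting of a lighthouse at dawn").images[0]
image.save("output.png")
```

A script like this runs entirely offline once the weights are downloaded, which is the point being made: hosted services add filters, but the underlying capability is already on consumer hardware.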

I recommend playing around with Bing Image Creator/DALL-E 3 for a bit to get an idea of what is possible these days. It will filter out celebrities and NSFW content, of course, but the flexibility it offers and the quality it produces are mind-boggling.