this post was submitted on 23 Sep 2023
298 points (92.6% liked)

Technology

A very NSFW website called Pornpen.ai is churning out an endless stream of graphic, AI-generated porn. We have mixed feelings.

[–] BreakDecks@lemmy.ml 23 points 1 year ago (18 children)

The actual scary use case for AI porn is that if you can get 50 or more photos of the same person's face (almost anyone with an Instagram account), you can train your own LoRA model to generate believable images of them, which means you can now make "generic looking" porn with pretty much any person you want to see in it. Basically the modern equivalent of gluing cutouts of your crush's face onto the Playboy centerfold, only with automated distribution over the Internet...

[–] lloram239@feddit.de 18 points 1 year ago (17 children)

Using a LoRA was the old way; these days you can use Roop, FaceSwapLab, or ReActor, which not only work with as little as a single good photo, but also produce better-looking results than a LoRA. There's no time-consuming training either: just drag & drop an image and you get results in a couple of seconds.

[–] pinkdrunkenelephants@sopuli.xyz 5 points 1 year ago (16 children)

So how will any progressive politician ever get elected then? All the fascists would have to do is generate porn with their opponent's likeness to smear them.

Or even worse, deepfake evidence of rape.

Or even worse than that, generate CSAM with their likeness portrayed abusing a child.

They could use that to imprison not only their political opponents, but anyone for anything, and people would think whoever is being disappeared this week actually is a pedophile or a rapist and think nothing of it.

Actual victims' movements would be cut off at the knees, because there would no longer be any definitive way to prove a real rape happened: defendants could credibly claim that genuine videos are just AI-generated crap and get acquitted. No rape or abuse claim would ever be believed, because there would be no way left to establish objective truth.

This would leave the fascists open to do whatever they want to anybody with no serious consequences.

But no one cares because they want AI to do their homework for them so they don't have to think, write, or learn to be creative on their own. They want to sit around on their asses and do nothing.

[–] Liz@midwest.social 4 points 1 year ago (1 children)

We're going to go back to the old model of trust, before videos and photos existed. Consistent, coherent stories from sources known to be trustworthy will be key. Physical evidence will be helpful as well.

[–] pinkdrunkenelephants@sopuli.xyz 2 points 1 year ago (1 children)

But then people will say "Well how do we know they're not lying?" and then it's back to square 1.

Victims might not ever be able to get justice again if this garbage is allowed to continue. Society's going so off-track.

[–] Castigant@lemm.ee 1 points 1 year ago (1 children)

How often does video evidence of rape exist, though? I don't think this really changes anything for most victims.

[–] pinkdrunkenelephants@sopuli.xyz 2 points 1 year ago* (last edited 1 year ago)

See Steubenville, Ohio, where the dumb motherfuckers date-raped a girl and put the video on Facebook.

People do shit like that.
