this post was submitted on 26 Jan 2024
290 points (87.4% liked)


Taylor Swift is living every woman’s AI porn nightmare — Deepfake nudes of the pop star are appearing all over social media. We all saw this coming.

[–] BeefPiano@lemmy.world 69 points 9 months ago (7 children)

I wonder if this winds up with revenge porn no longer being a thing? Like, if someone leaks nudes of me I can just say it’s a deepfake?

Probably a lot of pain for women from mouth breathers before we get from here to there, though.

[–] SnotFlickerman@lemmy.blahaj.zone 56 points 9 months ago* (last edited 9 months ago)

I mean, not much happened to protect women after The Fappening, and that happened to boatloads of famous women with lots of money, too.

Arguably none of them were billionaires, though, so we'll see, I guess.

[–] thantik@lemmy.world 18 points 9 months ago (3 children)

This has already been happening in courts, with people claiming that audio of them was generated using AI. It's here to stay, and almost nothing is going to be 'real' anymore unless you've seen it first-hand.

[–] TropicalDingdong@lemmy.world 54 points 9 months ago (1 children)

"first-hand."

the first-hand in question:

[–] 800XL@lemmy.world 4 points 9 months ago* (last edited 9 months ago) (1 children)

If shitty AI-generated deformity porn of non-real people with body parts like this image isn't already real, I bet it will be. There, you're all welcome.

[–] Petter1@lemm.ee 5 points 9 months ago

It’s already a thing? I mean, you know, rule 34.

[–] sunbeam60@lemmy.one 2 points 9 months ago

Thereby furthering the erosion of our democracies and continuing the slide into Putin-confusion.

[–] makyo@lemmy.world 1 points 9 months ago

We need trustworthy sources of news more than ever.

[–] TwilightVulpine@lemmy.world 9 points 9 months ago (2 children)

Why would it make revenge porn less of a thing? Why are so many people here convinced that as long as people say it's "fake" it's not going to negatively affect them?

The mouth breathers will never go away. They might even use the excuse the other way around: because someone could say just about anything is fake, the images might be real and the victim might be lying. Remember that blurry pictures of Bigfoot were enough to fool a lot of people.

Hell, even if others believe it is fake, wouldn't it still be humiliating?

[–] Aethr@lemmy.world 9 points 9 months ago (2 children)

I think you're underestimating the potential effects of an entire society starting to distrust pictures/video. Yeah, a blurry Bigfoot photo fooled an entire generation, but nowadays most people you talk to will say it's doctored. Scale that up to a point where literally anyone can make completely realistic pics/vids of anything in their imagination, and have them be indistinguishable from real life? I think there's a pretty good chance that "nope, that's a fake picture of me" will be a believable, no-questions-asked response to just about anything. It's a problem.

[–] TwilightVulpine@lemmy.world 0 points 9 months ago (1 children)

There are still people who believe in Bigfoot and UFOs, and there are still people falling for hoaxes every day. To the extent that distrust is spreading, it's not manifesting as widespread reasonable skepticism but as a tendency to double down on what people already believe. There are more flat earthers today than there were decades ago.

We are heading to a point where, if anyone says deepfake porn is fake, regardless of reasons and arguments, people might just decide it's real because they feel like it might be. At this point, this isn't even a new situation. Just as people skip reputable scientific and journalistic sources in favor of random blogs that validate what they already believe, they will treat images, deepfaked or not, much the same way.

So, at best, some people might believe the victim regardless, but others won't no matter what is said, and they will treat the victim as if those images were real.

[–] daltotron@lemmy.world 2 points 9 months ago

This strikes me as correct. It's more complicated than the blanket statement of "oh, everyone will have too calloused a mind to believe anything ever again". People will just try to intuit truth from the surrounding context in a vacuum, much like they do with our current everyday reality, where I'm really just a brain in a vat or whatever.

[–] eatthecake@lemmy.world -2 points 9 months ago (1 children)

I hope someone sends your mom a deepfake of you being dismembered with a rusty saw. I'm sure the horror will fade with time.

[–] Aethr@lemmy.world 2 points 9 months ago

What a horrible thing to wish on a random person on the internet. Maybe take a break from being so reactionary, Jesus.

[–] fine_sandy_bottom@discuss.tchncs.de 1 points 9 months ago (1 children)

The default assumption will be that a video is fake. In the very near future you will be able to say "voice assistant thing, show me a video of that cute girl from the cafe today getting double teamed by RoboCop and an Ewok wearing a tutu". It will be so trivial to create this stuff that the question will be "why were you watching a naughty video of me" rather than "omg I can't believe this naughty video of me exists".

[–] eatthecake@lemmy.world 1 points 9 months ago (1 children)

"The mouth breathers will never go away." You're the mouth breather.

Name calling. Real classy.

[–] Cheskaz@lemmy.world 8 points 9 months ago

Australia's federal legislation making non-consensual sharing of intimate images an offense includes doctored or generated images because that's still extremely harmful to the victim and their reputation.

[–] JoBo@feddit.uk 7 points 9 months ago

Why do you think "there" is meaningfully different from "here"?

[–] TORFdot0@lemmy.world 5 points 9 months ago

A deepfake is still humiliating.