this post was submitted on 08 Dec 2023
360 points (93.1% liked)


‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

(page 2) 50 comments
[–] cosmicrookie@lemmy.world 10 points 11 months ago* (last edited 11 months ago) (10 children)

~~Hoe~~ How can this be legal though?

[–] Daxtron2@startrek.website 6 points 11 months ago (6 children)

The same way that photoshopping someone's face onto a pornstar's body is.

[–] phoneymouse@lemmy.world 6 points 11 months ago

I guess free speech laws protect it? You can draw a picture of someone else nude and it isn’t a violation of the law.

[–] damnfinecoffee@lemmy.world 9 points 11 months ago (1 children)

Reminds me of Arthur C. Clarke's The Light of Other Days. There's a technology in the book that allows anyone to see anything, anywhere, which eliminates all privacy. Society collectively adjusts, e.g. people masturbate on park benches because who gives a shit, people can tune in to watch me shower anyway.

Although not to the same extreme, I wonder if this could similarly desensitize people: even if it's fake, if you can effectively see anyone naked... what does that do to our collective beliefs and feelings about nakedness?

[–] flamehenry@lemmy.world 10 points 11 months ago (1 children)

It could also lead to a human version of "Paris Syndrome" where people AI Undress their crush, only to be sorely disappointed when the real thing is not as good.

[–] andrew_bidlaw@sh.itjust.works 8 points 11 months ago (2 children)

It was inevitable. And it says more about those who use them.

I wonder how we'd adapt to these tools being that available, especially for blackmail, revenge porn posting, voyeuristic harassment, stalking, etc. Maybe nude photos and videos will no longer be seen as a trusted source of information, and there won't be anything unique about them worth hunting for, or worrying about.

Our perception of human bodies has long been distorted by movies, porn, Photoshop and the 'filter apps' that followed, but we still kinda trusted there was something real before the effects were applied. What comes next if everything is imaginary? Would we stop caring about it in the future? Or would we grow up with a stunted imagination, since the stimulus that used to develop it in our early years is gone?

There are some useless dogmas around our bodies that could be lifted in the process, or a more relaxed trend towards clothing choices could start to take hold. Who knows?

I see the bad sides of it right now, how it can be abused, but if these AI models are here to stay, what are the long-term consequences for us?

[–] Corkyskog@sh.itjust.works 8 points 11 months ago (2 children)

What nude data were these models trained on?

This seems like another unhealthy thing that is going to pervert people's sense of what a normal body looks like.

[–] randon31415@lemmy.world 8 points 11 months ago

Back in the day, cereal boxes contained "x-ray glasses". I feel like if those had actually worked as intended, we would have already figured this issue out.

[–] onlinepersona@programming.dev 8 points 11 months ago (1 children)

I can't help but think of nudibranches when I read "nudify".

[–] Nylevie@lemmy.blahaj.zone 6 points 11 months ago (1 children)

Someone should make an AI tool that can turn women into nudibranches, it wouldn't be as creepy

[–] fne8w2ah@lemmy.world 6 points 11 months ago

That's the 21st century equivalent of those cereal box x-ray glasses!

[–] weew@lemmy.ca 6 points 11 months ago* (last edited 11 months ago)

I doubt it produces actual nudes, it probably just photoshops a face onto a random porn star

[–] Quexotic@infosec.pub 5 points 11 months ago

Just created a Dall-e image of a woman. AI undresser instantly undressed it.

Kinda chilling.

[–] PandaPikachu@lemmy.world 5 points 11 months ago

It would be interesting to know how many people are using it for themselves. I'd think it would open up next-level catfishing. Here's an actual pic of me, and here's a pic of what I might look like naked. I'm sure some people with Photoshop skills were already doing that to a certain extent, but now it's accessible to everyone.

[–] A_Random_Idiot@lemmy.world 5 points 11 months ago* (last edited 11 months ago)

What kind of mentally unhinged active threat to society would even think of creating such a thing, much less use such a thing?

Edit: Boy, I wish I knew who the downvoters were; I bet the FBI would love to have a gander at their computers.
