this post was submitted on 16 Oct 2023
83 points (92.8% liked)

Technology


Deepfake Porn Is Out of Control: New research shows the number of deepfake videos is skyrocketing, and the world's biggest search engines are funneling clicks to dozens of sites dedicated to the nonconsensual fakes.

all 31 comments
[–] rbn@feddit.ch 97 points 1 year ago* (last edited 1 year ago) (1 children)

From my perspective, deepfakes will lead to a short but massive peak of harassment until everyone is aware of the technology and its capabilities. Once the technology reaches the mainstream and everyone is able to generate such content with ease, people will just stop caring. If these videos are everywhere, it's easy to dismiss them as fakes. It might even help victims of actual revenge porn. Virtual nudity will become less of a big deal, probably even in real life.

The bigger issue with deepfakes, though, is news. We already have a huge problem with lies on social media, and even on TV and in newspapers, and once we can no longer trust what we see, it will be incredibly hard to build up trust in any source.

Fake videos of politicians will be spread to harm their credibility, fake videos of war crimes to justify an attack. Or vice versa: if there's an authentic video of a crime, the offenders will just deny its authenticity. But in contrast to Trump's "fake news" claims today, it will be more or less impossible for normal people to fact-check anything.

[–] dudewitbow@lemmy.ml 17 points 1 year ago (1 children)

Although not related to porn, a lot of scam operations based in India already use this as a defense. It's extremely hard to get someone in that field in trouble, because you need evidence for a raid, and it can't be video or audio, since they claim the medium in question is a deepfake.

[–] CatZoomies@lemmy.world 70 points 1 year ago* (last edited 1 year ago) (4 children)

This is a sad article to read. I'm not a woman, nor am I a young adult growing up with all this technology that can be leveraged against me. Could you imagine being a junior high or high school student and having an anonymous classmate create deepfake porn of you from your yearbook photo? And the children in your class gossiping about you, sharing your porn video/photo online with their friends, while you endure that harassment? It's already well documented what damage too much pornography causes to our psychological development; now imagine the consumer of this content being around the victim. That harassment can get so much worse.

I can't even begin to fathom what kind of psychological damage this will cause to the youth. I feel for women everywhere - this is a terrible thing people are doing with this technology. I can't imagine raising a daughter in this environment and trying to help her navigate this problem when some asshole creates deepfake porn of her. My niece is currently getting bullied in school - what if her bullies use these tools against her? This just makes my blood boil.

It's bad enough that, since social media rose and captured the attention spans of kids and teenagers, there has been a well-documented correlation with an increase in youth suicide rates since 2009 (around the time Twitter went mainstream). https://www.health.com/youth-suicide-rate-increase-cdc-report-7551663 . Now there's a nonconsensual AI-generated porn era to navigate.

These are dangerous times. This opens people up to attack, and regulation to increase the friction of accessing these tools is one of the most important next steps. Granted, outright bans never work (the persistent will always get their hands on the tools), but we need to put controls in place to limit access. Then we can remediate the root cause of these problems (e.g., proper systemic education, teaching a modified sexual education in schools that addresses things like consent, etc.).

EDIT:

Wanted to add, after I posted this, that a prevalent argument I hear parroted by people is this:

  • People are gonna do this AI generation anyway. It'll get to the point that you won't be able to tell what's real or not, so women can just deny it. You can't prove it's real anyway, so why bother?

This is another way of saying "boys will be boys" and ignoring the problem. The problem is harassment and violence against women.

[–] damndotcommie@lemmy.basedcount.com 11 points 1 year ago (2 children)

Where did all the replies to this post go? There was an entire discussion that is now gone, and nothing in the modlog.

[–] Kalcifer@lemm.ee 8 points 1 year ago

After some testing, it might be that the parent commenter just deleted their comment, which nuked all the child comments. I can't remember if this is what Reddit does. I think it just says "Deleted by creator" but keeps the children. Could certainly be wrong, though.

[–] Kalcifer@lemm.ee 5 points 1 year ago* (last edited 1 year ago) (1 children)

Well, that doesn't bode well.

[–] SatansMaggotyCumFart@lemmy.world 6 points 1 year ago (2 children)

I’ve found that if someone deletes their comment, then everything below it disappears.

[–] Kalcifer@lemm.ee 6 points 1 year ago* (last edited 1 year ago) (1 children)

Yup, it appears that our entire comment chain got nuked. So it is now confirmed that if you delete the parent, all the children get removed as well.


For anyone reading this, the context is that we tested it by me replying to OP's previous comment, then OP responding to me, then me deleting my comment to see if their comment also got deleted.

From my testing, it only removes them from the thread; you should still be able to open them by clicking on the reply in your inbox.

load more comments (3 replies)
[–] alienanimals@lemmy.world 25 points 1 year ago (1 children)

AI and deepfakes aren't going to stop. Schools need to get with the times rather than pretending like it's the year 1960.

Teachers should be able to deliver meaningful punishments to students. If someone gets caught passing these around, then that person should catch some flak. And none of that punishing the victim and the bully like most schools do.

[–] Yuki@kutsuya.dev -2 points 1 year ago (1 children)

Schools won't care. They never did and never will. This will just be a new era for bullies to use.

They should actually teach about it. Maybe even teach how to use it.

[–] Djtecha@lemm.ee 2 points 1 year ago

Well then start making videos of the staff that doesn't care.

[–] wantd2B1ofthestrokes@discuss.tchncs.de 19 points 1 year ago (1 children)

That’s disgusting. Which particular sites are out of control?

[–] Heratiki@lemmy.ml 8 points 1 year ago* (last edited 1 year ago)

How would you police this without direct abuse?

It’s pretty easy to spot deepfakes, even now. The type of porn being created with deepfakes is just too unbelievable when it comes to the actors and actresses. They’re not deepfaking intimate love-making; it’s nearly always straight-up hardcore pornography. I feel like everything described as so evil here is just a straw man argument. Hell, anyone who believes hardcore pornography is what happens in reality is a moron. The amount of bullshit incest porn on these same websites is just bonkers. That being said, I can see how it can affect some people.

But guess what? Humans tend to look similar, so how do you stop it when you don’t know whether it’s real or fake? And how easy will it be to create yet another advantage for those with power or financial success? Examples:

A politician is seeing a prostitute and abusing his status to do so. The prostitute records a secret sex tape of him raping her and threatening to have her arrested if she doesn’t submit to what he wants. The video goes public. The politician claims it’s a deepfake, and the prostitute is arrested anyway. Or the reverse: a prostitute deepfakes the video and threatens the politician. News had just come out about the politician glancing at a woman other than his wife before the deepfake surfaced, so the populace sides with the prostitute and the politician is arrested.

Or how about a woman who looks just like Taylor Swift and decides she wants to work in pornography. Her likeness is immediately noticed and becomes part of her popularity, though she isn’t billed as such. T-Swizzle claims it’s a deepfake made to disparage her, and the porn actress is ruined, if not sued into oblivion.

So many scenarios could go either way. You can’t ban the technology, because you’ll never be able to legitimately know which is which. And, just like with cryptography, banning it will not limit access for those who would use it unlawfully.

So what’s the solution? Get over the lunacy of the whole thing? What options do we really have? And since we don’t have many (or any) options, all we’re doing is sending clicks to news sites that have nothing else to write about. I’m not saying it’s not a problem; I’m just not seeing a solution and don’t see a need to keep beating a dead horse.

[–] trackcharlie@lemmynsfw.com 1 points 11 months ago

Society's views on sexuality will change before we will EVER get a serious handle on deepfakes. If you're rich and can afford the lawyers, go ham and sue, otherwise, time to just accept that humans are animals and animals fuck.

Whether or not someone is or is not in a porn video is less important than whether or not they can do whatever job or task they've been given.

Religious puritanical morons and prudes need to stfu and get over it. The victims need to cope with the reality that this is never going away, and they can spend their entire life and fortune on 'finding the one who did this' or just move on and put their energy into something worthwhile.

Even complaining about this is hysterically moronic. The 'big threat' is fake porn.

Fixing the child care system so that child abuse (emotional, physical, and sexual) gets reduced even 1% would be an immensely more worthwhile task than any pursuit against a technology that is open source and widely available. Not to mention that even if it were made illegal in your country, good luck actually enforcing a law like that without going 110% dystopian, with a locked-down internet that would make current Chinese life look like a kind big-brother system.