this post was submitted on 03 Dec 2023
432 points (94.8% liked)

News


A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.

Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students – also teen girls – who attend a high school in suburban Seattle, Washington.

The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.

[–] calypsopub@lemmy.world 25 points 11 months ago (5 children)

So as a grown woman, I'm not getting why teenage girls should give any of this oxygen. Some idiot takes my head and pastes it on porn. So what? That's more embarrassing for HIM than for me. How pathetic that these incels are so unable to have a relationship with an actual girl. Whatever, dudes. Any boy who does this should be laughed off campus. Girls need to take their power and use it collectively to shame and humiliate these guys.

I do think anyone who spreads these images should be prosecuted as a child pornographer and listed as a sex offender. Make an example out of a few and the rest won't dare to share it outside their sick incels club.

[–] WoahWoah@lemmy.world 62 points 11 months ago (2 children)

That's fine and well. Except they are videos, and it is very difficult to prove they aren't you. And the internet is forever.

This isn't like high school was back when you went to high school.

Agreed on your last paragraph.

[–] MargotRobbie@lemmy.world 16 points 11 months ago (2 children)

Then nude leak scandals will quickly become a thing of the past, because now every nude video/picture can be assumed to be AI-generated, and therefore fake until proven otherwise.

That's the silver lining of this entire ordeal.

Again, this is a content distribution problem more than an AI problem; the liability should be on those who willingly host this deepfake content rather than on AI image generators.

[–] finestnothing@lemmy.world 16 points 11 months ago (1 children)

That would be great in a perfect world, but unfortunately public perception is significantly more important than facts when it comes to stuff like this. People accused of heinous crimes can and do lose friends and jobs and have their lives ruined, even if they prove that they are completely innocent.

Plus, something I've already seen happen: someone says a nude is fake and is then told they have to prove that it's fake to get people to believe them... which is very hard without sharing an actual nude that shows something unique about their body.

[–] derpgon@programming.dev -1 points 11 months ago (2 children)

The rest of the human body has more unique traits than the nude parts. Freckles, birthmarks, scars, tattoos. Those are traits that are not possible to replicate unless the person specifically knows about them.

Now that I think about it, we probably all need a tattoo. That should clear anyone instantly.

[–] Llewellyn@lemm.ee 5 points 11 months ago

You can ask an AI to draw a blurred version of the tattoo. Or to mask the tattooed area with, I don't know, a piece of clothing or something.

[–] WoahWoah@lemmy.world 2 points 11 months ago* (last edited 11 months ago) (1 children)

Yes I'm sure a hiring manager is going to involve themselves that deeply in the pornographic video your face pops up in.

HR probably wouldn't even allow a conversation about it. That person just never gets called back.

And then the worst part is the jobs that DO hire you. Now you have to question why they are hiring you. Did they not see the fake porn video? Or did they see it?

The entire thing is damaging and ugly.

[–] derpgon@programming.dev 1 points 11 months ago (1 children)

If you are already an employee, then they will want to keep you and look into the matter.

If you are not an employee yet - is HR really looking up porn of everyone?

[–] WoahWoah@lemmy.world 1 points 11 months ago (1 children)

Yes, HR Googles your name. 🙄

[–] derpgon@programming.dev 1 points 11 months ago

I am pretty sure people who do porn use pseudonyms anyway. If HR thinks people use their real names and spread their own porn on the internet, they are dumb for not realizing it's fake. HR being HR, as always.

[–] zbyte64@lemmy.blahaj.zone 2 points 11 months ago* (last edited 11 months ago)

Seems we're only partially applying the market dynamics of supply and demand. Simply assuming the "surplus" supply of deep fakes will decrease their value ignores the fact that the demand is still there. Instead, what we get is new value opportunities in the arms race of validating and distributing deep fakes.

[–] calypsopub@lemmy.world 13 points 11 months ago (1 children)

Why should they have to expend any energy proving it's not them?

[–] toonicycle@lemmy.world 4 points 11 months ago

I mean they obviously shouldn't have to, but if nude photos of you got leaked in your community, people would start judging you negatively, especially if you're a young woman. Also, in these cases where they aren't adults, it would be considered CP.

[–] ILikeBoobies@lemmy.ca 34 points 11 months ago (1 children)

So they do it and share it around to slut shame you

You try to find a job and they find porn of you

It’s a lot worse than you’re making it out to be when it’s not you that gets to make that decision

[–] DogMuffins@discuss.tchncs.de 9 points 11 months ago (2 children)

IMO the days of searching for porn of prospective employees are over. With the advent of AI generated porn, what would be the point of that?

[–] ILikeBoobies@lemmy.ca 11 points 11 months ago

People are quicker to judge than they are to reason

[–] Couldbealeotard@lemmy.world 11 points 11 months ago (1 children)

There are so many recent articles linked on Lemmy about people losing their jobs over making porn. People are losing jobs over porn now more than ever.

[–] DogMuffins@discuss.tchncs.de 0 points 11 months ago

Seriously? Maybe we don't read the same stuff but that's not something I've noticed.

I just can't imagine how that's possible. I wish someone would fire me over porn so I could sue them for unfair dismissal as well as defamation and/or libel.

[–] atzanteol@sh.itjust.works 27 points 11 months ago (2 children)

You may not be representative of teenage girls.

[–] Basil@lemmings.world 4 points 11 months ago* (last edited 11 months ago)

"So as a grown woman"

Right? Literally not what's being discussed. Obviously they'll be more mature and reasonable about it. Teenagers won't be.

[–] calypsopub@lemmy.world 0 points 11 months ago (1 children)

I wasn't very representative even when I WAS a teenager. I was bullied quite a bit, though.

[–] atzanteol@sh.itjust.works 5 points 11 months ago* (last edited 11 months ago)

And can you imagine those bullies creating realistic porn of you and sharing it with everyone at school? You may have been strong enough to endure that - but it's pretty unrealistic to expect everyone to be able to do so. And it's not a moral failing if somebody is unable to. This is the sort of thing that leads to suicides.

[–] ExLisper@linux.community 25 points 11 months ago (1 children)

I don't think the problem is that the girls are ashamed of the fake porn. The problem is not even that other kids will believe it. The problem is that kids will use it to mock, bully, and ostracise them. It's not being shared as "OMG, you're so hot, I made a fake sex tape with you, marry me". It's being shared as "you're a slut that does porn, everyone thinks you're a bitch, go kill yourself".

[–] calypsopub@lemmy.world 8 points 11 months ago (1 children)

I see your point. In that way it's just like any other bullying, though more personal. Unfortunately, society hasn't done a good job of coming up with workable solutions for bullying. In this case, dragging the culprit behind the bleachers and letting the girls take turns kicking him in the nuts would be my go-to, but you can't do that sort of thing anymore.

[–] zbyte64@lemmy.blahaj.zone 4 points 11 months ago

Your response highlights how victims need the power of community to respond appropriately, and how society excuses some forms of violence (involuntary porn) and not others (women getting retribution).

[–] foo@programming.dev 13 points 11 months ago (2 children)

What if the deep fake was so real it was hard to tell? What if the deep fake was highly invasive and humiliating? Can you see the problem?

[–] DogMuffins@discuss.tchncs.de 1 points 11 months ago

I think that the point this comment is trying to make is that because it has become so easy to make these images, their existence is not very meaningful. All deep fakes are very realistic. You can't tell fakes from originals.

Like as an adult, if I saw an "offensive" image of a co-worker, my first assumption would be that it's probably AI generated, my first thought would be "which asshole made this image" rather than "I can't believe my co-worker did [whatever thing]".