Yes, finding out that your peers have been sharing deepfake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to the way women and girls are made to feel entirely, disgustingly dehumanized by every man or boy in their lives: groups of men and boys reducing them and their bodies down to vivid sexual fantasies that they can quickly generate photorealistic images of.
If the person in the image is underage, then it should be classified as child pornography. If the woman whose photo is being used hasn't consented to this, then it should be classified as sexual exploitation.
Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.
It's bullying with a sexual element. The fact that it uses AI or deepfakes is secondary, just as it was secondary when it was Photoshop, just as it was secondary when it was cutting out photos. It's always about using it to bully someone.
This is different because it's easier. It's not really different because it can be more realistic, because it was never about being realistic; otherwise blatantly unrealistic images wouldn't have been used to do it. Indeed, the fact that it can be realistic will help blunt the impact of the leaking of real nudes.
It's sexually objectifying the bodies of girls and turning them into shared sexual fantasies that their male peers are engaging in. It is ABSOLUTELY different because it is more realistic. We are talking about entire deepfake pornography production and distribution groups IN THEIR OWN SCHOOLS. Teenage boys cutting pictures out and photoshopping them was nowhere near as common as this is fast becoming, and it was NOT the same as seeing a naked body algorithmically derived to appear as realistic as possible.
Can you stop trying to find a silver lining in the sexual exploitation of teenage girls? You clearly don't understand the kind of long-term psychological harm that is caused by being exploited in this way. It was also exploitative and fucked up when it was done in Photoshop, but this is many orders of magnitude more sophisticated and accessible.
You're also wrong that this is about bullying. It's an introduction to girls being tools for male sexual gratification. It's LITERALLY commodifying teenage girls as sexual experiences and then sharing them in groups together. It's criminal. The consent of the individual has been entirely erased. Dehumanization in its most direct form. It should be against the law and it should be prosecuted very seriously wherever it is found to occur.
Can you please use words by their meaning?
Also, I'll have to be blunt: every human has their own sexuality, with their own level of "drive", so to speak, and their own dreams.
And it's absolutely normal to dream of other people. Including sexually. Including those who don't like you. And it's not only men who do that. There are no thought crimes.
So by talking about it being easier or harder, you are not making any argument at all.
However. As I said elsewhere, the actions that really harm people should be legally classified and addressed. Like sharing such stuff. But not as making child pornography, because it's not, and not as sexual exploitation, because it's not.
It's just that the few posts of yours I've seen in this thread seem to say that certain kinds of thought should be illegal, and that's absolute bullshit. And laws shouldn't be made based on such emotions.
~~"thought crime"? And you have the balls to talk about using words "by their meaning"?~~
This is a concrete action with a product to show for it, not a thought, and it impacts someone's life negatively without their consent, with potentially devastating consequences for the victim. ~~So, can you please use words by their meaning?~~
Edit: I jumped the gun when I read "thought crime", effectively disregarding the context. As such, I'm striking out the parts of my comment that don't apply and leaving the ones that do (not necessarily to the post I was replying to, but to the whole thread).
The author of those comments wrote a few times about what, in their opinion, happens in other people's heads and how that should be prevented, or something along those lines.
Can you please stop interpreting my words exactly the way you like? That's not worth a gram of horse shit.
Yes I can, more so after your clarification. I must have misread it the first time. Sorry.
Sorry for my tone too; I get dysphoric-defensive very easily (as has been illustrated).
I don’t know where you’re getting this “thought crime” stuff. They’re talking about boys distributing deepfake nudes of their classmates. They’re not talking about individuals fantasizing in the privacy of their own homes. You have to read all of the words in the sentences, my friend.