this post was submitted on 22 Dec 2025
886 points (99.0% liked)

News


Welcome to the News community!

Rules:

1. Be civil


Attack the argument, not the person. No racism/sexism/bigotry. Good-faith argumentation only; accusing another user of being a bot or paid actor violates this rule. Trolling is uncivil and is grounds for removal and/or a community ban. Do not respond to rule-breaking content; report it and move on.


2. All posts should contain a source (URL) that is as reliable and unbiased as possible, and must contain only one link.


Obviously right- or left-wing sources will be removed at the mods' discretion. Supporting links can be added in comments or posted separately, but not in the post body.


3. No bots, spam or self-promotion.


Only approved bots, which follow the guidelines for bots set by the instance, are allowed.


4. Post titles should match the title of the source article.


Posts whose titles don't match the source won't be removed outright, but the AutoMod will notify you; if your title misrepresents the original article, the post will be deleted. If the site changed its headline, the bot might still contact you; just ignore it, and we won't delete your post.


5. Only recent news is allowed.


Posts must be news from the most recent 30 days.


6. All posts must be news articles.


No opinion pieces, listicles, editorials, or celebrity gossip are allowed. All posts will be judged on a case-by-case basis.


7. No duplicate posts.


If a source you used was already posted by someone else, the AutoMod will leave a message. Please remove your post if the AutoMod is correct. If the matching post is very old, see rule 5.


8. Misinformation is prohibited.


Misinformation / propaganda is strictly prohibited. Any comment or post containing or linking to misinformation will be removed. If you feel that your post has been removed in error, provide credible sources to support it.


9. No link shorteners.


The AutoMod will contact you if a link shortener is detected; please delete your post if it is correct.


10. Don't copy the entire article into your post body


For copyright reasons, you are not allowed to copy an entire article into your post body. This is an instance-wide rule that is strictly enforced in this community.


A 13-year-old girl at a Louisiana middle school got into a fight with classmates who were sharing AI-generated nude images of her

The girls begged for help, first from a school guidance counselor and then from a sheriff’s deputy assigned to their school. But the images were shared on Snapchat, an app that deletes messages seconds after they’re viewed, and the adults couldn’t find them. The principal had doubts they even existed.

Among the kids, the pictures were still spreading. When the 13-year-old girl stepped onto the Lafourche Parish school bus at the end of the day, a classmate was showing one of them to a friend.

“That’s when I got angry,” the eighth grader recalled at her discipline hearing.

Fed up, she attacked a boy on the bus, inviting others to join her. She was kicked out of Sixth Ward Middle School for more than 10 weeks and sent to an alternative school. She said the boy whom she and her friends suspected of creating the images wasn’t sent to that alternative school with her. The 13-year-old girl’s attorneys allege he avoided school discipline altogether.

[–] SinningStromgald@lemmy.world 266 points 1 week ago (10 children)

So AI-generated nude images of underage girls being reported to police does not warrant any form of investigation?

[–] smeg@infosec.pub 239 points 1 week ago (1 children)

And AI generation vendors get a free pass for generating child porn

[–] SinningStromgald@lemmy.world 105 points 1 week ago (2 children)
[–] cecilkorik@piefed.ca 52 points 1 week ago

That would affect the economy, and profits, which as we know are much more important than morals, so unfortunately, we must allow it.

[–] hopesdead@startrek.website 34 points 1 week ago (1 children)

The Trump Administration is trying to make preventing this illegal.

[–] RustyShackleford@piefed.social 22 points 1 week ago (2 children)

Easy answer, we make unflattering AI porn of Ivanka. Make it impossible for her dad to enjoy.

[–] fakeman_pretendname@feddit.uk 30 points 1 week ago

I fear you're underestimating his depravity.

[–] BarneyPiccolo@lemmy.today 3 points 1 week ago

He'll insist on being added to the text chain.

[–] riskable@programming.dev 76 points 1 week ago (6 children)

The article states that the police investigated but found nothing. The kids knew how to hide/erase the evidence.

Are we really surprised, though? Police are about as effective at digital sleuthing as they are at de-escalation.

[–] ininewcrow@lemmy.ca 38 points 1 week ago (1 children)

Unless they can pull out their gun and shoot at something or someone ... or tackle someone ... they aren't very good at doing anything else.

[–] cyberwitch@reddthat.com 17 points 1 week ago

Literally verbatim what an officer said when we couldn't get a hold of animal control and he got sent over instead...

[–] Buelldozer@lemmy.today 25 points 1 week ago

The article states that the police investigated but found nothing.

You should have kept reading.

"Ultimately, the weeks-long investigation at the school in Thibodaux, about 45 miles (72 kilometers) southwest of New Orleans, uncovered AI-generated nude images of eight female middle school students and two adults, the district and sheriff’s office said in a joint statement.”

[–] mic_check_one_two@lemmy.dbzer0.com 24 points 1 week ago* (last edited 1 week ago) (1 children)

The article later states that they continued investigating, and found ten people (eight girls and two adults) who were targeted with multiple images. They charged two boys with creating and distributing the images.

It’s easy to jump on the ACAB bandwagon, but real in-depth investigation takes time. Time for things like court subpoenas and warrants, to compel companies like Snapchat to turn over message and image histories (which they do save, contrary to popular belief). The school stopped investigating once they discovered the kids were using Snapchat (which automatically hides message history) but police continued investigating and got ahold of the offending messages and images.

That being said, only charging the two kids isn’t really enough. They should charge every kid who received the images and forwarded them. Receiving the images by itself shouldn’t be punished, because you can’t control what other people spontaneously send you… But if they forwarded the images to others, they distributed child porn.

[–] wheezy@lemmy.ml -1 points 1 week ago

At the end of the day, these are children; there is no meaningful punishment that ends with just these boys being punished. Justice would be finding the source of whoever created these images. I'm honestly highly doubtful it was these kids alone. This really should bring under suspicion any adult in these boys' lives. An investigation that stops at punishing children for child sexual abuse material is not at all a thorough investigation.

It's possible these boys were able to generate these images on their own (meaning without help from anyone in their real-life interactions). But even if that was the case, the investigation should not stop there.

[–] pelespirit@sh.itjust.works 22 points 1 week ago

When the sheriff's department looked into the case, they took the opposite actions. They charged two of the boys who'd been accused of sharing explicit images — and not the girl.

[–] Zachariah@lemmy.world 16 points 1 week ago (1 children)

Oh, shit! Did they shoot the computer?

[–] drzoidberg@lemmy.world 23 points 1 week ago (1 children)

No it doesn't say that.

[–] Zachariah@lemmy.world 2 points 1 week ago

I think that one's okay. It's not black.

[–] Buelldozer@lemmy.today 47 points 1 week ago* (last edited 1 week ago) (1 children)

Your question was answered in the article but you clearly stopped at either the outrage bait headline or the outrage bait summary.

"Ultimately, the weeks-long investigation at the school in Thibodaux, about 45 miles (72 kilometers) southwest of New Orleans, uncovered AI-generated nude images of eight female middle school students and two adults, the district and sheriff's office said in a joint statement."

[–] echodot@feddit.uk 2 points 1 week ago (2 children)

That was the investigation by the police, not the school.

What we're asking is why the school didn't investigate given that the police had already been contacted.

[–] lightnsfw@reddthat.com 8 points 1 week ago (1 children)

I mean, the police are the proper individuals to be investigating CSAM. The school bringing them in immediately would have been the correct action. School officials aren't trained to investigate crime.

[–] BarneyPiccolo@lemmy.today 4 points 1 week ago (1 children)

Perhaps the cops are the proper investigative arm, but the school system had an obligation to assist in that investigation, and not ignore it, then deny it, then cover it up.

The entire leadership of the school should be fired, and the principal should be prosecuted.

[–] logi@lemmy.world 2 points 1 week ago (1 children)

Because a school can't compel Snapchat to release "disappeared" images and chat logs. So perhaps in this case it was best left to the police.

[–] echodot@feddit.uk 4 points 1 week ago (1 children)

It wasn't left to the police; she'd already gone to the police. It sounds from the story like the school did literally nothing at all.

Also, you don't need to compel Snapchat to release the images; they're 13-year-old boys, and they absolutely have permanent copies on their phones.

[–] yeather@lemmy.ca 1 points 1 week ago (2 children)

How can the school compel the boys to show the permanent copies then? I think you are overestimating the power of the school in this scenario.

[–] BarneyPiccolo@lemmy.today 2 points 1 week ago (1 children)

Saying there is nothing they can do is the standard cop-out for lazy administrators.

They are minors in school, under the legal supervision of the school. There are LOTS of things a school can do, and courts have been finding mostly on the side of schools for decades.

Without even trying, I can think of a dozen things the school could have done, including banning phones from the suspects until the investigation is over.

But they chose to do nothing, then punished the victim when she defended herself after the school refused to act.

[–] yeather@lemmy.ca 2 points 1 week ago (1 children)

Banning phones during the investigation does not give the administration evidence to work with. Even if they took the phones, the school still couldn’t force the students to unlock them. The only way to get the evidence needed was through the police.

[–] BarneyPiccolo@lemmy.today 0 points 1 week ago* (last edited 1 week ago) (1 children)

Okay, then permanent expulsion.

[–] yeather@lemmy.ca 2 points 1 week ago (1 children)

Which you need evidence to do. Evidence the school could not get.

[–] Buelldozer@lemmy.today 2 points 1 week ago* (last edited 1 week ago) (1 children)

These commenters just want to be outraged. If schools were suddenly confiscating phones and forcing students to unlock them, they'd be on here rabble-rousing about First Amendment rights, how schools are run like prisons, and how students aren't being respected.

You should know that the person you are going back and forth with has an exceptionally argumentative and unpleasant comment history.

[–] yeather@lemmy.ca 1 points 1 week ago

I know, I like to argue too. At least I did until I was banned from .ml for hurting their feelings.

[–] jj4211@lemmy.world 1 points 1 week ago

The school doesn't even need to do that to effectively squash suspected behavior in the short term.

Maybe they can't dole out a substantive punishment, but when I was growing up they absolutely would lean on kids for even being suspected of doing something, or even if they hadn't done it yet but the administration could see it coming. Sure, they might have wasted some time on kids who truly weren't up to anything, but there generally weren't actual punishments of consequence in those cases. I'm pretty sure that a few things were prevented entirely, just by the kids being told that the administration saw it coming.

So they should have at least been able to effectively suppress the student body behavior while they worked out the truth.

[–] klugerama@lemmy.world 33 points 1 week ago

What? RTFA. Two boys were charged by the sheriff's department. They didn't face any punishment from the school, but law enforcement definitely investigated.

[–] Lucidlethargy@sh.itjust.works 10 points 1 week ago (1 children)

Must be a majority republican police department.

[–] BarneyPiccolo@lemmy.today 3 points 1 week ago

It's Louisiana, what do you think?

I think you may have read the wrong article.

[–] Typhoon@lemmy.ca 3 points 1 week ago

No! Are you trying to get the perpetrator hired into Trump's cabinet?!

[–] juko_kun@sh.itjust.works -3 points 1 week ago (1 children)

I mean, law enforcement doesn't have enough resources to go after people making real CP.

What makes you think they can go after everyone making fake CP with AI?

[–] phoenixz@lemmy.ca 2 points 1 week ago

They do have resources, especially in the US. They do go after real CP, and people go to jail on a near-daily basis for it.

This, too, could have been investigated better, which is kind of the point of the article.

Why are you so okay with child pornography? Checking your message history really shows you're completely fine with CP, yet you really have it out for the victim.