this post was submitted on 09 Mar 2025
13 points (93.3% liked)

Asklemmy


A loosely moderated place to ask open-ended questions

 

I got baited into posting a picture of a child eating popcorn on Discord, not knowing it was associated with CSAM. The account got banned, but I don't care about the account so much as the legal consequences. Has anyone heard of legal action against people posting it?

top 11 comments
[–] dawnslayer@lemmy.world 3 points 8 hours ago

Can someone explain this to me, cus huh?

[–] 1rre@discuss.tchncs.de 2 points 8 hours ago

Websites have false positives all the time, and while it sucks, it's infeasible for them to have human reviewers check everything; with this kind of content, false positives are better than false negatives. What isn't acceptable is that the appeals process uses the exact same models as the flagging process, so it reproduces the exact same false positives and false negatives.

Pic related: it was one of the first cases to reveal how broken the appeals process on most social media platforms is.

[image attachment: 1000046823]
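
To make that concrete, here's a minimal sketch (all names and hash values are hypothetical) of why an appeal that simply re-runs the same deterministic model can never overturn a false positive:

```python
# Minimal sketch of why re-running the same model on appeal can't
# overturn a false positive. All names and values are hypothetical.

KNOWN_BAD_HASHES = {"a1b2c3"}  # stand-in for the platform's hash database

def flag(image_hash: str) -> bool:
    # The flagging model: a fixed function of its input, so it returns
    # the same verdict every time it runs on the same image.
    return image_hash in KNOWN_BAD_HASHES

def appeal(image_hash: str) -> bool:
    # If the "appeal" just re-invokes the same model instead of adding
    # a human reviewer, it is guaranteed to uphold the original flag.
    return flag(image_hash)

assert flag("a1b2c3") == appeal("a1b2c3")  # identical result, by construction
```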

[–] andrewta@lemmy.world 13 points 1 day ago (1 child)

No. I see no way to prosecute over popcorn. If a country actually did, I would find the exit. Fast!

Not condoning child abuse

But popcorn?

[–] hoshikarakitaridia@lemmy.world 4 points 1 day ago (1 child)

Yeah.

Discord needs to moderate, so they ban and thereby fulfill their legal obligations.

If it was CSAM and Discord thinks it's bad enough, they will probably forward the information to the authorities.

Now, if the authorities think it's worth an investigation and give it the proper priority, they will start one. If the investigation concludes and they still think you done goofed badly enough, they will pursue you under criminal law.

See how many ifs there are and how many people have to sign off on it? There's quadruple human review at minimum in there, and there's no way they think they can win on those charges when the evidence is goddamn popcorn.

Also, you can appeal a ban. I got auto-banned on Discord about 2 months ago and I appealed, because I know for a fact I did nothing wrong - I was literally asleep, and my last messages didn't even contain profanity. I was so mad because that account is important to me. To their credit, they reinstated it in a matter of hours. Still, could've done without the heart attack.

TL;DR you're more than safe as long as it wasn't actual CSAM.

Yeah - for example, I found the exact same image posted on Twitter two years ago and it's still up.

[–] felixsu7@sh.itjust.works 7 points 1 day ago* (last edited 1 day ago)

Helpful video: https://www.youtube.com/watch?v=Kyc_ysVgBMs

But basically, the picture was a cropped frame from CSAM content, so their systems thought you were posting CSAM when you weren't.

As for the legal consequences: I am not a lawyer, but I don't think you will be visited by the police anytime soon, since the picture you posted isn't CSAM by itself, just a cropped portion that doesn't contain the material itself.

Edit: as someone said, it goes through multiple human reviews.

[–] AfricanExpansionist@lemmy.ml 3 points 1 day ago (1 child)

How does eating popcorn == CSAM?

[–] SpatchyIsOnline@lemmy.world 7 points 1 day ago (1 child)

From what other people have said, and from the occasional video that's popped up on YouTube, Discord has a library of known CSAM that its automated systems match images against, and there are certain individuals who try to bait people into posting seemingly innocent pictures that are actually frames from said videos. Discord's systems see that the image is a frame from such material and will auto-ban the account.
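
Discord's actual matching pipeline isn't public, but as a rough sketch of the general technique, here's how perceptual-hash matching against a library of known frames could look, using the open-source Pillow and imagehash libraries (the known-bad hash below is made up):

```python
# Hypothetical sketch of perceptual-hash matching against a library of
# known frames; requires `pip install pillow imagehash`. Real systems
# (e.g. PhotoDNA) use different, proprietary hashes.
from PIL import Image
import imagehash

# Hashes of known-bad frames, as a platform might store them
# (this value is made up for illustration).
KNOWN_BAD = {imagehash.hex_to_hash("ffd8e0c0c0e0f8ff")}

MAX_DISTANCE = 5  # Hamming-distance threshold: small edits still match

def is_flagged(path: str) -> bool:
    h = imagehash.average_hash(Image.open(path))
    # Subtracting two ImageHash objects gives the Hamming distance
    # (number of differing bits), so near-duplicates also trigger.
    return any(h - bad <= MAX_DISTANCE for bad in KNOWN_BAD)
```

Because perceptual hashes of visually similar images differ in only a few bits, a cropped or re-encoded frame can still land within the distance threshold and get flagged.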

[–] AfricanExpansionist@lemmy.ml 5 points 1 day ago (1 child)

This is fascinating and I have a bunch of questions, basically all centered around the fact that possession of such content is outlawed. I don't expect OP to know, but maybe someone else does:

Isn't it illegal to have a library of such content? Is there a legal carve-out for that, like Coca-Cola importing cocaine?

How is the library compiled, maintained, and added to?

Is the library specific to Discord or is it a shared library maintained by some centralized "authority" or developer? If it's specific to Discord then can we assume there are many different libraries of illegally produced and possessed content compiled and maintained by various social media companies? Who's got that job? Do they get therapy in their benefits package?

As far as I understand, they use a tool called PhotoDNA (an image-hashing technology developed by Microsoft) to scan pictures.

[–] Inf_V@kbin.earth 4 points 1 day ago

No, they won't send the police after you. If they wanted to, you wouldn't be online right now. It's just their stupid AI auto-flagging things.