I mean, Lemmy used to have a big issue with CSAM being spammed by malicious users. Many people believed it was pro-Reddit trolls, because it started right around the time of the giant API debacle. It was a huge liability for instance owners, because their servers would automatically cache the content, and they could be held liable for CSAM being present on their hardware. It took a few months of dev time to add moderation tools, blocklists, automod support, etc. before things finally calmed down to the point that instance owners felt comfortable again.
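For illustration, here's a minimal sketch of the kind of blocklist filtering described above: hash incoming federated media and reject anything on a known-bad list before it ever hits the local cache. This is a hypothetical example, not Lemmy's actual implementation; `should_cache` and `BLOCKED_HASHES` are invented names.

```python
import hashlib

# Hypothetical set of hashes of known-bad media, e.g. populated from a
# shared blocklist feed that instance admins subscribe to.
BLOCKED_HASHES: set[str] = set()

def should_cache(media_bytes: bytes) -> bool:
    """Return True if incoming federated media may be cached locally.

    Hashing the raw bytes and checking a blocklist is the simplest
    form of this kind of automod filter; real systems also use
    perceptual hashes so re-encoded copies of the same image still
    match.
    """
    digest = hashlib.sha256(media_bytes).hexdigest()
    return digest not in BLOCKED_HASHES

# Usage: drop the upload instead of writing it to the media cache.
incoming = b"raw image bytes from a federated post"
if not should_cache(incoming):
    print("rejected: media matches blocklist")
```

The point of running the check before caching is exactly the liability issue above: blocked content is never written to the instance owner's disk in the first place.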
By your logic, every single user on the instances that got spammed should be banned. Even if they never saw it or interacted with it in any way, they're still personally responsible for it. After all, personal responsibility doesn't stop existing in a large group of people.
Lemmy is celebrating cracking 50,000 users. Discord has 200 million MAU. They take a broader approach to punishment because it's the only feasible way to avoid legal problems.
IIRC some Lemmy instances were defederated at the time for poor moderation, and nobody complained. It's a reasonable approach to avoiding liability.
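Mechanically, defederation amounts to something very simple: the instance refuses activities originating from blocked domains before any content is processed or cached. A rough sketch, again with hypothetical names rather than Lemmy's actual code:

```python
from urllib.parse import urlparse

# Hypothetical admin-maintained list of defederated instances.
BLOCKED_INSTANCES = {"badly-moderated.example", "spam-source.example"}

def accept_activity(actor_id: str) -> bool:
    """Reject federated activity whose actor belongs to a blocked instance."""
    domain = urlparse(actor_id).hostname or ""
    return domain not in BLOCKED_INSTANCES

# Usage: an inbox handler drops the activity early, so nothing from a
# blocked instance is ever stored or cached locally.
print(accept_activity("https://badly-moderated.example/u/anon"))  # False
```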