this post was submitted on 25 Jul 2023
Not the best news in this report. We need to find ways to do more.

[–] MyFairJulia@lemmy.world 1 points 2 years ago (3 children)

Why would someone downvote this post? We have a problem and it's in our best interest to fix that.

[–] Aesthesiaphilia@kbin.social 1 points 2 years ago

Because it's another "WON'T SOMEONE THINK OF THE CHILDREN" hysteria bait post.

They found 112 images of cp in the whole Fediverse. That's a very small number. We're doing pretty good.

[–] chaogomu@kbin.social 1 points 2 years ago (1 children)

The report (if you can still find a working link) said that the vast majority of material that they found was drawn and animated, and hosted on one Mastodon instance out of Japan, where that shit is still legal.

Every time that little bit of truth comes up, someone reposts the broken link to the study, screaming about how it's the entire Fediverse riddled with child porn.

[–] MyFairJulia@lemmy.world 0 points 2 years ago (1 children)

So basically we had a bad apple that was probably already defederated by everyone else.

[–] ZILtoid1991@kbin.social 0 points 2 years ago (1 children)

It's Pawoo, Pixiv's (formerly) own instance, which is infamous for this kind of content, and those are still "just drawings" (unless some artists are using illegal real-life references).

[–] dustyData@lemmy.world 0 points 2 years ago (1 children)

They're using generative AI to create photorealistic renditions now, causing a moral crisis for everyone who finds out about it.

[–] ZILtoid1991@kbin.social 0 points 2 years ago (1 children)

Well, that's a very different and way more concerning thing...

[–] Derproid@lemm.ee 1 points 2 years ago (1 children)

... I mean ... idk ... If the argument is that the drawn version doesn't harm kids and gives pedos an outlet, is an AI-generated version any different?

[–] brain_pan@infosec.pub 1 points 2 years ago (1 children)

imo, the dicey part of the matter is "what amount of the AI's dataset is made up of actual images of children"

[–] Derproid@lemm.ee 1 points 2 years ago

Shit that is a good point.

[–] shrugal@lemm.ee 0 points 2 years ago* (last edited 2 years ago) (1 children)

The study doesn't compare its findings to any other platform, so we can't really tell whether those numbers are good or bad. They just state the absolute numbers, without going into too much detail about their search process. So no, you can't conclude that the Fediverse has a CSAM problem, at least not from this study.

Of course that makes you wonder why they bothered to publish such a lackluster and alarmist study.

[–] bumblebrainbee@lemmy.ml -1 points 2 years ago* (last edited 2 years ago)

Pretty sure any quantity of CSAM that isn't zero is bad....