this post was submitted on 17 Dec 2025
416 points (98.8% liked)

Fediverse
you are viewing a single comment's thread
[–] okamiueru@lemmy.world 0 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

I think you might have missed my point. I wasn't listing stuff I had trouble understanding. I was listing stuff that didn't make much sense. The distinction matters. Even if you manage to find some excuse that extends an already generous benefit of the doubt, the end result still isn't anything useful or informative.

I'm also not using fancy words (or..?). The only fancy thing that stands out is "Bloom filter", and that isn't a fancy word. It's just a thing, specifically a data structure. I referenced it because it's a telltale sign of an LLM behaving like the stochastic parrot that it is. LLMs don't know anything, and no transformer-based approach will ever know anything. The "filter" in "Bloom filter" carries associations to other "filters", even though it isn't a "filter" in any normal sense of that word. That's why you see "creator filter" in the same context as "bloom filter", even though "Bloom filter" is something no human expert would put there.
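For the curious: a Bloom filter is a probabilistic set-membership structure, not a "filter" in the everyday sense of something that removes items from a stream. A minimal sketch (the bit-array size and hash count below are arbitrary illustrative choices, not tuned values):

```python
import hashlib

class BloomFilter:
    """Probabilistic membership test: no false negatives, rare false positives."""

    def __init__(self, size=1024, num_hashes=3):
        self.size = size
        self.num_hashes = num_hashes
        self.bits = [False] * size

    def _positions(self, item):
        # Derive num_hashes independent positions by salting one hash function.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = True

    def might_contain(self, item):
        # True may be a false positive; False is always correct.
        return all(self.bits[pos] for pos in self._positions(item))

bf = BloomFilter()
bf.add("hello")
print(bf.might_contain("hello"))  # True
```

Note that you can never delete from a plain Bloom filter, and a "yes" answer only means "probably" — which is exactly why calling it a "filter" alongside things like a "creator filter" is a category error.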

The most amusing and annoying thing about AI slop is that it's loved by people who don't understand the subject. They take an observation of slop, made by people who do know the subject, and confuse it with "ah, you just don't get it" from people who don't.

I design and implement systems and "algorithms" like this as part of my job. Communicating them efficiently is also part of that job. If anyone had come to me with this diagram pre-2022, I'd have been genuinely concerned that they weren't OK, or had had some kind of stroke. After 2022, my LLM-slop radar is pretty spot on.

But hey, you do you. I needed to take a shit earlier and made the mistake of answering. Now I'm the idiot who should have known better. Look up Brandolini's law if you need an explanation of what I mean.