this post was submitted on 23 Apr 2024
773 points (98.7% liked)

Technology
you are viewing a single comment's thread
[–] funn@lemy.lol 37 points 4 months ago (16 children)

I don't understand how Lemmy/Mastodon will handle similar problems: spammers crafting fake accounts to post AI-generated comments for promotions.

[–] FeelThePower@lemmy.dbzer0.com 23 points 4 months ago (14 children)

The only thing we reasonably have is security through obscurity. In terms of active users, we're something bigger than a forum but smaller than Reddit. If such a thing were to happen here, mods could probably handle it more easily (like when we had the Japanese-text spammer back then), but if it happened on a larger scale than what we have, it would be harder to deal with.

[–] BarbecueCowboy@kbin.social 15 points 4 months ago (2 children)

mods could handle it more easily probably

I kind of feel like the opposite. For a lot of instances, 'mods' are just a few guys who check in sporadically, whereas larger companies can mobilize full teams in times of crisis. It might take them a bit of time to spin things up, but there are existing processes to handle it.

I think spam might be what kills this.

[–] deweydecibel@lemmy.world 7 points 4 months ago

If a community is so small that its mod team can be that inactive, there's no incentive for a company to put any effort into spamming it like you're suggesting.

And if they do end up getting a shit ton of spam in there, and it sits around for a bit until a moderator checks in, so what? They'll just clean it up and keep going.

I'm not sure why people are so worried about this. It's been possible for bad actors to overrun small communities with automated junk for a very long time, across many different platforms, some that predate Reddit. It just gets cleaned up and things keep going.

It's not like if they get some AI produced garbage into your community, it infects it like a virus that cannot be expelled.

[–] FeelThePower@lemmy.dbzer0.com 2 points 4 months ago

Hmm, good point.
