this post was submitted on 20 Oct 2023
489 points (98.2% liked)


‘It scars you for life’: Workers sue Meta claiming viewing brutal videos caused psychological trauma::More than 20% of the staff Meta hired to check the violent content of Facebook and Instagram are on sick leave due to psychological trauma.

[–] dreadedsemi@lemmy.world 109 points 1 year ago (2 children)

Couldn't they hire from watchpeopledie or nothingtoxic or ebaum? Those users would probably do the overtime for free.

[–] brsrklf@jlai.lu 76 points 1 year ago (2 children)

People who are completely desensitized to that kind of stuff would probably not be very good at moderating it, really.

Also, this is a terrible job, and I'd be very worried about a company paying and enabling people who find it fun. It's horrible, but trauma is the normal outcome.

[–] WhatAmLemmy@lemmy.world 36 points 1 year ago (4 children)

Sounds like the perfect job for AI

[–] nodsocket@lemmy.world 14 points 1 year ago

They used AI to flag the images, but a human still had to search through them.

[–] brsrklf@jlai.lu 9 points 1 year ago

I'm the kind of person who is very wary about what should or shouldn't be an AI's job, and you know what? In this very particular case, I think I agree.

At least as a first filter, anyway.
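The "first filter" idea the comments describe could be sketched roughly like this: an automated classifier scores each item, auto-actions the clear cases at both extremes, and routes only the uncertain middle band to a human reviewer. This is a hypothetical illustration, not Meta's actual pipeline; the `classify_severity` function and both thresholds are made-up stand-ins for a real trained model.

```python
def classify_severity(content: str) -> float:
    """Placeholder for a trained model: returns a 0..1 'violent content' score.
    A real system would run an image/video classifier, not keyword matching."""
    violent_markers = ("gore", "attack", "graphic")
    hits = sum(marker in content.lower() for marker in violent_markers)
    return min(1.0, hits / len(violent_markers) + 0.1 * hits)

def triage(content: str, remove_above: float = 0.9, allow_below: float = 0.2) -> str:
    """Auto-remove obvious violations, auto-allow obviously safe posts,
    and send only the ambiguous middle band to a human reviewer."""
    score = classify_severity(content)
    if score >= remove_above:
        return "auto_remove"
    if score <= allow_below:
        return "auto_allow"
    return "human_review"

print(triage("cute cat video"))               # auto_allow
print(triage("graphic gore attack footage"))  # auto_remove
print(triage("gore video"))                   # human_review
```

The point of the middle band is exactly what the thread suggests: humans only ever see the cases the model can't confidently decide, which shrinks (but doesn't eliminate) their exposure to the worst material.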

[–] ThePrivacyPolicy@lemmy.ca 4 points 1 year ago

Huge industries are emerging in this field right now, for everything from this type of social media moderation to fighting CSAM more effectively, so humans aren't the frontline for that kind of material. This is one area where I can really, really get behind AI and see a valid use case that isn't just marketing hype like so many others. I know there's some great stuff happening, just based on my own field of employment and being close to a few things in the works this year.

[–] lemann@lemmy.one 1 points 1 year ago

I feel sorry for whichever researchers are in charge of training and fine tuning those models.... ouch

[–] fadingembers@lemmy.blahaj.zone 19 points 1 year ago (3 children)

Honestly, I don't see an issue with it. If they can tell the difference between an image that should be moderated and one that shouldn't, they can do the job, and I seriously doubt the vast majority of people desensitized to that kind of content can't tell the difference. That's like arguing we shouldn't make graphic games or movies because people won't be able to tell the difference between them and reality. Not everyone can do every job; these people would be a perfect fit for this one, and we would spare others from getting hurt.

[–] odelik@lemmy.today 15 points 1 year ago

Desensitized doesn't necessarily mean somebody doesn't have reactions to something. It just means they can compartmentalize those reactions, press forward, and deal with the ramifications later.

EMTs, ER doctors, and nurses are largely desensitized to graphic trauma and can press through and get the job done. But that doesn't mean they don't process those scenes later, in both healthy and unhealthy ways (there are a few studies out there showing that ER staff have higher rates of alcoholism and substance abuse than the general public).

Trauma is trauma, whether you're desensitized or not.

[–] ParsnipWitch@feddit.de 6 points 1 year ago* (last edited 1 year ago)

It would be highly unethical but interesting research to see whether those people experience long-term consequences nevertheless, or whether being desensitized really does confer immunity.

[–] brsrklf@jlai.lu 5 points 1 year ago

Except, you know, we're talking about people who have been progressively desensitized to reality. So no, that's not comparable at all.

[–] killeronthecorner@lemmy.world 18 points 1 year ago (1 children)

You're not thinking awful enough

[–] dreadedsemi@lemmy.world 2 points 1 year ago

I've seen users laugh at horrific gore videos on some forums. I'm not sick, but I was curious at one point and googled.