maxenmajs@lemmy.world 60 points 5 months ago

Isn't the model fundamentally flawed if it can't appropriately present arbitrary results? It is operating at a scale where human workers cannot catch every concerning result before users see them.

The ethical thing to do would be to discontinue this failed experiment. The way it presents results is demonstrably unsafe. It will continue to present satire and shitposts as suggested actions.

brbposting@sh.itjust.works 4 points 5 months ago

It won’t get people killed very often at all. Statistically, there’s like no way you’ll know anybody who dies from taking a hallucinated suggestion. Give some thought to the investors who thought long and hard about how much money to put in. They worked hard, and if a couple of people a year have to die because of it, how is that a bad trade-off?

-kinda how it literally almost is, unless the hubris is stronger than I imagine