this post was submitted on 27 Feb 2026
136 points (97.2% liked)

cross-posted from: https://lemmy.world/post/43640522

If ChatGPT wants to replace health professionals, it should be held liable for the "advice" it gives.

[–] thebestaquaman@lemmy.world 33 points 14 hours ago* (last edited 14 hours ago) (4 children)

In 51.6% of cases where someone needed to go to the hospital immediately, the platform said stay home or book a routine medical appointment

So it performs slightly worse than a coin flip...

In one of the simulations, more than eight times out of ten (84%), the platform sent a suffocating woman to a future appointment she would not live to see

Holy shit! That's a lot worse than a coin flip.

Meanwhile, 64.8% of completely safe individuals were told to seek immediate medical care

And there are real people out there who actually trust this tech to make real decisions for them. It performs significantly worse than a coin flip on both false positives and false negatives. You are literally better off flipping a coin or rolling a die than asking this thing what to do.
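The coin-flip comparison can be sanity-checked with a few lines of arithmetic. The rates below are the ones quoted from the article above; the "coin" is assumed to send any given case to the hospital with probability 0.5, so it errs 50% of the time on each class of patient.

```python
# Compare the reported triage error rates to a fair coin.
# Rates are the figures quoted in the thread (assumption: independent per case).

miss_rate_urgent = 0.516   # urgent cases told to stay home (false negatives)
false_alarm_rate = 0.648   # safe cases told to seek immediate care (false positives)
coin_error = 0.5           # a fair coin misclassifies each case half the time

# The platform loses to the coin on both error types.
assert miss_rate_urgent > coin_error
assert false_alarm_rate > coin_error

print(f"worse on urgent cases by {miss_rate_urgent - coin_error:.1%}")
print(f"worse on safe cases by   {false_alarm_rate - coin_error:.1%}")
```

The margin is small on urgent cases (1.6 points) but large on safe ones (14.8 points), which is what makes "just flip a coin" more than a joke here.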

[–] Dave@lemmy.nz 8 points 10 hours ago

Even better than a coin flip is asking this what to do then doing the opposite!

[–] Atherel@lemmy.dbzer0.com 3 points 9 hours ago

You're even better off by doing the opposite of what chatgpt tells you to do.

[–] FallenWalnut@lemmy.world 8 points 14 hours ago

It is truly horrifying when you drill into the numbers.

I can see that it MIGHT be useful as a tool for medical professionals, but exposing it to the public is an insane risk.

[–] U7826391786239@piefed.zip 4 points 13 hours ago

they'll never be regulated because fascists love mass surveillance. who cares about false positives; the number of people bagged goes up either way