this post was submitted on 16 Sep 2024
36 points (97.4% liked)

Canada


Inside a bustling unit at St. Michael's Hospital in downtown Toronto, one of Shirley Bell's patients was suffering from a cat bite and a fever, but otherwise appeared fine β€” until an alert from an AI-based early warning system showed he was sicker than he seemed.

While the nursing team usually checked blood work around noon, the technology flagged incoming results several hours beforehand. That warning showed the patient's white blood cell count was "really, really high," recalled Bell, the clinical nurse educator for the hospital's general medicine program.

The cause turned out to be cellulitis, a bacterial skin infection. Without prompt treatment, it can lead to extensive tissue damage, amputations and even death. Bell said the patient was given antibiotics quickly to avoid those worst-case scenarios, in large part thanks to the team's in-house AI technology, dubbed Chartwatch.

"There's lots and lots of other scenarios where patients' conditions are flagged earlier, and the nurse is alerted earlier, and interventions are put in earlier," she said. "It's not replacing the nurse at the bedside; it's actually enhancing your nursing care."
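The article doesn't describe how Chartwatch works internally, but the scenario above boils down to screening incoming lab values against a reference range and alerting staff as soon as a result is abnormal, rather than waiting for the scheduled chart review. Below is a minimal sketch of that idea only; the function, the `wbc` field name, and the rule-based threshold are hypothetical illustrations, not Chartwatch's actual logic (which likely uses a trained model over many vitals and labs).

```python
# Hypothetical sketch of an early-warning check on incoming lab results.
# Chartwatch's real model is not public; the field name and threshold
# logic here are illustrative only.

NORMAL_WBC_RANGE = (4.0, 11.0)  # x10^9 cells/L, a typical adult reference range


def flag_abnormal_wbc(result: dict) -> "str | None":
    """Return an alert message if a white-blood-cell count is out of range."""
    wbc = result.get("wbc")
    if wbc is None:
        return None  # no WBC value in this result; nothing to flag
    low, high = NORMAL_WBC_RANGE
    if wbc > high:
        return f"ALERT: WBC {wbc} above normal range ({low}-{high})"
    if wbc < low:
        return f"ALERT: WBC {wbc} below normal range ({low}-{high})"
    return None


# A markedly elevated count, like the one in the article, triggers an alert
# the moment the result arrives, instead of at the scheduled chart check.
print(flag_abnormal_wbc({"wbc": 25.3}))
```

The point of even this toy version is timing: the check runs when the result lands, not when a nurse next opens the chart, which is the gap the article says the system closed.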

[–] delirious_owl@discuss.online -2 points 1 month ago (1 children)

This is how you get worse racism in hospitals

[–] girlfreddy@lemmy.ca 2 points 1 month ago (1 children)
[–] delirious_owl@discuss.online 2 points 1 month ago* (last edited 1 month ago) (1 children)

Black people are more likely to die (due to systemic racism), so AI says: save the white person.

We saw this a lot at the height of the pandemic, which is why many nurses argued that the best triage method was random selection.

As always, the problem isn't that AI exists. The problem is that humans trust its output and use it to make decisions (and the laws in many jurisdictions still allow them to).

[–] girlfreddy@lemmy.ca 3 points 1 month ago* (last edited 1 month ago)

But this isn't generative AI, where AI creates an outcome. It simply notified the staff OF the outcome of human-performed tests.

I get AI is scary. We should be wary of how much control we give it. But in this case it had no control over any outcome.