Mom horrified by Character.AI chatbots posing as son who died by suicide - Ars Technica
(arstechnica.com)
A nice place to discuss rumors, happenings, innovations, and challenges in the technology sphere. We also welcome discussions on the intersections of technology and society. If it’s technological news or discussion of technology, it probably belongs here.
Remember the overriding ethos on Beehaw: Be(e) Nice. Each user you encounter here is a person, and should be treated with kindness (even if they’re wrong, or use a Linux distro you don’t like). Personal attacks will not be tolerated.
This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.
This is referring to a bot designed to help people struggling with mental health, and it's actually a big one. That number is way too low.
“hey, I know you feel like killing yourself, but if it happens then we’ll just replace you with a shitty bot” probably isn’t as helpful as they thought it would be. It’s violating and ghoulish.
I hate this attitude of "well, if you can't get a professional therapist, figure out how to get one anyway". There needs to be an option for people who either can't afford or can't access a therapist. I would have loved for AI to fill that gap. I understand it won't be as good, but in many regions the waitlist for therapy is far too long, and something is better than nothing.
I would have loved AI to fill that need as well, but it's not an adequate tool for the job.