this post was submitted on 22 Sep 2025
97 points (86.5% liked)
Technology
you are viewing a single comment's thread
I think I would be offended if some robot came up to try to provide emotional support. It's fake, and it reminds me that our society values profit over human life. This should not be normalized as a necessity created by money that is instead going into the pockets of administrators, owners, insurers, and whoever else... this is pathetic.
Fr, fr. At least when a human fakes it, there is the possibility that they aren't faking it. A machine that is incapable of feeling loses all ambiguity. It's even emptier than just pretending to be sympathetic.
The example in the article is of a kid patient.
I have to admit that if I were a small child of 8-10 and Eve from Wall-E rolled up, I'd be giddy to play with it, but even then I wouldn't expect it to replace actual human contact.
Sure, I just can't help but imagine how I personally would react if this became common for all patients.
I certainly would be outraged to have a bot assigned to me, as an adult.
Worth trying with kids. They play with dolls and have imaginary friends, after all.