[–] captainlezbian@lemmy.world 9 points 20 hours ago (1 children)

And I'd like independent studies to prove it's better than nothing before I'd recommend it to replace nothing, especially when self-guided mental health solutions such as meditation exist.

[–] LadyButterfly@reddthat.com -4 points 20 hours ago (2 children)

I don't see how nothing would be better than someone using a good-quality AI to, for example, ground them during a panic attack.

[–] VeloRama@feddit.org 5 points 20 hours ago (1 children)

AI will not ground you; it will reinforce what you already believe. That's why it's very dangerous for "therapeutic" use.

[–] LadyButterfly@reddthat.com -2 points 19 hours ago (1 children)
[–] pinball_wizard@lemmy.zip 1 points 2 hours ago* (last edited 34 minutes ago)

Just FYI, there are experts in this thread telling you it doesn't depend on which one.

Yes, some are worse than others. Yes, some have trivial safeguards added for the worst known risks.

But no, none of them are remotely safe for use with self-guided therapy.

As others have mentioned, anyone doing so would be much better off pirating or shoplifting the appropriate books directly.

Even responsible people using AI for expert knowledge run a real risk from the way the AI jumps straight to the answer it thinks they want, ignoring all other available answers. :(

Edit: Sorry, I missed the context you were addressing. Yes! Certainly no one deserves the sucky consequences that can come with these tools just for seeking help!

[–] captainlezbian@lemmy.world 4 points 20 hours ago (1 children)

Because nothing doesn't run the risk of encouraging catastrophizing, acting on your heightened emotions, or coming to irrational conclusions. If it's consistently able to avoid those things for a variety of people, that's great. But as someone who had to learn to control her panic attacks, I absolutely can see advice and recommendations that are worse than nothing.

And yeah, given LLMs' track record with psychosis, delusions, and suicidality, I don't trust any of the technology compared to nothing, despite knowing how difficult nothing is for panic attacks.

[–] LadyButterfly@reddthat.com 0 points 19 hours ago (1 children)
[–] captainlezbian@lemmy.world 2 points 19 hours ago (1 children)

That's fair, but given the way the technology actually works, I stand by my position that there is very real potential for harm, and that there are safer alternatives that are similarly accessible. If studies show it's safe and helpful, that's cool, but at this moment I'd strongly discourage any loved one who's interested in using an LLM for this purpose and would instead point them towards other resources.

[–] LadyButterfly@reddthat.com 1 points 17 hours ago (1 children)

Yes, I see your point, and I agree that actual therapy is dramatically safer and better.

[–] captainlezbian@lemmy.world 2 points 4 hours ago

Yeah, and I guess my point is that so is pirating a CBT workbook and trying it yourself.