this post was submitted on 02 Aug 2025
44 points (100.0% liked)

Health - Resources and discussion for everything health-related


Health: physical and mental, individual and public.

Discussions, issues, resources, news, everything.

See the pinned post for a long list of other communities dedicated to health or specific diagnoses. The list is continuously updated.

Nothing here shall be taken as medical or any other kind of professional advice.

Commercial advertising is considered spam and not allowed. If you're not sure, contact mods to ask beforehand.

Linked videos without original description context by OP to initiate healthy, constructive discussions will be removed.

Regular rules of lemmy.world apply. Be civil.

founded 2 years ago

Experts warn that average individuals can now experience the same sycophant-induced delusions as billionaires

top 6 comments
[–] IAmNorRealTakeYourMeds@lemmy.world 5 points 4 months ago* (last edited 4 months ago)

reminder not to blame desperate, vulnerable people who are in dire need of help that is often unaffordable or unavailable.

blame the corporations who are providing this for profit and with little to no safeguards.

[–] ninjabard@lemmy.world 3 points 4 months ago (1 children)
[–] Angry_Autist@lemmy.world 2 points 4 months ago

Depends on the user. I used DungeonAI back in the day to get over losing a family member, and it was fine

On the other hand, my neighbor is convinced that some Character AI loves him sincerely and that he birthed the first ever conscious AI. He barely talks to his wife of 20 years anymore

[–] ExtremeDullard@lemmy.sdf.org -3 points 4 months ago* (last edited 4 months ago) (2 children)

Human shrinks, just like AI chatbots, are experts at slick-talking BS and know how to manipulate people.

The difference is, most human shrinks mean well and do try to help, while most AI chatbots are run by greedy monopolistic Big Data for-profits whose sole purpose is to "increase engagement".

[–] chirospasm@lemmy.ml 5 points 4 months ago* (last edited 4 months ago)

I would suggest that counselors / therapists do, in fact, have backgrounds -- educational and experiential -- that support the 'slick-talking BS' you describe, but that it is only slick-talking BS if you aren't willing to consider the benefit you get from relating to them in the way they were trained to relate.

This is important because the 'relating' has more of an impact on you socially than the 'slick-talk.' It's the 'human-to-human' part that sticks with us longer than self-help books, prompts us to be open and considerate toward change, and even supports our eventual ability to understand ourselves a little better.

There is no 'relating' to an LLM. The LLM is weighted, in fact, to provide positive responses that satisfy the request in your text-based prompt.

If, in an LLM therapy session, I suddenly flip the script and write, 'Now pretend you are a far less confrontational therapist who understands my feelings on X or Y that we've been talking about, and who doesn't want to press me on it as much,' then I am no longer even superficially attempting to 'relate.' The cosplay of therapy is ripped away.

The 'relationship' part of therapy cannot happen authentically with an LLM if I can still control the outcome.

[–] Angry_Autist@lemmy.world 3 points 4 months ago

Tell me you've never been to therapy without saying you've never been to therapy