[–] alsaaas@lemmy.dbzer0.com 58 points 1 week ago (7 children)

If you have legit delusions about chatbot romantic partners, you need therapy like a year ago

[–] TragicNotCute@lemmy.world 35 points 1 week ago (1 children)
[–] TheReturnOfPEB@reddthat.com 18 points 1 week ago* (last edited 1 week ago) (1 children)
[–] Aceticon@lemmy.dbzer0.com 4 points 1 week ago* (last edited 1 week ago)

According to the AI therapist, both are "absolutely right" even when contradicting each other.

[–] Rhaedas@fedia.io 21 points 1 week ago

If we had better systems in place to help everyone who needs it, this probably wouldn't be a problem. Telling someone they need therapy isn't helpful; it just acknowledges that we aren't aiding the ones who need it when they need it most.

I'll go further and say that anyone who thinks any of these AIs really are what they're marketed as needs help too, in the form of education about what is and isn't possible. That covers all instances, not just the romantic variety.

[–] Mk23simp@lemmy.blahaj.zone 13 points 1 week ago (1 children)

Careful, you should probably specify that therapy from a chatbot does not count.

[–] DragonTypeWyvern@midwest.social 10 points 1 week ago

"help I've fallen in love with my therapist!" recursive error

[–] Aceticon@lemmy.dbzer0.com 3 points 1 week ago

I don't think therapy can cure Stupid.

Odds are, people who have delusions about romantic partners thanks to the ELIZA effect are either too poor to afford professional help or too resistant to seek it.

[–] morrowind@lemmy.ml 0 points 1 week ago (1 children)

I don't think people with AI girlfriends have delusions of them being human or whatever. They know it's AI, though they may ascribe some human feeling that isn't there.

But also, at the end of the day, maybe it doesn't matter to them as long as the model can still provide emotional support.

[–] zalgotext@sh.itjust.works 7 points 1 week ago (1 children)

There will come a time when your AI girlfriend's context window fills up and its responses become increasingly unhinged and nonsensical. The average person doesn't know to expect that, though, so it probably is pretty harmful when someone's emotional support robot suddenly goes insane.
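
For the curious, here's a minimal sketch of the sliding-window trimming that many chat frontends fall back on once the budget is hit (the names and word-based token counting are made up for illustration, not any particular vendor's API):

```python
# Minimal sketch (illustrative names, not a real vendor API) of the
# sliding-window trimming many chat frontends apply once the context
# budget is exceeded. Everything older than the window is silently
# dropped, which is why long conversations start losing the plot.

MAX_CONTEXT_TOKENS = 4096  # made-up budget; real limits vary by model


def count_tokens(message: str) -> int:
    # Crude stand-in for a real tokenizer: one word ~ one token.
    return len(message.split())


def trim_history(history: list[str]) -> list[str]:
    """Keep only the most recent messages that fit in the window."""
    kept: list[str] = []
    budget = MAX_CONTEXT_TOKENS
    for message in reversed(history):  # walk newest-first
        cost = count_tokens(message)
        if cost > budget:
            break  # everything earlier than this is forgotten
        kept.append(message)
        budget -= cost
    return list(reversed(kept))  # restore chronological order
```

Everything the model once "knew" about you from early in the chat simply falls out of that window, which is what the degradation looks like from the outside.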

[–] Test_Tickles@lemmy.world 1 point 1 week ago (1 children)

> There will come a time when your AI girlfriend's context window fills up and its responses become increasingly unhinged and nonsensical.

Wait... so I have already been dating AIs and didn't even know it? This explains a lot.

[–] TheBat@lemmy.world 3 points 1 week ago

Have you asked her to do simple arithmetic calculations? LLMs are notoriously unreliable at that.