this post was submitted on 10 Nov 2023 to the Machine Learning community

I made an LLM therapist by fine-tuning an open-source model on custom-written and collected Cognitive Behavioral Therapy (CBT) sessions. The data contains conversations that illustrate how to apply CBT techniques, including cognitive restructuring and mindfulness.
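For anyone curious about the general recipe, here is an illustrative sketch of supervised fine-tuning with LoRA adapters via Hugging Face transformers and peft. The base model, file paths, toy transcript, and hyperparameters are placeholders, not my actual setup:

```python
# Minimal illustrative sketch: LoRA fine-tuning of an open-source causal LM
# on CBT-style conversation transcripts. All names and values are placeholders.
from datasets import Dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

base_model = "mistralai/Mistral-7B-v0.1"   # hypothetical choice of open-source base
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(base_model)

# Wrap the base model with LoRA adapters so only a small set of weights is trained.
model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32, task_type="CAUSAL_LM"))

# Each session is flattened into one training string; real data would be the
# custom-written/collected CBT transcripts described above.
sessions = [
    "Client: I always mess everything up.\n"
    "Therapist: That sounds like an all-or-nothing thought. "
    "Can you recall a recent situation where something went well?"
]

def tokenize(example):
    return tokenizer(example["text"], truncation=True, max_length=1024)

dataset = Dataset.from_dict({"text": sessions}).map(tokenize, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="cbt-lora", num_train_epochs=3,
                           per_device_train_batch_size=1, learning_rate=2e-4),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("cbt-lora")   # adapters can later be merged or served
```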

It is mostly focused on asking insightful questions. Note: it is not a production-ready product; I am still testing it and gathering feedback.

You can access it here: https://poe.com/PsychologistLuna

Sorry that it is on Poe, but this was much faster than building my own mobile-friendly website.

Since it is an LLM, it is prone to hallucinating or giving responses that might be perceived as rude. Please use it with caution.

[–] Ronny_Jotten@alien.top 1 points 1 year ago

How is it all that different from writing a self-help book, though, in a legal sense? Why is there any greater liability or ethical issue? There are millions of such books and articles published, full of all sorts of nonsense. A mentally unstable person might follow one and experience a bad outcome, but that's no reason to stop releasing books; a simple disclaimer seems to suffice. I understand that the experience of an LLM is not the same thing as reading a book, but it is in a sense just indexing and summarizing many texts in an interactive way. Why do you think it would be treated differently under the law? Are there specific laws that apply to LLMs but not to books?

One obvious difference is the data: you hardly give away any personal information when buying a book, but you would have to expose your deepest secrets to this system. If the system is then trained on interactions with patients, it may even leak that data through its responses in the future.
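To make that concern concrete, here is a hypothetical redaction pass one would want before any transcript reaches a training set. The regex patterns are deliberately simplistic illustrations, not a real anonymization solution and not part of the system described in the post:

```python
# Hypothetical illustration: scrubbing obvious identifiers from session
# transcripts before fine-tuning. Patterns are simplistic examples only.
import re

PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
    "NAME_INTRO": re.compile(r"\bmy name is\s+([A-Z][a-z]+)", re.IGNORECASE),
}

def redact(text: str) -> str:
    """Replace recognizable personal details with placeholder tags."""
    text = PATTERNS["EMAIL"].sub("[EMAIL]", text)
    text = PATTERNS["PHONE"].sub("[PHONE]", text)
    text = PATTERNS["NAME_INTRO"].sub("my name is [NAME]", text)
    return text

print(redact("Hi, my name is Anna, reach me at anna@example.com or +1 555 123 4567."))
# -> "Hi, my name is [NAME], reach me at [EMAIL] or [PHONE]."
```

Even with scrubbing like this, a fine-tuned model can still memorize and regurgitate rare phrasings from its training conversations, which is exactly where the comparison with a static book breaks down.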