this post was submitted on 29 Apr 2026
330 points (96.9% liked)
[–] Blemgo@lemmy.world 2 points 1 day ago

That is true. However, two things have to be considered here:

  1. LLMs are easily manipulated. If an LLM gives some advice, the person can easily spin the conversation so that the LLM concludes its own advice doesn't apply, even when it does. And for some people, admitting they have a problem is hard in the first place.
  2. LLMs can talk like a person, but they miss details about the person they're talking to, which makes their advice rather boilerplate and very hit or miss.

In contrast, people can overcome both hindrances: they can either help the other person realize the issues they're denying, or push them to try the advice anyway. Generally, our gift for reading small cues in how someone talks and behaves helps us communicate far more than we think.