this post was submitted on 01 Jun 2024
1613 points (98.6% liked)

[–] suction@lemmy.world 45 points 5 months ago (2 children)

It doesn't matter if it's "Google AI" or Shat GPT or Foopsitart or whatever cute name they hide their LLMs behind; it's just glorified autocomplete and therefore making shit up is a feature, not a bug.
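
Loosely, the "autocomplete" part is literal: a causal LLM only ever scores the next token and appends it, with no fact-checking step anywhere in the loop. A minimal sketch with Hugging Face transformers (gpt2 and the prompt are purely illustrative, no relation to any product named above):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

ids = tok("The first person to walk on the moon was", return_tensors="pt").input_ids
with torch.no_grad():
    for _ in range(12):
        logits = model(ids).logits[:, -1, :]            # scores for the next token only
        probs = torch.softmax(logits, dim=-1)
        next_id = torch.multinomial(probs, num_samples=1)  # "plausible" is the only criterion
        ids = torch.cat([ids, next_id], dim=-1)
print(tok.decode(ids[0]))  # fluent continuation, verified by nothing
```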

[–] interdimensionalmeme@lemmy.ml 28 points 5 months ago (1 children)

Making shit up IS a feature of LLMs. It's crazy to use one as a search engine. Now they'll try to stop it from hallucinating to make it a better search engine and kill the one thing it's good at ...

[–] lightnsfw@reddthat.com 3 points 5 months ago

Maybe they should branch it off: have one model for making shit up and one for search. I haven't found a need for the one that makes shit up, but I have gotten value from using them to search, especially with Google going to shit and so many websites being terribly designed and hard to navigate.
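
For what it's worth, something like that split already exists as a decoding knob rather than two separate models: greedy decoding gives repeatable (not necessarily true) answers, high temperature gives the creative mode. A rough sketch, again with an arbitrary small model and made-up parameter values standing in for the real products:

```python
from transformers import pipeline

generate = pipeline("text-generation", model="gpt2")
prompt = "The best way to back up a Linux system is"

# "Search-ish" mode: greedy decoding, always the single most likely
# continuation. More repeatable, but still not fact-checked in any way.
boring = generate(prompt, do_sample=False, max_new_tokens=20)

# "Making shit up" mode: high temperature flattens the distribution,
# so rarer (and wronger) tokens get picked more often.
creative = generate(prompt, do_sample=True, temperature=1.5, max_new_tokens=20)

print(boring[0]["generated_text"])
print(creative[0]["generated_text"])
```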

[–] Johanno@feddit.de 9 points 5 months ago (1 children)

ChatGPT was much higher quality a year ago than it is now.

It could be very accurate. Now it's hallucinating all the time.

[–] AFC1886VCC@reddthat.com 10 points 5 months ago (1 children)

I was thinking the same thing. LLMs have suddenly got much worse. They've lost the plot lmao