this post was submitted on 08 Jan 2026
1052 points (99.3% liked)
Technology
It doesn't confuse us... it annoys us with blatantly wrong information, e.g. glue as a pizza ingredient.
That’s what you get when you use three-year-old models
Are you trying to make us believe that AI doesn't hallucinate?
It doesn't; it generates incorrect information. AI doesn't think or dream; it's a generative technology that outputs information based on whatever went in. It can't hallucinate because it can't think or feel.
Hallucinate is the word that has been assigned to what you described. When you don't attach additional emotional baggage to the word, hallucinate is a reasonable word to pick to describe when an LLM follows a chain of words that have internal correlation but no basis in external reality.
Trying to isolate out "emotional baggage" is not how language works. A term means something and applies somewhere. Generative models do not have the capacity to hallucinate. If you need to apply a human term to a non-human technology that pretends to be human, you might want to use the term "confabulate" because hallucination is a response to stimulus while confabulation is, in simple terms, bullshitting.
Words are redefined all the time. Kilo should mean 1000; that was the international standard definition for 150 years. But in computing it now means 1024.
Confabulation would have been a better choice. But people have chosen hallucinate.
Although I agree with you, you chose a poor example.
Kilo doesn't mean 1024, that's kibi. Many of us in tech differentiate because it's important.
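The kilo/kibi distinction above is easy to see in code. A minimal sketch (the constant and function names are my own, not from the thread), showing the same byte count under the SI decimal prefix and the IEC binary prefix:

```python
# SI "kilo" vs IEC "kibi" for byte quantities:
KILO = 10**3   # kilobyte (kB) = 1000 bytes (SI)
KIBI = 2**10   # kibibyte (KiB) = 1024 bytes (IEC)

def format_bytes(n: int) -> str:
    """Show a byte count under both conventions."""
    return f"{n} B = {n / KILO:g} kB (SI) = {n / KIBI:g} KiB (IEC)"

print(format_bytes(1024))  # 1024 B = 1.024 kB (SI) = 1 KiB (IEC)
```

This is why the difference matters in practice: a "64 GB" drive marketed in SI units holds about 59.6 GiB in the binary units an OS may report.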
No, but I was specifically talking about the glue and pizza example