I'm a bit annoyed at all the people being pedantic about the term "hallucinate."
Programmers borrow preexisting concepts as metaphors for computing concepts all the time.
Your file isn't really a file, your desktop isn't a desk, your recycling bin isn't a recycling bin.
[Insert the entirety of Object-Oriented Programming here]
Neural networks aren't really neurons, genetic algorithms aren't really genetics, and the LLM isn't really hallucinating.
But "hallucinate" easily conveys what the bug is. It only personifies the LLM because English almost always personifies the grammatical subject: the moment you attach a verb to a thing, you imply it performed an action, unless you limit yourself to esoteric words/acronyms or use several words to overexplain every time.
On Discord, though, there's a lot of unchecked predation. Theoretically, if this were implemented, it would let them surface the most suspicious users (those who interact with an unusual number of children) and review whether their messages are inappropriate.
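A minimal sketch of what that kind of triage could look like, in Python. Everything here is hypothetical: the DM-event log, the set of self-reported minor accounts, and the thresholds are made up for illustration, not anything Discord actually exposes.

```python
def flag_suspicious_users(dm_events, minor_ids, min_contacts=5, minor_ratio=0.8):
    """Rank adult accounts whose DM contacts skew heavily toward minors.

    dm_events: iterable of (sender_id, recipient_id) pairs
    minor_ids: set of account ids self-reported as under 18
    Thresholds are illustrative guesses, not tuned values.
    """
    contacts = {}  # adult sender -> set of distinct recipients
    for sender, recipient in dm_events:
        if sender in minor_ids:
            continue  # only score adult senders
        contacts.setdefault(sender, set()).add(recipient)

    flagged = []
    for sender, recipients in contacts.items():
        minors = sum(1 for r in recipients if r in minor_ids)
        # Require enough distinct contacts to avoid flagging one-off interactions
        if len(recipients) >= min_contacts and minors / len(recipients) >= minor_ratio:
            flagged.append((sender, minors, len(recipients)))

    # Most minor contacts first, so human reviewers see the worst cases at the top
    return sorted(flagged, key=lambda row: row[1], reverse=True)
```

The ratio threshold plus the minimum-contact floor means nobody gets flagged over a single stray DM; the output is a review queue for humans, not an automatic ban list.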
But all that's unlikely, because if they actually cared they'd have implemented other, simpler solutions first. So this idea is purely hypothetical, and not even the ideal fix.