this post was submitted on 09 Nov 2023
LocalLLaMA
I am not sure it isn't sentient.
An ant is sentient and it's not going to tell you how many brothers Sally has either.
The real question is whether consciousness sparks into existence while all that transformer math resolves, or whether that's completely unrelated and real-life conscious brains are conscious due to entirely different emergent phenomena.
To me, a big reason LLMs aren't conscious is that they only respond to user input, generate output, and then stop. They don't talk to themselves. They aren't sitting there contemplating the meaning of their existence while you're away from the keyboard.