this post was submitted on 09 Nov 2023
LocalLLaMA
Community to discuss about Llama, the family of large language models created by Meta AI.
Sentient is such a weird standard. It simply means having an experience, which is completely immeasurable. There is no way we will ever know what is and isn't sentient, beyond guessing.
Self-awareness, cognition, higher reasoning, these are all somewhat measurable.
People use words like sentient and conscious without ever really defining them (in conversations in places like this, not in philosophy of mind), which is the cause of about half the disagreements.
Someone walks in using sentient to mean phenomenally conscious, which is what it actually means, and then someone else starts talking about self-awareness, which is what it means to them. And then a third person argues "you're crazy, that's not human-level intelligence!" and no one ever stops to ask "wait, are we even talking about the same thing?"
This happens even when talking about philosophy of mind itself, where you'd think it'd be clear. I saw a YouTube video about panpsychism by a YouTuber I thought was decent. The modern panpsychists have been pretty clear that they're mostly talking about phenomenal consciousness, but the whole video was "these guys think that even atoms have a will and make decisions! The sun THINKS about having gravity, according to these nutjobs!" All of it entirely wrong, all because he didn't do the reading and find out what kind of consciousness they're talking about.
People use intelligence without a clear consensus on what it means, too ;)