The issue isn't that you're being concise; it's that you're throwing around words that don't have a clear definition and expecting your definition to be broadly shared. You keep referring to understanding, yet objective evidence of understanding is only met with "but it's not creative".
Are you suggesting there is valid evidence modern ML models are capable of understanding?
I don't see how that could be true for any definition of the word.
As I've shared 3 times already: Yes, there is valid evidence that modern ML models are capable of understanding. Why do I have to repeat it a fourth time?
Then explain to me how it isn't true given the evidence:
https://arxiv.org/abs/2210.13382
I don't see how an emergent nonlinear internal representation of the board state is anything besides "understanding" it.
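For anyone who hasn't read the paper: the evidence is a probing experiment. Roughly, you freeze the game-playing model, collect its hidden activations mid-game, and train a small nonlinear probe to predict the full board state from them. Here's a minimal sketch of that idea (PyTorch; the dimensions, names, and random placeholder data are my own illustrative assumptions, not the paper's actual code):

```python
# Sketch of the Othello-GPT probing idea: train a small nonlinear (MLP) probe
# to read the board state out of a sequence model's hidden activations.
# All dimensions and the placeholder data below are illustrative assumptions.
import torch
import torch.nn as nn

D_MODEL = 512          # assumed width of the game model's residual stream
N_SQUARES = 64         # Othello is played on an 8x8 board
N_STATES = 3           # each square is empty, black, or white

class BoardProbe(nn.Module):
    """Nonlinear probe: hidden activation -> predicted state of every square."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(D_MODEL, 256),
            nn.ReLU(),
            nn.Linear(256, N_SQUARES * N_STATES),
        )

    def forward(self, h):
        return self.net(h).view(-1, N_SQUARES, N_STATES)

# Placeholder tensors standing in for (activation, board-state) pairs harvested
# from the frozen game model while it plays; real probing would use those.
activations = torch.randn(1024, D_MODEL)
boards = torch.randint(0, N_STATES, (1024, N_SQUARES))

probe = BoardProbe()
opt = torch.optim.Adam(probe.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for _ in range(100):
    logits = probe(activations)                       # (batch, 64, 3)
    loss = loss_fn(logits.reshape(-1, N_STATES), boards.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()

# If a probe like this predicts the board far better than chance, the model's
# activations evidently encode the board state, which is the paper's point.
```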
Cool. But this is still stuff that has a "right" answer. Math. Math in the form of game rules, but still math.
I have seen no evidence that ML models can comprehend the abstract. To know, or more accurately, to model, the human experience. It's not even clear that, given a conscious entity, it is possible to communicate what being human is like to something non-human.
I am amazed, but not surprised, that you can explain a "system" to an LLM. However, doing the same for a concept or a human emotion is not something I think is possible.
What are you talking about? You wanted evidence that NNs can understand stuff, I showed you evidence.
Yes, and math can represent whatever you want. It can represent language, it can represent physics, it can even represent a human brain. Don't assume we are more than incredibly complicated machines. If you want to argue "it's just math", then show me that anything isn't just math.
See? And that's the handwaving. You're talking about "the human experience" as if that's a thing with an actual definition. Why is "the human experience" relevant to whether NNs can understand things?
And the next handwave - what is a concept? How is "the board in Othello" not a concept?
Modern ML models are nowhere near complex enough to model reality to the extent required for genuine artistic expression.
That you need me to say this in an essay instead of a sentence is your problem, not mine.
You'd have to bring up actual evidence for this. Easiest would be to start by defining "genuine artistic expression". But I have a feeling you'll just resort to the next handwave...
Thank you for confirming that your position doesn't make any sense.
Rude. Thanks for confirming my choice to minimize the effort I spend on you, I guess.