[–] frightful_hobgoblin@lemmy.ml -4 points 9 months ago* (last edited 9 months ago) (46 children)

You're against computers being able to understand language, video, and images?

[–] Barabas@hexbear.net 54 points 9 months ago (13 children)

They don’t understand, though. A lot of AI evangelists seem to smooth over that detail: it is an LLM, not anything that “understands” language, video, or images.

There are uses for these kinds of models, like semi-automating the analysis of large pools of data, but even in a socialist society, allocating resources to it the way it is currently done would be completely unsustainable.

[–] frightful_hobgoblin@lemmy.ml 4 points 9 months ago (12 children)

They don’t understand, though. A lot of AI evangelists seem to smooth over that detail: it is an LLM, not anything that “understands” language, video, or images.

We're into the Chinese Room problem. "Understand" is not a well-defined or measurable thing. I don't see how it could be measured except by looking at inputs and outputs.

[–] space_comrade@hexbear.net 22 points 9 months ago (1 children)

"Understand" is not a well-defined or measurable thing.

So why attribute it to an LLM in the first place, then? All of these LLMs are just floating-point numbers being multiplied and added inside a digital computer; the onus is on the AI bros to show what kind of floating-point multiplication is real "understanding".
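
To make that concrete: a transformer-style block really is nothing but floating-point matrix multiplications and additions (plus a softmax). Here is a minimal, purely illustrative NumPy sketch with made-up weights and shapes, not any particular model's code:

```python
# Illustrative only: one attention + feed-forward block reduced to
# floating-point matrix multiplies, additions, and a softmax.
import numpy as np

def attention(x, Wq, Wk, Wv, Wo):
    q, k, v = x @ Wq, x @ Wk, x @ Wv            # three matrix multiplications
    scores = q @ k.T / np.sqrt(q.shape[-1])     # multiply, then scale
    w = np.exp(scores - scores.max(-1, keepdims=True))
    w /= w.sum(-1, keepdims=True)               # softmax: exponentiate, add, divide
    return (w @ v) @ Wo                         # two more multiplications

def block(x, Wq, Wk, Wv, Wo, W1, W2):
    x = x + attention(x, Wq, Wk, Wv, Wo)        # residual addition
    return x + np.maximum(0.0, x @ W1) @ W2     # feed-forward: multiply, add

rng = np.random.default_rng(0)
d, seq = 8, 4                                   # tiny toy sizes
x = rng.standard_normal((seq, d))               # "token" activations: just floats
attn_weights = [rng.standard_normal((d, d)) for _ in range(4)]
ffn_weights = [rng.standard_normal((d, 4 * d)), rng.standard_normal((4 * d, d))]
print(block(x, *attn_weights, *ffn_weights).shape)   # (4, 8): floats in, floats out
```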

[–] frightful_hobgoblin@lemmy.ml 3 points 9 months ago* (last edited 9 months ago) (2 children)

But it's inherently impossible to "show" anything except inputs and outputs (including for a biological system).

What are you using the word "real" to mean, and is it something separate from the measurable behaviour of the system?

You seem to be using a mental model in which there are:

  • A: the measurable inputs & outputs of the system

  • B: the "real understanding", which is separate

How can you prove B exists if it's not measurable? You say there is an "onus" to do so. I don't agree that such an onus exists.

This is exactly the Chinese Room argument. 'Understand' is usually taken in a functionalist way.

[–] anarchoilluminati@hexbear.net 13 points 9 months ago

But, ironically, the Chinese Room argument you're bringing up supports what others are saying: that LLMs do not 'understand' anything.

It seems to me that you are defining 'understanding' in a functionalist way, so that input/output behaviour counts as understanding and the measurable process in itself can be said to show 'understanding'. But that's not what Searle, and seemingly the others here, mean by 'understanding'. As Searle argues, the question is not purely one of syntactic manipulation but of semantics.

In other words, these LLMs do not "know" the information they provide; they are just repeating it based on the input/output process with which they were programmed. LLMs do not project or internalize any meaning in the input/output process. If they had some reflexive consciousness and any 'understanding', they could critically approach the meaning of the information and assess its validity against facts, rather than naïvely proclaiming that cockroaches got their name because they like to crawl into penises at night. Do you believe LLMs are conscious?

[–] space_comrade@hexbear.net 2 points 9 months ago* (last edited 9 months ago)

How can you prove B exists if it's not measurable?

Because I've felt it; I've felt how understanding feels. Ultimately, understanding is a conscious experience within a mind. You cannot define understanding without referencing conscious experience; you cannot possibly define it only in terms of behavior or function. So either you concede that every floating-point multiplication in a digital chip "feels like something" at some level, or you show what specific kind of floating-point multiplication does.
