this post was submitted on 25 Nov 2023
Machine Learning
[–] El_Minadero@alien.top 0 points 11 months ago (43 children)

I mean, everyone is just sorta ignoring the fact that no ML technique has been shown to do anything more than just mimic statistical aspects of the training set. Is statistical mimicry AGI? On some performance benchmarks, it appears better statistical mimicry does approach capabilities we associate with AGI.

I personally am quite suspicious that the best lever to pull is just giving it more parameters. Our own brains have such complicated neural/psychological circuitry for executive function, long- and short-term memory, Type 1 and Type 2 thinking, "internal" dialog and visual models, and, more importantly, the ability to few-shot learn the logical underpinnings of an example set. Without a fundamental change in how we train NNs, or even in our conception of what an effective NN is to begin with, we're not going to see the paradigm shift everyone's been waiting for.

[–] gebregl@alien.top 1 points 11 months ago (24 children)

We need a name for the fallacy where people call highly nonlinear algorithms with billions of parameters "just statistics", as if all they're doing is linear regression.

ChatGPT isn't AGI yet, but it is a huge leap in modeling natural language. The fact that there's some statistics involved explains neither of those two points.
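To make the "as if all they're doing is linear regression" point concrete, here's a minimal sketch in plain NumPy (illustrative only, not from the thread): the best linear fit to XOR collapses to a constant 0.5, while a two-hidden-unit ReLU network with hand-set weights, which is still "just statistics" in the same loose sense, computes XOR exactly. The nonlinearity changes the function class, which is exactly what the "just statistics" label glosses over.

```python
import numpy as np

# XOR truth table: no linear function of (x1, x2) can fit it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

# Best linear fit (with bias) via least squares: it collapses to a
# constant prediction of 0.5, with mean squared error 0.25.
A = np.hstack([np.ones((4, 1)), X])           # design matrix [1, x1, x2]
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
linear_pred = A @ coef
linear_mse = np.mean((linear_pred - y) ** 2)  # 0.25

# A two-hidden-unit ReLU network, weights set by hand for clarity:
#   xor(x1, x2) = relu(x1 + x2) - 2 * relu(x1 + x2 - 1)
def relu(z):
    return np.maximum(z, 0.0)

W1 = np.array([[1.0, 1.0], [1.0, 1.0]])  # both hidden units sum the inputs
b1 = np.array([0.0, -1.0])               # second unit fires only on (1, 1)
w2 = np.array([1.0, -2.0])               # combine: OR minus 2 * AND

mlp_pred = relu(X @ W1 + b1) @ w2        # [0, 1, 1, 0]
mlp_mse = np.mean((mlp_pred - y) ** 2)   # 0.0

print(f"linear MSE: {linear_mse:.3f}, ReLU net MSE: {mlp_mse:.3f}")
```

Scaling that gap from two hidden units to billions of parameters is the difference the "just statistics" framing fails to explain.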

[–] psyyduck@alien.top 1 points 11 months ago (2 children)

Let's ask GPT4!

You're probably talking about the "fallacy of composition". This logical fallacy occurs when it's assumed that what is true for individual parts will also be true for the whole group or system. It's a mistaken belief that specific attributes of individual components must necessarily be reflected in the larger structure or collection they are part of.

Here are some clearly flawed examples illustrating the fallacy of composition.

  • Building Strength: Believing that if a single brick can hold a certain amount of weight, a wall made of these bricks can hold the same amount of weight per brick. This ignores the structural integrity and distribution of weight in a wall.
  • Athletic Team: Assuming that a sports team will be unbeatable because it has a few star athletes. This ignores the importance of teamwork, strategy, and the fact that the performance of a team is not just the sum of its individual players' skills.

These examples highlight the danger of oversimplifying complex systems or groups by extrapolating from individual components. They show that the interactions and dynamics within a system play a crucial role in determining the overall outcome, and these interactions can't be understood by just looking at individual parts in isolation.

[–] kelkulus@alien.top 1 points 11 months ago

I dunno. The “fallacy of composition” is just made up of 3 words, and there’s not a lot that you can explain with only three words.

[–] MohKohn@alien.top 1 points 11 months ago

How... did it map oversimplification to... holistic thinking??? Saying that it's "just statistics" is wrong because "just statistics" covers some very complicated models in principle. They weren't saying that simple subsystems are incapable of generating complex behavior.

God, why do people think these things are intelligent? I guess people fall for cons all the time...
