this post was submitted on 25 Nov 2023

Machine Learning

[–] suyash01@alien.top 1 points 11 months ago

Need to share this post with people fearing their jobs will be taken over by GPT

[–] vulgrin@alien.top 1 points 11 months ago

“GPT4 is enough AI for everybody”

[–] sigmatrophic@alien.top 1 points 11 months ago

Guys, sell Microsoft. This is a full-court press; something is f****** wrong with OpenAI

[–] XB0XRecordThat@alien.top 1 points 11 months ago

GPT-5, 6, and 7 aren't as interesting as putting reinforcement learning models on top of GPTs

[–] bartturner@alien.top 1 points 11 months ago

Not terribly surprised.

[–] dimtass@alien.top 1 points 11 months ago

I think it's good to have a plateau for a few years. The thing is, we've only just realised what LLMs can do, and we need some time to learn how to get the best out of them and learn from them. In that time the technology matures, and we mature with it.

[–] Adihd72@alien.top 1 points 11 months ago

Cos Gates is the authority.

[–] JunkInDrawers@alien.top 1 points 11 months ago

It's true. However, the incremental improvements will still have major implications for how it gets implemented.

[–] RdtUnahim@alien.top 1 points 11 months ago

I remember saying that there was no reason to believe the explosive growth we've seen from AI would continue, and no way to be sure that our current path in AI and LLM research wouldn't eventually plateau, adding years onto the road to AGI as we essentially go back to the drawing board to figure out a new path.

Got blasted by the GPT hype machine for even daring to suggest that such a thing could happen. Feels good to have Bill Gates suggest the possibility too!

[–] blueeyedlion@alien.top 1 points 11 months ago

Well, if I can't see those reasons, I'm going to assume he's making stuff up to help Microsoft, which just had a panic over the OpenAI board firing the CEO because they were really worried about machine learning advancing too quickly.

[–] El_Minadero@alien.top 0 points 11 months ago (43 children)

I mean, everyone is just sort of ignoring the fact that no ML technique has been shown to do anything more than mimic statistical aspects of its training set. Is statistical mimicry AGI? On some performance benchmarks, better statistical mimicry does appear to approach capabilities we associate with AGI.

I personally am quite suspicious that the best lever to pull is just giving it more parameters. Our own brains have such complicated neural/psychological circuitry for executive function, long- and short-term memory, Type 1 and Type 2 thinking, "internal" dialogue and visual models, and, more importantly, the ability to few-shot learn the logical underpinnings of an example set. Without a fundamental change in how we train NNs, or even in our conception of effective NNs to begin with, we're not going to see the paradigm shift everyone's been waiting for.
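The "statistical mimicry" framing can be made concrete with a toy sketch (mine, not from the thread): a bigram language model literally reproduces the conditional next-token statistics of its training text and nothing more. The corpus and names below are purely illustrative.

```python
import random
from collections import Counter, defaultdict

def train_bigram(corpus):
    """Count how often each token follows each other token:
    P(next | current) here is nothing but training-set statistics."""
    counts = defaultdict(Counter)
    tokens = corpus.split()
    for cur, nxt in zip(tokens, tokens[1:]):
        counts[cur][nxt] += 1
    return counts

def sample_next(counts, cur):
    """Sample the next token in proportion to how often it
    followed `cur` in training -- pure statistical mimicry."""
    options = counts.get(cur)
    if not options:
        return None
    tokens, weights = zip(*options.items())
    return random.choices(tokens, weights=weights, k=1)[0]

corpus = "the cat sat on the mat and the cat slept"
model = train_bigram(corpus)
# "the" was followed by "cat" twice and "mat" once, so "cat"
# is sampled roughly 2/3 of the time.
print(sample_next(model, "the"))
```

The open question in the comment is whether scaled-up versions of exactly this kind of next-token statistics (with vastly richer conditioning) ever amount to more than mimicry.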

[–] No_Advantage_5626@alien.top 1 points 11 months ago

Actually, the claim that "all ML models are doing is statistics" is a fallacy that dominated the field of AI for a long time.

See this video, for instance, where Ilya (probably the #1 AI researcher in the world currently) explains how GPT is much more than statistics: it is more akin to "compression", and that can lead to intelligence: https://www.youtube.com/watch?v=GI4Tpi48DlA (4:30-7:30)
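The prediction-as-compression connection can be sketched numerically (my illustration, not from the video): the ideal code length for a sequence under a model is -Σ log2 P(symbol), so a model that predicts the data better also compresses it into fewer bits. The tiny string and probabilities below are arbitrary.

```python
import math
from collections import Counter

def bits_to_encode(text, model_probs):
    """Ideal (arithmetic-coding) code length under a model:
    -sum(log2 P(symbol)). Better prediction => fewer bits."""
    return sum(-math.log2(model_probs[ch]) for ch in text)

text = "aaab"

# Model 1: uniform over {a, b} -- knows nothing about the data.
uniform = {"a": 0.5, "b": 0.5}

# Model 2: matches the empirical distribution -- a better predictor.
counts = Counter(text)
empirical = {ch: n / len(text) for ch, n in counts.items()}

print(bits_to_encode(text, uniform))    # 4.0 bits
print(bits_to_encode(text, empirical))  # ~3.25 bits: fewer bits, better model
```

In this sense, training an LLM to minimize next-token loss is the same objective as building the best possible compressor of its training data.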
