[–] xbimba@alien.top 1 points 2 years ago

Did he forget that GPT is a computerized AI, not a human? It doesn't get old; it just keeps getting smarter and smarter.

[–] aluode@alien.top 1 points 2 years ago

The magic will come from the interactions of millions of people with super-smart AI. People who couldn't program before will be able to, nurses can be as efficient as doctors, people who have no clue about fixing things can fix them with augmented-reality glasses, and so on. If GPT-4-level AI were widely adopted (which it is not), that alone could change society fundamentally. Right now we are in the adoption phase, and the AIs, as they get better, will move more and more to the center of our lives, as cellphones did. Will they get better? I think the models will get better. They will get faster, they will have memory and the ability to connect with the world through senses, and AI chips will make them vastly faster. The magic would happen even if they had plateaued, which they have not.

[–] maddybenfanti@alien.top 1 points 2 years ago

Gates just saying GPT-5 won't be much better, without addressing the limits of the text it's trained on, is a big miss. We need to acknowledge the role of the underlying data in shaping the capabilities of these models.

[–] mimic751@alien.top 1 points 2 years ago

Speed, efficiency, and relevancy can all be improved.

[–] onyxengine@alien.top 1 points 2 years ago

Every time someone makes this claim, AI leaps ahead 30 years on the consensus timeline of expected capabilities. AGI will probably be here before 2030.

[–] 0x00410041@alien.top 1 points 2 years ago

Yes Bill, that's why we are now innovating around LLMs and adding other functionality to what an LLM can be. It is just one component of what people mean when they discuss AGI, which will be a combination of hundreds of interacting systems, each of which may be extremely powerful and complex on its own.

[–] navras@alien.top 1 points 2 years ago

"640KB ought to be enough for anyone" - also Bill

[–] jewelry_wolf@alien.top 1 points 2 years ago

But what about Qstar?

[–] lpds100122@alien.top 1 points 2 years ago (2 children)

With all due respect to Bill, I really don't understand why we should listen to him. The guy has absolutely no vision of the future!

He was ridiculously blind to the WWW, blockchain technologies, smartphones, etc. Just let him live in peace in his mansion. He is not a visionary and never was.

As for me personally, if I need a real hero, I would prefer to listen to Steve Wozniak.

[–] IntolerantModerate@alien.top 1 points 2 years ago

The GPU limitations/requirements to go to the next level may also put a practical ceiling on things. Could you even run a model that was 10x larger than GPT-4 without breaking the bank?
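
A rough back-of-envelope sketch makes the point. The parameter counts below are hypothetical (GPT-4's actual size isn't public), and it counts weights only, ignoring KV cache, activations, and any sparsity/MoE tricks, so treat it as an order-of-magnitude estimate.

```python
# Back-of-envelope VRAM estimate for serving a dense transformer.
# Assumes 16-bit weights (2 bytes per parameter); model sizes are hypothetical.

def weights_vram_gb(n_params: float, bytes_per_param: int = 2) -> float:
    """Gigabytes of memory needed just to hold the weights."""
    return n_params * bytes_per_param / 1e9

for label, n_params in [("~1T params (hypothetical GPT-4-class)", 1e12),
                        ("10x that", 1e13)]:
    gb = weights_vram_gb(n_params)
    gpus = gb / 80  # 80 GB per H100/A100-class accelerator
    print(f"{label}: ~{gb:,.0f} GB of weights, ~{gpus:,.0f} x 80 GB GPUs")
```

Even before training cost, a 10x scale-up pushes inference toward hundreds of top-end GPUs per replica, which is exactly the practical ceiling being described.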

[–] granoladeer@alien.top 1 points 2 years ago

Tell that to Ilya Sutskever

[–] LanchestersLaw@alien.top 1 points 2 years ago

I agree that the next generation of LLMs will not be a breakthrough, but the overall capability gains from perfecting step-by-step and decision-tree styles of prompting are very promising; a sketch of the idea follows below.
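
Here is a minimal sketch of that step-by-step / self-review prompting idea. `call_llm` is a hypothetical stand-in for whatever completion API you use; the point is that the gain comes from the prompt structure, not from a new model.

```python
# Minimal sketch: step-by-step prompting plus a second self-review pass
# (a simple stand-in for decision-tree style prompting). Hypothetical API.

def call_llm(prompt: str) -> str:
    raise NotImplementedError("plug in your model/API of choice here")

def solve_step_by_step(question: str) -> str:
    # First pass: ask for explicit intermediate reasoning.
    draft = call_llm(
        f"Question: {question}\n"
        "Think through this step by step, then state your answer."
    )
    # Second pass: have the model review its own reasoning before answering.
    return call_llm(
        f"Question: {question}\n"
        f"Draft reasoning:\n{draft}\n"
        "Check the reasoning for mistakes and give the final answer."
    )
```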

[–] workthebait@alien.top 1 points 2 years ago (1 children)

History shows that Reddit groupthink is oftentimes wrong.

[–] rathat@alien.top 1 points 2 years ago

I expect GPT-5 is going to be a jump similar to the one from GPT-3 to GPT-4.

[–] Aesthetik_1@alien.top 1 points 2 years ago (1 children)

Just because Bill Gates says something doesn't mean that it's true

[–] liongalahad@alien.top 1 points 2 years ago

I don't think Bill is right on this one. LLMs may have hit a performance plateau with the current architecture, but research is all about optimisation and efficiency, not mere parameter increases. Here's a good example:

https://venturebeat.com/ai/new-technique-can-accelerate-language-models-by-300x/

Remember, GPT-5 is still at least 2-3 years away. Plenty of time.

[–] The_Krambambulist@alien.top 1 points 2 years ago

I think that is mostly fine.

In the end, adoption of the technologies is what matters most, and there is still a long way to go.

Of course better technology would make it easier, but we can already take a lot of steps with what we have today.

[–] MiddagensWidunder@alien.top 1 points 2 years ago

Yet people on r/singularity are hyping that OpenAI has secretly reached AGI.
