liongalahad@alien.top 1 points 11 months ago

I don't think Bill is right on this one. LLMs may have reached a plateau in performance with the current architecture, but research is all about optimisation and efficiency, not merely increasing parameter counts. Here's a good example:

https://venturebeat.com/ai/new-technique-can-accelerate-language-models-by-300x/
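If I remember right, the gist of that article is conditional execution: route each token through only a small slice of a feedforward block instead of multiplying through the whole thing. As a rough, illustrative sketch of that general idea (toy names and sizes are mine, not the authors' code or exact method):

```python
# Toy sketch of a "fast feedforward" style layer: each input descends a small
# binary tree of routing decisions and only the one leaf neuron it lands on is
# evaluated, instead of the full dense layer. Illustrative only.
import torch
import torch.nn as nn

class FastFeedForward(nn.Module):
    def __init__(self, d_model: int, depth: int):
        super().__init__()
        self.depth = depth
        n_nodes = 2 ** depth - 1          # internal routing nodes
        n_leaves = 2 ** depth             # leaf "neurons"
        self.node_w = nn.Parameter(torch.randn(n_nodes, d_model) * 0.02)
        self.leaf_in = nn.Parameter(torch.randn(n_leaves, d_model) * 0.02)
        self.leaf_out = nn.Parameter(torch.randn(n_leaves, d_model) * 0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, d_model). Hard routing: each input touches only `depth`
        # routing nodes plus a single leaf, no matter how wide the layer is.
        batch = x.shape[0]
        idx = torch.zeros(batch, dtype=torch.long, device=x.device)
        for _ in range(self.depth):
            score = (x * self.node_w[idx]).sum(-1)   # scalar decision per input
            go_right = (score > 0).long()
            idx = 2 * idx + 1 + go_right             # descend the binary tree
        leaf = idx - (2 ** self.depth - 1)           # convert node id to leaf id
        h = torch.relu((x * self.leaf_in[leaf]).sum(-1, keepdim=True))
        return h * self.leaf_out[leaf]               # (batch, d_model)

# 2**8 = 256 leaf neurons, but each token evaluates only 8 routing nodes + 1 leaf.
layer = FastFeedForward(d_model=64, depth=8)
print(layer(torch.randn(4, 64)).shape)  # torch.Size([4, 64])
```

That kind of speed-up comes from architectural changes, not from throwing more parameters at the problem, which is exactly my point.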

Remember, GPT-5 is still at least 2-3 years away. Plenty of time.