this post was submitted on 22 Dec 2024
521 points (95.9% liked)
Technology
you are viewing a single comment's thread
There is a bunch of research showing that model improvement is marginal compared to the energy and training data it demands. OpenAI itself mentioned about a month ago that they are seeing a smaller improvement from GPT-4 to Orion (I believe) than there was from GPT-3 to GPT-4. We are also running out of quality data to use for training.
Essentially what I mean is that the big improvements we saw in the past seem to be over; now improving a little costs a lot. Considering that the costs are exorbitant and the gains small enough, it's not impossible to imagine that companies will eventually give up if they can't monetize this stuff.
Compare Llama 1 to the current state-of-the-art local AIs. They're on a completely different level.
Yes, because at the beginning there was tons of room for improvement.
I mean, take OpenAI's word for it: GPT-5 is not seeing the same improvement over GPT-4 that 4 saw over 3, and it's costing a fortune and taking forever. Logarithmic curve, it seems. Also, if we run out of data to train on, that's it.
Surely you can see there is a difference between marginal improvement with respect to energy and not improving.
Yes, I see the difference as hitting the logarithmic tail, which shows we are close to the limit. I also realize that exponential cost is a de facto limit on improvement. If improving again for GPT-7 will cost $10 trillion, I don't think it will ever happen, right?
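The "logarithmic tail" point can be sketched with a toy power-law scaling curve (the shape commonly reported in scaling-law research). The constants here are made up for illustration, not taken from any real model: the point is just that each 10x jump in compute buys a smaller absolute improvement, while the cost of each step grows 10x.

```python
# Toy illustration of diminishing returns under an assumed power-law
# scaling curve. The constants a and alpha are hypothetical.
def loss(compute, a=10.0, alpha=0.05):
    """Toy scaling law: loss falls as a power of training compute."""
    return a * compute ** -alpha

# Absolute improvement bought by each 10x increase in compute,
# sweeping compute from 1e20 up to 1e26 (arbitrary units).
improvements = []
for exp in range(20, 26):
    gain = loss(10.0 ** exp) - loss(10.0 ** (exp + 1))
    improvements.append(gain)

# Each step costs 10x more than the last, yet yields a smaller gain.
assert all(improvements[i] > improvements[i + 1]
           for i in range(len(improvements) - 1))
```

On a curve like this, progress never fully stops, but the cost per unit of improvement grows without bound, which is the practical limit being argued above.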