this post was submitted on 17 Nov 2024
YES
The transformers these LLMs are built on are more novel than they are efficient. Without repeatability there is little hope for improvement. There isn't enough energy in the world to get to an AGI using a transformer model. We're also running out of LLM-free datasets to train on.
https://arxiv.org/html/2211.04325v2
https://arxiv.org/pdf/2302.06706v1
I really love that training LLMs on LLM output has been shown to make them unravel into nonsense (so-called "model collapse"). Rather than thinking about that before releasing, all these megacorps had to chase short-term profit first, and now the Internet is polluted with LLM output everywhere. I doubt they'll be able to assemble a clean training set newer than 2021.
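The collapse effect is easy to see in miniature. Here's a toy sketch (my own illustration, not from the linked papers): repeatedly fit a Gaussian to samples drawn from the previous generation's fitted Gaussian — i.e., "train" each generation only on the last generation's synthetic output. Estimation error compounds every round, and the learned distribution degenerates toward a point:

```python
import numpy as np

def collapse_demo(n_samples=50, generations=500, seed=0):
    """Toy 'model collapse' loop: each generation is fit (Gaussian MLE)
    to a finite sample drawn from the *previous* generation's model,
    never from the original data. Returns the fitted std per generation."""
    rng = np.random.default_rng(seed)
    mu, sigma = 0.0, 1.0  # generation 0: the "real data" distribution
    history = [sigma]
    for _ in range(generations):
        synthetic = rng.normal(mu, sigma, n_samples)  # sample from current model
        mu, sigma = synthetic.mean(), synthetic.std() # refit on synthetic output only
        history.append(sigma)
    return history

history = collapse_demo()
print(f"fitted std: gen 0 = {history[0]:.3f}, gen {len(history)-1} = {history[-1]:.3g}")
```

The fitted standard deviation shrinks generation over generation (the MLE variance estimate is biased low, and the sampling noise compounds), so the "model" forgets the tails first and eventually collapses — a crude analogue of what the literature reports for LLMs trained on LLM output.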