For the amount of energy they consume, LLMs sure suck ass. Scaling has not improved their accuracy or 'intelligence'. In fact, they seem to be performing worse. Not to mention, scaling requires exponentially more training data, which they have run out of, apparently.
this post was submitted on 26 Jun 2025