Is the bubble popping?
No, line must go up!
I think it’s going down.
Oh no.
Anyways...
To add a bit more background:
We've already had two major AI winters: https://en.wikipedia.org/wiki/AI_winter
More related articles:
- Deep Learning’s Diminishing Returns: The cost of improvement is becoming unsustainable (from 2021)
- Paper: Will we run out of data? Limits of LLM scaling based on human-generated data
- Can AI Scaling Continue Through 2030?
My opinion: We're facing a lot of issues. Energy is constrained, training data is finite, and the current architecture of AI models will likely hit a ceiling. We already pump in lots of compute for ever-diminishing returns. I'm pretty sure that approach won't scale toward AGI, as outlined in the article.
But it doesn't need to keep growing exponentially to be useful. AI is hyped to no end, and it's a real revenue driver for companies. I'd say the bubble is over-inflated, and some people are bound to get disappointed. In my eyes it's very likely that growth won't continue at the pace of the last two years. Ultimately we'd need some new inventions if we want AGI. As far as I know, that's still utter sci-fi: nobody knows how to revolutionize AI so it suddenly becomes 100x more intelligent, and it's unlikely that our current approach will get us there. On the other hand, no one has ruled out that a more clever approach could do it. I'd lower my expectations. There has been a lot of hype and unfounded claims. Things take their time, and the normal way things go is gradual improvement. But it's not a "grim" perspective either (as the author put it).
And I agree there is still "quite a bit of growth" left in the AI market. Especially once there's more hardware available than just the latest Nvidia graphics cards, things may become more affordable and adoption broader.
[...] a trend that we are seeing in that marketplace towards smaller models as the large foundation models are becoming quite expensive to build, train, and iterate on [...]
I think that's a good thing. Ultimately this is about making things more efficient. And maybe realizing you don't need the same big model for every task. It surely democratizes things and allows people with consumer-priced hardware to participate in AI.