AI isn't taking off now because it already took off in the 60s. Heck, they were even working on neural nets back then. Same in the 90s, when they actually got them to be useful in production environments.
We got a deep learning craze in the 2010s and then bolted that onto neural nets to get the current wave of "transformers/diffusion models will solve all problems". They're really just today's LISP machines: expected to take over everything, but unlikely to actually succeed.
Notably, deep learning assumes that better results come from a bigger dataset, but we've already trained the existing models on the sum total of humanity's writings. In fact, current training is hampered by the fact that a substantial amount of all new content is already AI-generated.
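(For anyone curious, that data-scaling assumption is usually written as an empirical power law. Here's a minimal Python sketch using the Chinchilla-style form; the constants are placeholders picked only to show the shape of the diminishing returns, not real fitted values.)

```python
# Illustrative only: a Chinchilla-style scaling law,
#   loss(N, D) = E + A / N**alpha + B / D**beta
# where N = parameter count and D = training tokens. The constants below
# are placeholders chosen to show the curve's shape, not fitted values.
def loss(n_params: float, n_tokens: float,
         E: float = 1.7, A: float = 400.0, B: float = 400.0,
         alpha: float = 0.34, beta: float = 0.28) -> float:
    return E + A / n_params**alpha + B / n_tokens**beta

# Each doubling of the dataset buys a smaller improvement than the last,
# which is the "more data, diminishing returns" point above.
for tokens in (1e12, 2e12, 4e12, 8e12):
    print(f"{tokens:.0e} tokens -> loss {loss(7e10, tokens):.3f}")
```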
Despite how much the current approach is hyped by the tech companies, I can't see it delivering further substantial improvements by just throwing more data (which doesn't exist) or processing power at the problem.
We need a fundamentally different approach, and while it seems like there's all the money in the world to fund the necessary research, the same seemed true in the 50s, the 60s, the 80s, the 90s, the 10s... In the end, a new AI winter will come as people realize that the current approach won't live up to their unrealistic expectations. Ten to fifteen years later, some new approach will come out of underfunded basic research.
And it's all just a little bit of history repeating.
I remember playing with neural nets in the late 1980s. They had optical character recognition going even back then. The thing was, their idea of a "big network" was nowhere near the scale needed to do anything as impressive as categorizing images: cats vs. birds.
We've hit the point where the supercomputers in your pocket are... 19,000/160 = over 100x as powerful as a Cray from the 1970s.
I just started using a $110 HAILO-8 for image classification. It can perform 26 TOPS; that's over 160,000x a 1970s Cray (granted, the image processor works with 8-bit ints and the Cray worked with 64-bit floats... but even discounting for that, still around 20,000x the operational power for 1/436,000th the cost and 1/100,000th the weight).
There were around 60 Crays delivered by 1983; HAILO alone is selling on the order of a million chips a year...
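(Quick sanity check of those ratios as a sketch: I'm assuming the "160" is the Cray-1's oft-quoted ~160 MFLOPS peak and the "19,000" is the pocket-device figure in the same units; the Cray cost is just back-calculated from the stated ratio, not an independent price.)

```python
# Back-of-the-envelope versions of the comparisons above.
# Assumptions: "160" = Cray-1 peak of ~160 MFLOPS,
#              "19,000" = pocket device throughput in MFLOPS.
cray1_mflops = 160
phone_mflops = 19_000
print(phone_mflops / cray1_mflops)        # ~119, i.e. "over 100x"

hailo8_tops = 26                          # 8-bit int ops, trillions/s
hailo8_mops = hailo8_tops * 1_000_000     # TOPS -> millions of ops/s
print(hailo8_mops / cray1_mflops)         # ~162,500, i.e. "over 160,000x"

# Knock off a factor of 8 for 8-bit ints vs 64-bit floats and you
# still land near the ~20,000x figure:
print(hailo8_mops / cray1_mflops / 8)     # ~20,300

# Cray cost implied by "1/436,000th the cost" of a $110 part
# (back-calculated from the ratio, not an independent figure):
print(110 * 436_000)                      # ~$48M
```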
Things have sped up significantly in the last 50 years.