this post was submitted on 07 Mar 2024
38 points (89.6% liked)
Tech
When I read that, I had some sort of epiphany - "wow - maybe our brains are just LLMs" - and it felt weird. Probably not weird enough to change my model, but still weird.
Glad you wrote this comment - you said it so much better than I could have.
Edit - my model is going wild here. New thought - if our brains are LLMs, how do the brains in all the other species (without language) work? I guess an LLM is just a special case of a Large Sensory Input Model.
2nd edit - of course our brains are "just LLMs" - LLMs are special cases of computer simulations of neural networks modelled on brains. I know the logic is backwards and I'm being a bit slow, but it still feels weird to read LLM-written articles and realise that we use a more evolved version of the same process to do basically everything.
AI ≠ LLMs. AIs are neural networks modeled after the human brain in every capacity possible on current computers. Neural networks can be trained on text to create LLMs. They can be trained on photos to create image generators like Stable Diffusion. They can be trained on audio to speak exactly like someone, or to generate music. They can be put into control loops that learn movements for robots like Boston Dynamics'. Neural networks are just small (for now) brains trained to do one thing.
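The point about neural networks being generic - the same architecture becomes a language model, an image model, or a controller depending only on the data you feed it - can be sketched with a toy example. Everything below (`TinyNet`, the training loop) is made up for illustration, not any real library's API; it's a minimal one-hidden-layer network trained on the XOR truth table, but the same loop would accept any numeric "sensory input":

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyNet:
    """One hidden layer, trained with plain full-batch gradient descent."""

    def __init__(self, n_in, n_hidden, n_out):
        self.W1 = rng.normal(0.0, 1.0, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 1.0, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)
        return sigmoid(self.h @ self.W2 + self.b2)

    def train(self, X, y, lr=0.5, steps=5000):
        losses = []
        for _ in range(steps):
            out = self.forward(X)
            losses.append(float(np.mean((out - y) ** 2)))
            # Backprop for sigmoid activations with squared-error loss.
            d_out = (out - y) * out * (1.0 - out)
            d_h = (d_out @ self.W2.T) * self.h * (1.0 - self.h)
            self.W2 -= lr * self.h.T @ d_out
            self.b2 -= lr * d_out.sum(axis=0)
            self.W1 -= lr * X.T @ d_h
            self.b1 -= lr * d_h.sum(axis=0)
        return losses

# The "training data" here is the XOR truth table; swap in token
# embeddings, pixel values, or joint angles and the loop is unchanged.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

net = TinyNet(2, 4, 1)
losses = net.train(X, y)
```

Nothing in `TinyNet` knows it's learning XOR - the network is just a pile of weights fit to whatever inputs and targets it's given, which is the whole point.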
We can already combine these to do pretty crazy things, and they're only going to get more powerful, more efficient, more integrated, and more capable. An AGI singularity will happen, and probably sooner than we think.
Thanks! I've been working on this idea for quite a while. I post summaries and random thoughts occasionally hoping to refine my thinking to the point at which I'll feel comfortable writing a proper essay.
I like the name you've given the overarching system. That's been a bit of a struggle for me, so you've given me a better concept to work with. "Large Sensory Input Model" captures my thoughts better than my own "the brain is just a kind of LLM." That its abbreviation "LSIM" also conjures connections to "simulation" is a bonus for me, because that also addresses my thoughts on how we understand some things and other people.
There is a fairly old hypothesis that something called "Theory of Mind" is basically our brain modelling and simulating other brains as a way to understand and predict the behaviour of others. That has explanatory power: empathy, stereotypes, in/out groups, better accuracy with closer relationships, "living on" through powerful simulations of those closest to us who have died, etc.
Thanks for the feedback!