this post was submitted on 18 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.

So let's say we ask an LLM to predict what would happen if we put a pen on the table, and it simulates a thousand possibilities. Is there an LLM that could run perpendicular to these outputs as a sort of summarizer/filter? Is there a project working on anything like this?

Been looking, not finding. Thanks!

[–] Robot_Graffiti@alien.top 1 points 1 year ago

"Simulate" is probably too strong a word for what happens when you ask an LLM what happens in a physical scenario. It's more like "make up a story." It doesn't picture the scene in its mind or do any kind of physics, it only thinks about the words.

But, in principle, yeah, you could make it tell you a thousand stories, and have it classify each story into predefined categories, and then use some other program to count up the totals (LLMs suck at counting).
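As a rough illustration of that pipeline, here is a minimal Python sketch: sample the same scenario many times at high temperature, make a second low-temperature call to label each story against fixed categories, and do the counting in ordinary code. It assumes an OpenAI-compatible local server (e.g. llama.cpp's server mode); the endpoint, model name, categories, and prompts are placeholders, not anything from the original post.

```python
# Sketch: many sampled "stories" -> per-story classification -> tally in Python.
from collections import Counter
from openai import OpenAI

BASE_URL = "http://localhost:8080/v1"      # hypothetical local endpoint
client = OpenAI(base_url=BASE_URL, api_key="not-needed")

SCENARIO = "We put a pen on the edge of a table. What happens next?"
CATEGORIES = ["stays put", "rolls off", "someone picks it up", "other"]
N_SAMPLES = 1000

def ask(prompt: str, temperature: float) -> str:
    resp = client.chat.completions.create(
        model="local-model",               # placeholder model name
        messages=[{"role": "user", "content": prompt}],
        temperature=temperature,
    )
    return resp.choices[0].message.content.strip()

counts = Counter()
for _ in range(N_SAMPLES):
    # High temperature so the stories actually differ from each other.
    story = ask(SCENARIO, temperature=1.0)

    # The "perpendicular" pass: a low-temperature call that only picks
    # one of the predefined categories for this story.
    label = ask(
        f"Classify this outcome as exactly one of {CATEGORIES}. "
        f"Reply with the category only.\n\n{story}",
        temperature=0.0,
    )
    counts[label if label in CATEGORIES else "other"] += 1

# The counting happens in regular code, not in the LLM.
for category, n in counts.most_common():
    print(f"{category}: {n}")
```

The key design point from the comment is the last step: the LLM only generates and labels text, while the totals come from `Counter`, since asking the model itself to count a thousand items tends to go wrong.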