this post was submitted on 25 Nov 2023
LocalLLaMA
Community to discuss about Llama, the family of large language models created by Meta AI.
Yes. This is known as Mixture of Experts (MoE). We already have several promising approaches for doing this (see the sketch below).
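A minimal sketch of the core idea, assuming a standard top-k gated MoE layer in PyTorch with simple MLP experts; production systems (Mixtral, Switch Transformer, etc.) add load-balancing losses and expert capacity limits on top of this:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, dim: int, num_experts: int = 8, top_k: int = 2, hidden: int = 2048):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(dim, num_experts)  # router: scores each expert per token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, hidden), nn.GELU(), nn.Linear(hidden, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (tokens, dim)
        scores = self.gate(x)                             # (tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)    # keep only the k best experts per token
        weights = F.softmax(weights, dim=-1)              # renormalize over the chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                      # tokens whose k-th choice is expert e
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

# usage: tokens = torch.randn(16, 512); y = MoELayer(dim=512)(tokens)
```

Only the selected experts run for each token, which is what keeps inference cost roughly constant while total parameter count grows with the number of experts.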
I can't believe I hadn't run into this. Would you indulge me on the implications for agentic systems like AutoGen? I've been working on having experts cooperate that way rather than combining them into a single model.
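For contrast with the token-level routing above, here is a rough, framework-agnostic sketch (not AutoGen's actual API) of the "experts as cooperating agents" idea: each expert is a separate model call with its own system prompt, and a lightweight router decides who answers. `call_llm` is a hypothetical stand-in for whatever backend you use.

```python
from typing import Dict

def call_llm(system_prompt: str, user_message: str) -> str:
    """Placeholder for a real LLM call (local model, API, etc.)."""
    raise NotImplementedError

EXPERTS: Dict[str, str] = {
    "coder": "You are an expert programmer. Answer with working code.",
    "researcher": "You are a careful researcher. Explain your reasoning.",
    "critic": "You review other answers and point out mistakes.",
}

def route(task: str) -> str:
    """Ask a 'router' model which expert should handle the task."""
    choice = call_llm("Reply with exactly one of: " + ", ".join(EXPERTS), task).strip()
    return choice if choice in EXPERTS else "researcher"

def solve(task: str) -> str:
    expert = route(task)
    draft = call_llm(EXPERTS[expert], task)
    # a second, critic agent cooperates by reviewing the draft
    review = call_llm(EXPERTS["critic"], f"Task: {task}\nDraft: {draft}")
    return call_llm(EXPERTS[expert], f"Revise your draft.\nReview: {review}\nDraft: {draft}")
```

The key difference from a MoE layer is where the routing lives: inside the model's forward pass per token, versus outside the models per task, with whole conversations passed between experts.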