this post was submitted on 20 Nov 2023
LocalLLaMA
Community to discuss Llama, the family of large language models created by Meta AI.
Theoretically doable, practically unlikely. Battery life would take a significant hit, and 3B/7B models don't provide THAT much benefit to justify taking it.
It is something to consider in the future, though. Like, 5 years from now we will probably have SoCs that are efficient enough to do it live.
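To put rough numbers on the battery hit, here's a minimal back-of-envelope sketch. Every figure in it is a guess for illustration (4-bit 7B model, ~50 GB/s effective phone memory bandwidth, ~5 W sustained SoC draw, ~15 Wh battery), not a measurement. Token generation is typically memory-bandwidth bound, so throughput is roughly bandwidth divided by model size:

```python
# Back-of-envelope estimate of the battery cost of on-device LLM inference.
# Every number here is an assumption for illustration, not a measurement.

model_size_gb = 3.5        # ~7B params at 4-bit quantization
mem_bandwidth_gbps = 50.0  # assumed effective LPDDR bandwidth on a phone
soc_power_w = 5.0          # assumed sustained SoC draw during generation
battery_wh = 15.0          # ~4,000 mAh battery at ~3.85 V

# Token generation is memory-bandwidth bound: each token streams the
# full weight set from RAM, so throughput ~ bandwidth / model size.
tokens_per_s = mem_bandwidth_gbps / model_size_gb

# Energy per token, and total runtime on one charge at full tilt.
energy_per_token_j = soc_power_w / tokens_per_s
hours_continuous = battery_wh / soc_power_w

print(f"~{tokens_per_s:.1f} tok/s")
print(f"~{energy_per_token_j:.2f} J/token")
print(f"~{hours_continuous:.1f} h of continuous generation per charge")
```

Under those assumptions you get around 14 tok/s and about three hours of continuous generation on a full charge. Fine in short bursts, but brutal for anything truly "live".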