LLMs are not meant for laptops unless your limit is 3B or 7B
this post was submitted on 01 Nov 2023
You can run an LLM on a CPU: models up to and including 13B generate text about as fast as one can read (roughly 5 tokens per second), 30B runs faster than a real-time chat with someone, and 70B runs... about at the speed of chatting with someone.

So running an LLM on a CPU is viable. I am also interested in a comparison of Ryzen vs. Intel Core.
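To make the CPU setup concrete, here is a minimal sketch of CPU-only inference using the llama-cpp-python bindings. The model filename, thread count, and context size are assumptions; adjust them for your own hardware and whichever quantized GGUF model you actually have.

```python
# Minimal sketch: CPU-only inference with llama-cpp-python.
# The model path below is a placeholder for a quantized GGUF file.
from llama_cpp import Llama

llm = Llama(
    model_path="./llama-2-13b.Q4_K_M.gguf",  # hypothetical 4-bit 13B model
    n_threads=8,   # CPU threads; match your physical core count
    n_ctx=2048,    # context window size
)

output = llm(
    "Explain in one paragraph why token generation speed matters for chat:",
    max_tokens=128,
)
print(output["choices"][0]["text"])
```

On a typical 8-core desktop CPU, a 4-bit 13B model lands in the single-digit tokens-per-second range, which matches the "as fast as one can read" observation above; larger models scale down from there.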