this post was submitted on 25 Nov 2023
1 point (100.0% liked)
LocalLLaMA
Community to discuss Llama, the family of large language models created by Meta AI.
Meh, I'm running 13Bs on a 13900K with 64 GB of DDR5 and a 6900 XT in LM Studio, and it's faster than my office workstation's 12900KS with a 3090 Ti. Sometimes RAM and a processor plus decent VRAM are enough.
Your comparison proves his point! A 13B will fit snugly in your 6900, so this is a head-to-head comparison of the cards!