this post was submitted on 25 Nov 2023
1 points (100.0% liked)
LocalLLaMA
Community to discuss Llama, the family of large language models created by Meta AI.
you are viewing a single comment's thread
view the rest of the comments
Depends on what's left over after the GPU(s). With at least a grand to spare, I could pick up older EPYC boards that would solve those problems. Also, denser GPUs mean you don't need as many GPU slots.
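
To make the density point concrete, here's a minimal sketch of the slot math. All numbers (the 96 GB target and the per-card capacities) are hypothetical assumptions for illustration, not figures from the comment:

```python
import math

# Hypothetical target: total VRAM you want across all cards (assumption).
target_vram_gb = 96

# Per-card VRAM capacities to compare (hypothetical examples).
for card_vram_gb in (12, 24, 48):
    # Number of physical GPU slots needed to reach the target.
    slots = math.ceil(target_vram_gb / card_vram_gb)
    print(f"{card_vram_gb} GB cards: {slots} slot(s) needed")
```

With these assumed numbers, 12 GB cards need 8 slots, 24 GB cards need 4, and 48 GB cards need only 2, which is why denser cards relax the motherboard/slot requirements.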