this post was submitted on 13 Nov 2023
LocalLLaMA
Community to discuss about Llama, the family of large language models created by Meta AI.
If you're planning on running the models entirely on the GPUs, your choice of CPU won't really affect the speeds you get. I'd go with the Intel since this is your first PC build. I built a 7950X rig a couple of months ago; I didn't have problems getting it to boot, but it absolutely had a fit over running four sticks of DDR5-6000 at their rated speed. The rated speed is really only valid for two sticks.