LocalLLaMA

Community to discuss Llama, the family of large language models created by Meta AI.

I’ve recently completed my first PC build and loved the process of researching everything. Now I’m wondering: what is the most powerful system someone can build locally? I know about the typical consumer-facing high-end products like the RTX 4090 and Ryzen 9 7950X. But I’ve also seen server hardware, like AMD EPYC CPUs (seen in a recent post here: https://www.reddit.com/r/LocalLLaMA/s/FXMyFnEx3m), the A100, and HPUs, that have much greater capabilities. My goal is to eventually build a machine powerful enough to run multiple open-source LLMs unquantized.
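
For rough sizing, here’s a back-of-the-envelope sketch I put together (the 2 bytes/parameter and ~20% overhead figures are my own assumptions; actual usage depends on context length and batch size):

```python
# Back-of-the-envelope VRAM estimate for running an LLM unquantized.
# Assumptions (mine, not exact figures): fp16/bf16 weights at 2 bytes per
# parameter, plus ~20% overhead for KV cache and activations. Real usage
# varies with context length and batch size.
def vram_gb(params_billion: float, bytes_per_param: float = 2.0, overhead: float = 1.2) -> float:
    # billions of params * bytes per param * overhead ≈ gigabytes
    return params_billion * bytes_per_param * overhead

for name, size_b in [("7B", 7), ("13B", 13), ("70B", 70)]:
    print(f"{name}: ~{vram_gb(size_b):.0f} GB")
# 7B: ~17 GB, 13B: ~31 GB, 70B: ~168 GB
```

So even a single 70B model at fp16 already wants multiple high-capacity cards, which is roughly what pushes this out of consumer territory.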

Say you had around $20,000 to build the most powerful home setup. How would you do it?

Note: I realize that $20,000 is overkill for my goal. Just assume this is a thought experiment and money is no object.

[–] SirStagMcprotein@alien.top 1 points 11 months ago (1 children)

Yeah, but what about the other specifics? For example, even the most high-end consumer motherboards don’t have a lot of GPU slots, and their CPU RAM is capped at 128 GB. Are there any high-capacity motherboards you could buy as a base and then scale up? The scaling aspect is very important to me, as I would like to be able to add to what I previously bought rather than having to replace it.

[–] a_beautiful_rhind@alien.top 1 points 11 months ago

Depends on what is left over after the GPU(s). Having at least a grand left can net you older EPYC boards that would solve those problems. Also, denser GPUs mean you don't need as many GPU slots.
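
On the software side, here's a minimal sketch of how that scales (assuming the Hugging Face transformers + accelerate stack; the model ID is only an example): with `device_map="auto"` the weights get sharded across whatever cards are installed, so adding a GPU later doesn't require changing anything.

```python
# Minimal sketch (an assumption, not the poster's setup): load a model
# sharded across all visible GPUs, spilling to CPU RAM if needed.
# Assumes `transformers` and `accelerate` are installed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-13b-hf"  # example model
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # unquantized fp16 weights
    device_map="auto",          # shard layers across every available GPU
)
```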