this post was submitted on 31 Oct 2023

LocalLLaMA


Community to discuss about Llama, the family of large language models created by Meta AI.

founded 10 months ago

I want to build a PC for inference and training of Local LLMs and Gaming. I've decided to go with an RTX 4090 and a used RTX 3090 for 48GB VRAM for loading larger models as well as a decent enough speed.
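For sizing purposes, here is a rough sketch of why 48GB of VRAM is the target. The formula and overhead factor are illustrative assumptions (weights × quantization width, plus ~20% for KV cache and buffers), not measured numbers:

```python
# Rough VRAM estimate for loading an LLM (illustrative, not benchmarked).
def model_vram_gb(params_billions: float, bytes_per_param: float,
                  overhead: float = 1.2) -> float:
    """Weights at the given quantization width, plus ~20% for
    KV cache and runtime buffers (assumed overhead factor)."""
    return params_billions * bytes_per_param * overhead

# A 70B model at 4-bit (0.5 bytes/param) vs 16-bit (2 bytes/param):
print(round(model_vram_gb(70, 0.5), 1))  # ~42 GB -> fits in 48GB across two cards
print(round(model_vram_gb(70, 2.0), 1))  # ~168 GB -> does not fit
```

So 4090 + 3090 gives enough headroom for a quantized 70B-class model, which is the practical upper end for this budget.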

What motherboard, PSU and Cabinet should I choose? Ideally I'd want to run both cards at least with x8 PCIe slots and will also add 128GB DDR5 RAM in this build.

Also, should I go with Intel i9-13900K CPU or with a Ryzen variant?

Thanks.

1 comment
Moist_Influence1022@alien.top · 10 months ago

Because you said you want to use it for gaming, stick with consumer boards.

Dual GPU: Z790 boards. Memory: max 96GB (2× 48GB sticks).

I hear a lot about instability issues when four DDR5 sticks are used, so 128GB of DDR5 is not a great option right now.

PSU: 1600W, maybe 1200W if you don't get an i9-13900K,

but go 1600W to be safe.
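The 1600W figure checks out on a back-of-envelope basis. The wattages below are the vendor-rated maximums (450W TGP for the 4090, 350W for the 3090, 253W maximum turbo power for the 13900K); the rest-of-system allowance and the headroom factor are assumptions:

```python
# Back-of-envelope PSU sizing for the dual-GPU build.
# GPU/CPU figures are vendor-rated maximums; the last line is a rough allowance.
loads_w = {
    "RTX 4090": 450,
    "RTX 3090": 350,
    "i9-13900K (max turbo)": 253,
    "board/RAM/drives/fans": 150,  # assumed allowance
}

total = sum(loads_w.values())   # ~1200 W worst-case sustained draw
recommended = total / 0.8       # run the PSU at <=80% load for spikes
print(total, round(recommended))
```

That lands at roughly 1500W recommended capacity, so a 1600W unit is the sensible choice; these GPUs also have transient spikes above their rated power, which is another reason not to cut it close.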

That's basically my setup, but with two used 3090s.

Everything else is a gamble, because we don't know what will change in the future.