Save the $$$ for a few months and go buy a used 3090 or two. It'll be worth it in the long run, and save the headaches of trying to Frankenstein a bunch of 8 GB cards together.
this post was submitted on 15 Nov 2023
LocalLLaMA
Community to discuss about Llama, the family of large language models created by Meta AI.
I would replace the DDR5 RAM rather than add to it, or your memory will run a lot slower, and you just don't need it if you're going to use GPUs for inference. Also, a P40 is probably money better spent with this config than the P2200.
Thing is, I have the P2200 sitting on my shelf right now from my dad's old workstation, so I wouldn't have to buy it.
13 GB doesn't get you far, especially when part of it is used for graphics and it's all old Pascal architecture.
By all means, just put the card in and see where it gets you on 13B.
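For a rough sense of whether a 13B model fits, here's a back-of-the-envelope VRAM sketch. The helper and the flat overhead number are my own assumptions, not anything from this thread; real usage also depends on context length and KV cache size.

```python
# Back-of-the-envelope VRAM estimate for running a quantized LLM.
# overhead_gb is a rough placeholder for KV cache and runtime buffers.

def estimate_vram_gb(params_billion: float, bits_per_weight: int,
                     overhead_gb: float = 1.5) -> float:
    """Weight memory (params * bits / 8) plus a flat overhead guess."""
    weights_gb = params_billion * bits_per_weight / 8
    return weights_gb + overhead_gb

# 13B at 4-bit quantization: 6.5 GB of weights plus overhead -> ~8 GB,
# which is why it's tight but plausible on a 13 GB card shared with graphics.
print(round(estimate_vram_gb(13, 4), 1))   # -> 8.0

# The same 13B model at FP16 needs ~26 GB of weights alone:
print(round(estimate_vram_gb(13, 16), 1))  # -> 27.5
```

So a quantized 13B can squeeze onto the card, but an unquantized one is out of the question.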