tl;dr: can I just drop a used 3090 into my old PC, or would a new 4060 Ti be the safer option?
Hi all!
I really want to get my feet wet running local LLMs, especially:
- inference with 7B models
- some QLoRA fun
I'd also like to have fun running bigger quantized models and, if possible, fine-tune some smallish model like GPT-2 XL (~1.5B params); if that's not feasible locally, I'll just rent some cloud compute. A little gaming (Escape from Tarkov) in my free time wouldn't hurt either.
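For context on why VRAM is the deciding factor here, this is the back-of-envelope estimate I've been using (my own rule of thumb, not an official formula): weight storage plus roughly 20% headroom for activations and KV cache.

```python
def est_vram_gb(params_billion: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    """Rough VRAM estimate: weight bytes plus ~20% for activations/KV cache."""
    weight_gb = params_billion * bits_per_weight / 8  # 1B params at 8 bits = ~1 GB
    return weight_gb * overhead

# a 7B model, fp16 vs. 4-bit quantized
print(est_vram_gb(7, 16))  # ~16.8 GB -> tight even on a 3090
print(est_vram_gb(7, 4))   # ~4.2 GB  -> fits easily in 16 GB
```

By this estimate the 16GB card handles quantized 7B with plenty of room, while full fp16 7B (or QLoRA on anything bigger) is where the 24GB card starts to pay off.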
I've figured out that my best GPU options are:
- 4060 Ti 16GB for around €450 new, hoping for some Black Friday deals
- 3090 24GB used for around €700
My current (very old) PC specs are the following:
- i5-2500 @ 3.3GHz
- 16GB DDR3
- Asus P8P67, LGA1155 (PCIe 2.0 x16)
- Sapphire AMD R9 270
- a 600W PSU
So my questions are:
- Can I afford to invest all my budget in the 3090? I have a second PSU at home that would be used just to power the GPU outside the case
- Or is it better to buy the 4060 Ti and use the remaining budget to upgrade the older parts (in that case, which ones?)
Thanks for the help guys!
I went with the 4060 Ti 16GB on a small Black Friday sale because I didn't want to upgrade my PSU, and it's my workstation for both my jobs, so I didn't want to mess with a used or refurbished 3090, which would be a lot more money once I factor in the PSU.
This way it cost me less than half as much for new gear, and my goal is just to run 20B, 13B, and smaller coding models; in time I expect something with higher VRAM will come out at a reasonable price without requiring a huge PSU.
I also have 64GB of RAM if I need to call on a larger model.
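Spilling a larger model into system RAM is straightforward with Transformers + Accelerate: `device_map="auto"` fills VRAM first and offloads the remaining layers to CPU memory. A minimal sketch, assuming a 4-bit bitsandbytes load (the model id is just an example, swap in whatever you actually run):

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# example model id; substitute the model you actually use
model_id = "mistralai/Mistral-7B-v0.1"

bnb = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

# device_map="auto" fills the 16GB of VRAM first,
# then places whatever doesn't fit into system RAM
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb,
    device_map="auto",
)
```

Offloaded layers run much slower than on-GPU ones, so this is a "can it run at all" fallback rather than a performance plan.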
Thanks for the feedback! I think my situation matches yours a lot!