this post was submitted on 19 Nov 2023
1 points (100.0% liked)

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.

 

Currently I have 2x 4090 and I'm looking to upgrade my machine. For well-known reasons, 4090 and 3090 prices are insanely high right now, and I see another option: a 3080 modded to 20GB of VRAM.

My aim is to use QLoRA to fine-tune a 34B model. The single-card VRAM requirement for QLoRA fine-tuning of a 34B model is 24GB, and the price of 2x 4090 is about equal to 8x 3080 20GB. So which would be the better choice for a multi-card setup?

2x 4090 or 8x 3080 20GB?
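For context, here is a minimal sketch of the kind of QLoRA setup being discussed, assuming the Hugging Face transformers/peft/bitsandbytes/accelerate stack; the CodeLlama-34B checkpoint, LoRA rank, and target modules are placeholder assumptions, not recommendations:

```python
# Minimal QLoRA fine-tuning sketch (assumes transformers, peft, bitsandbytes,
# and accelerate are installed; the 34B checkpoint below is a stand-in).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_id = "codellama/CodeLlama-34b-hf"  # hypothetical 34B base model

# 4-bit NF4 quantization is what keeps a 34B model within ~24GB of VRAM
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # spreads layers across however many GPUs are visible
)
model = prepare_model_for_kbit_training(model)

# Only the small LoRA adapters are trained; the 4-bit base weights stay frozen
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```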

[–] 314kabinet@alien.top 1 points 10 months ago (4 children)

What kind of mobo would you put the 8x 3080s into without killing the bandwidth and creating tons of headaches for yourself in getting it to work? There was a similar post here recently. Just stick to the 4090s.

[–] WitchSayo@alien.top 1 points 10 months ago (1 children)

emmmm, I'm not sure, all I can do is just plug them into the motherboard.

Do you have a link to the post please? I'd like to check it out.

[–] CKtalon@alien.top 1 points 10 months ago (1 children)

Does your motherboard even have 8 PCIe slots? You'll be needing server boards. Even workstation boards typically support only 7 (and can't fit them all due to size).

[–] WitchSayo@alien.top 1 points 10 months ago (1 children)

Using a PCIe splitter cable, I'd split each PCIe 4.0 x16 slot into two PCIe 4.0 x8 links, and every GPU would sit on a PCIe extension (riser) cable.
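One thing worth verifying with a splitter/riser setup like this is that each card actually negotiates the expected link. A quick sketch, assuming nvidia-smi is on the PATH (the query fields used are standard nvidia-smi properties); after splitting an x16 slot into two x8 links, each GPU should report a link width of 8:

```python
# Check negotiated PCIe generation and link width for every visible GPU.
import subprocess

out = subprocess.run(
    [
        "nvidia-smi",
        "--query-gpu=index,name,pcie.link.gen.current,pcie.link.width.current",
        "--format=csv,noheader",
    ],
    capture_output=True,
    text=True,
    check=True,
)
for line in out.stdout.strip().splitlines():
    print(line)  # e.g. "0, NVIDIA GeForce RTX 3080, 4, 8"
```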

[–] The_Last_Monte@alien.top 1 points 10 months ago

I would recommend against this. Not to say you are incapable or wouldn't figure it out, but it's a headache. The power consumption on something like this alone would drive you to a sizable power bill, not to mention the lack of any kind of NVLink. Your 4090 setup is a much better option in my opinion.
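For a rough sense of scale (assuming stock board power of about 320 W per 3080 and 450 W per 4090, before any power limiting): 8 × 320 W ≈ 2,560 W for the 3080 rig versus 2 × 450 W = 900 W for the pair of 4090s, i.e. roughly three times the draw before even counting PSU headroom and the rest of the system.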
