this post was submitted on 19 Nov 2023

LocalLLaMA


Community to discuss about Llama, the family of large language models created by Meta AI.

founded 1 year ago

Currently I have two 4090s and I am looking to upgrade my machine. For well-known reasons, the prices of the 4090 and 3090 are insanely high right now, so I am looking at another option: a "magic-modded" 3080 upgraded to 20 GB of VRAM.

My aim is to use QLoRA to fine-tune a 34B model. The stated single-card VRAM requirement for fine-tuning a 34B model with QLoRA is 24 GB, and the price of two 4090s is about equal to eight 20 GB 3080s. So which would be the better choice for a multi-card setup?

4090x2 or 3080 20g x8?
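For context on why 24 GB is roughly the floor for a 34B QLoRA run, here is a back-of-the-envelope VRAM estimate. All of the component figures (quantization overhead fraction, adapter/optimizer size, activation budget) are assumptions for illustration, not measured values:

```python
# Rough VRAM estimate for QLoRA fine-tuning of a 34B model.
# Every constant below is a back-of-the-envelope assumption.

def qlora_vram_estimate_gb(n_params_billions: float) -> float:
    base_weights = n_params_billions * 0.5       # 4-bit (NF4) base weights: ~0.5 bytes/param
    quant_overhead = base_weights * 0.05         # quantization constants etc. (assumed ~5%)
    lora_and_optimizer = 1.0                     # LoRA adapters + optimizer states (assumed ~1 GB)
    activations = 4.0                            # activations at modest batch/seq len (assumed)
    return base_weights + quant_overhead + lora_and_optimizer + activations

print(f"{qlora_vram_estimate_gb(34):.1f} GB")   # lands just under 24 GB for a 34B model
```

Under these assumptions the total comes out a little under 24 GB, which is consistent with the 24 GB single-card figure: the modded 20 GB 3080 would be below that threshold for a 34B model on a single card, so an 8x setup would depend on splitting the model across cards rather than fitting it on one.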

[–] FullOf_Bad_Ideas@alien.top 1 points 11 months ago

You already have two RTX 4090s? Then why upgrade? I don't get it. Tell me more about that magic modding; I can't find anything about it online.