this post was submitted on 25 Nov 2023
LocalLLaMA
Community to discuss about Llama, the family of large language models created by Meta AI.
Why are they getting 4090s when 3090s have the same 24 GB of memory?
Because 4090s are faster. Companies don't use these cards for inference the way most people do at home; that workload is low compute and mostly memory-bandwidth bound. Companies use them for training, which is compute heavy, and there a 4090 is much faster than a 3090.
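A rough roofline-style sketch of that point: single-stream decoding reads every weight once per token, so its ceiling scales with memory bandwidth, while training throughput scales with raw compute. The spec numbers below are approximate published values, and the 7B FP16 model is an illustrative assumption.

```python
# Why inference favors bandwidth while training favors compute.
# Spec numbers are approximate published values (assumption);
# the 7B-parameter FP16 model is illustrative.

GB = 1e9
specs = {
    "RTX 3090": {"bandwidth_gb_s": 936.0, "fp16_tflops": 71.0},
    "RTX 4090": {"bandwidth_gb_s": 1008.0, "fp16_tflops": 165.0},
}

model_bytes = 7e9 * 2  # 7B parameters at 2 bytes each (FP16)

for name, s in specs.items():
    # Decoding one token streams all weights from VRAM once,
    # so tokens/s is bounded by bandwidth / model size.
    tok_s = s["bandwidth_gb_s"] * GB / model_bytes
    print(f"{name}: ~{tok_s:.0f} tokens/s single-stream ceiling")

bw_ratio = specs["RTX 4090"]["bandwidth_gb_s"] / specs["RTX 3090"]["bandwidth_gb_s"]
fl_ratio = specs["RTX 4090"]["fp16_tflops"] / specs["RTX 3090"]["fp16_tflops"]
print(f"bandwidth ratio (inference-bound): {bw_ratio:.2f}x")
print(f"compute ratio (training-bound):    {fl_ratio:.2f}x")
```

Under these numbers the 4090 only buys ~8% more for bandwidth-bound home inference, but well over 2x for compute-bound training, which is the gap companies are paying for.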
And meanwhile people are busy putting 48 GB on those 3090s.
https://www.techpowerup.com/img/erPhoONBSBprjXvM.jpg
What? Is that 48 GB 3090 available to consumers?