this post was submitted on 25 Nov 2023

LocalLLaMA

Community to discuss Llama, the family of large language models created by Meta AI.

This is the reason you can't find one at your local Best Buy: they are paying a premium for them. It would indeed be very helpful if I could get my hands on a few for my build.

[–] thebliket@alien.top 1 points 10 months ago (4 children)

Why are they getting 4090s when 3090s have the same 24GB of memory?

[–] fallingdowndizzyvr@alien.top 1 points 10 months ago (3 children)

Because 4090s are faster. Companies don't use these cards for inference the way most people do at home; inference is low-compute and mostly memory-bandwidth bound. Companies use them for training, which is compute-heavy, and there a 4090 is much faster than a 3090.

And they are busy putting 48GB on those 3090s.

https://www.techpowerup.com/img/erPhoONBSBprjXvM.jpg
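A rough back-of-envelope sketch of that point (the card specs below are approximate spec-sheet figures and the 7B FP16 model is a hypothetical example, purely for illustration): the two cards have nearly the same memory bandwidth, so their single-stream inference ceilings are close, but the 4090 has roughly twice the dense FP16 tensor throughput, which is what training cares about.

```python
# Back-of-envelope: why inference is bandwidth-bound and training is compute-bound.
# All numbers are approximate spec-sheet values, used only for illustration.

GB = 1e9
T = 1e12

cards = {
    # name: (memory bandwidth in bytes/s, approx. dense FP16 tensor FLOP/s)
    "RTX 3090": (936 * GB, 71 * T),
    "RTX 4090": (1008 * GB, 165 * T),
}

model_params = 7e9           # hypothetical 7B-parameter model
bytes_per_param_infer = 2    # FP16 weights
flops_per_param_train = 6    # common ~6 FLOPs/parameter/token training rule of thumb

for name, (bandwidth, flops) in cards.items():
    # Inference (batch size 1): each generated token streams all weights from VRAM,
    # so the ceiling is bandwidth / model size -- nearly identical on both cards.
    infer_tok_s = bandwidth / (model_params * bytes_per_param_infer)

    # Training: dominated by matrix-multiply FLOPs, so the faster card wins outright.
    train_tok_s = flops / (flops_per_param_train * model_params)

    print(f"{name}: ~{infer_tok_s:.0f} tok/s inference ceiling, "
          f"~{train_tok_s:.0f} tok/s training ceiling")
```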

[–] alexgand@alien.top 1 points 10 months ago

What? Was a 48GB 3090 ever available to consumers?
