this post was submitted on 30 Nov 2023
LocalLLaMA
Community to discuss about Llama, the family of large language models created by Meta AI.
The 3090 doesn't support PCIe 5.0, only 4.0.
The 4090 is also a PCIe 4.0 card (PCIe 5.0 only arrived with later generations), so the real question is x8 vs. x16 at 4.0, and running cards at x8 can still make sense, but only if you have a pallet of these GPUs.
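To put rough numbers on the lane math: an x8 link at one PCIe generation carries about the same usable bandwidth as an x16 link one generation older, since each generation doubles the per-lane transfer rate. A minimal sketch of that arithmetic (the helper name and the simplified 128b/130b encoding assumption are mine, not from the thread):

```python
# Transfer rate per lane in GT/s for PCIe generations 3-5.
GT_PER_LANE = {3: 8.0, 4: 16.0, 5: 32.0}

def pcie_bandwidth_gbps(gen: int, lanes: int) -> float:
    """Approximate one-direction usable bandwidth in GB/s.

    Applies the 128b/130b line-code overhead used by PCIe 3.0+
    and converts bits to bytes. Illustrative only; ignores
    protocol (TLP/DLLP) overhead.
    """
    return GT_PER_LANE[gen] * lanes * (128 / 130) / 8

# x16 PCIe 4.0 and x8 PCIe 5.0 land at the same ~31.5 GB/s;
# x8 PCIe 4.0 is half that, ~15.8 GB/s.
print(pcie_bandwidth_gbps(4, 16))
print(pcie_bandwidth_gbps(5, 8))
print(pcie_bandwidth_gbps(4, 8))
```

So splitting a pair of 4.0 cards across x8 slots halves each card's host bandwidth, which mostly matters for model loading and multi-GPU transfers rather than single-GPU inference.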