this post was submitted on 29 Nov 2023
1 points (100.0% liked)

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


Probably stupid of me, but I just bought a second 4090 for running larger LLMs, and I was wondering whether that is actually any faster than buying two 3090s would have been.

[–] ThisGonBHard@alien.top 1 points 11 months ago

Two 3090s might be faster, or around the same speed, since the 3090 supports NVLink and the 4090 does not.
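
For anyone wondering why the link matters: the usual way to run a model too big for one card is to shard its layers across both GPUs, so the activations have to hop over the GPU-to-GPU link once per forward pass, and that hop goes over NVLink on 3090s but over plain PCIe on 4090s. A minimal sketch of that setup, assuming the Hugging Face transformers and accelerate libraries are installed (the model ID is just illustrative):

```python
# Sketch: splitting one large model across two GPUs with layer-wise sharding.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-70b-chat-hf"  # illustrative: any model too big for one card

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",  # accelerate shards the layers across cuda:0 and cuda:1
)

prompt = "Explain NVLink in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

You can check what link the cards actually share with `nvidia-smi topo -m`: an `NV#` entry between the two GPUs means NVLink is active, while `PHB` or `SYS` means the traffic crosses PCIe and the host bridge.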