This post was submitted on 29 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


Probably a stupid question, but I just bought a second 4090 for larger LLMs and was wondering whether that's any faster than just having bought two 3090s.

Dry-Vermicelli-682@alien.top · 9 months ago

Honestly, at $2K or so a pop for 4090s, I'd have bought the M3 Max MacBook Pro with 128GB of RAM instead. A video the other day showed it loading and running a 70B LLM just fine, while a single 4090 couldn't. Two 4090s with 48GB of combined VRAM probably can, but will likely be slower, and having a top-of-the-line laptop that doubles as a pretty beefy AI box for about the same price seems like money better spent.
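
For reference, here's a minimal sketch of the weight-memory math behind that claim. The parameter count and bytes-per-parameter figures are ballpark assumptions (weights only, ignoring KV cache, activations, and framework overhead), not measurements from the video:

```python
# Back-of-the-envelope weight-memory math for a 70B-parameter model.
# Ballpark figures only: ignores KV cache, activations, and runtime overhead.

PARAMS_BILLIONS = 70  # e.g. Llama-2-70B

# Approximate bytes per parameter at common precisions/quantizations.
bytes_per_param = {
    "fp16": 2.0,
    "int8": 1.0,
    "4-bit": 0.5,
}

for precision, bpp in bytes_per_param.items():
    weights_gb = PARAMS_BILLIONS * bpp  # billions of params * bytes/param ~= GB
    print(f"{precision:>5}: ~{weights_gb:.0f} GB of weights")

# Rough fit check against the hardware in this thread:
#  fp16: ~140 GB -> too big even for two 4090s (48 GB total VRAM)
#  int8:  ~70 GB -> still doesn't fit in 48 GB
# 4-bit:  ~35 GB -> fits split across two 4090s, not in one (24 GB),
#                   and easily inside 128 GB of unified memory on the Mac
```

So the single-4090 failure and the 128GB Mac success in that video are consistent with simple quantized-model sizes: a 70B model only fits on a lone 24GB card at quantization levels aggressive enough to hurt quality.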