LocalLLaMA
Community to discuss Llama, the family of large language models created by Meta AI.
44GB of GPU VRAM? What GPU has 44GB other than stupidly expensive ones? Are average folks running $25K GPUs at home? Or are the people running these working for companies with lots of money, building small GPU servers?
Dual 3090s/4090s: 2 × 24GB gets you 48GB, enough headroom for a 44GB model. Still pricey as hell, but not out of reach for some folks.
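If you want to sanity-check your own rig, here's a minimal sketch (assuming PyTorch with CUDA is installed) that sums VRAM across all visible cards:

```python
import torch

# Sum VRAM across all visible CUDA devices to see whether a model
# that needs ~44 GB fits. Two 24 GB 3090s report roughly 48 GB total.
total_gb = 0.0
for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    gb = props.total_memory / 1024**3
    print(f"GPU {i}: {props.name}, {gb:.1f} GB")
    total_gb += gb

needed_gb = 44  # figure quoted upthread
print(f"Total: {total_gb:.1f} GB -> {'fits' if total_gb >= needed_gb else 'too small'}")
```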
So anyone wanting to play around with this at home should expect to drop about $4K or so on GPUs and a setup?
I can get two 3090s for €1,200 here on the second-hand market.