It's extremely overpriced. With INT4 quantization, llama.cpp posts even crazier numbers. A system with 4090s can be built for $2,500 in India, and cheaper elsewhere for sure.
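The VRAM savings behind that claim are easy to sketch with back-of-envelope arithmetic (a rough illustration only; real GGUF quants like Q4_K average closer to ~4.5 bits/weight because of scale metadata, and KV cache needs extra room on top):

```python
def model_size_gb(n_params: float, bits_per_weight: float) -> float:
    # bytes = params * bits / 8; convert to GB (1e9 bytes)
    return n_params * bits_per_weight / 8 / 1e9

# Llama-2-70B weights: FP16 vs an INT4-class quant (~4.5 bits/weight assumed)
fp16 = model_size_gb(70e9, 16)    # ~140 GB -> far beyond consumer cards
int4 = model_size_gb(70e9, 4.5)   # ~39 GB  -> fits across two 24 GB 4090s
print(round(fp16), round(int4))
```

That roughly 3.5x shrink is why a multi-4090 box becomes a plausible alternative to datacenter hardware for inference.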
this post was submitted on 30 Oct 2023
LocalLLaMA
Community to discuss Llama, the family of large language models created by Meta AI.
Didn’t Nvidia ban the use of consumer-grade cards for professional use? You’d need A100s and the like for a datacenter.
Interesting, but I think there will be considerable bias depending on time of day, day of the week, season, closing date, etc.