this post was submitted on 25 Nov 2023
LocalLLaMA


Community to discuss about Llama, the family of large language models created by Meta AI.

This is the reason why you can't find any in your local Best Buy. They are paying a premium for them. But it would indeed be very helpful if I could get my hands on a few for my build.

[–] ElectroFried@alien.top 1 points 10 months ago

That 'depends'. Most crypto farms run on low-cost motherboard/CPU combos with 8+ GPUs, each essentially connected via a single PCIe lane. If you wanted to do training or even inference on that hardware, you would need to relocate those GPUs to a more capable system and then limit yourself to a maximum of 4 cards per system. At that point, if you are talking about cards with 8GB of VRAM or less, you have a system that is expensive to set up and run, with only 32GB of total VRAM and fairly low performance. That is why it's the higher-capacity 16GB+ cards that are all disappearing.
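To put rough numbers on that 32GB figure, here is a minimal sketch. The card count, per-card VRAM, and the ~2 bytes per parameter rule of thumb for fp16 weights are all illustrative assumptions, and the estimate ignores KV cache and activation overhead:

```python
# Sketch: aggregate VRAM of a repurposed 4-card mining rig vs. rough
# fp16 model weight footprints. All figures are illustrative assumptions.

CARDS = 4              # practical per-system limit mentioned above
VRAM_PER_CARD_GB = 8   # typical low-end mining card

total_vram_gb = CARDS * VRAM_PER_CARD_GB  # 32 GB aggregate

def fp16_weights_gb(params_billions: float) -> float:
    """Rough fp16 footprint: ~2 bytes per parameter (weights only,
    ignoring KV cache and activation memory)."""
    return params_billions * 2

for size_b in (7, 13, 34, 70):
    need = fp16_weights_gb(size_b)
    verdict = "fits" if need <= total_vram_gb else "does not fit"
    print(f"{size_b}B model: ~{need:.0f} GB fp16 -> {verdict} in {total_vram_gb} GB")
```

Under these assumptions a 13B model squeezes into the 32GB aggregate, while 34B+ models do not, which is consistent with why higher-VRAM cards are the sought-after ones.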