this post was submitted on 25 Nov 2023
1 points (100.0% liked)
LocalLLaMA
Community for discussing Llama, the family of large language models created by Meta AI.
China has a lot of used crypto GPU farms, where racks of GPUs were chugging away at crypto mining. How hard would it be to convert them for AI use?
That depends. Most crypto farms run on low-cost motherboard/CPU combos with 8+ GPUs, each connected via a single PCIe lane. To do training, or even inference, on that hardware you would need to relocate the GPUs to a more capable system and limit it to at most 4 cards. At that point, if you are talking about cards with 8 GB of VRAM or less, you end up with a system that is expensive to set up and run, has only 32 GB of total VRAM, and delivers fairly low performance. That is why it is the 16 GB+ cards that are all disappearing.
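To put rough numbers on the single-lane bottleneck, here is a minimal back-of-envelope sketch in Python. The bandwidth figures are approximate PCIe 3.0 values, and the ~28 GB quantized model split across four 8 GB cards is a hypothetical example, not something stated in the thread:

```python
# Back-of-envelope: how long it takes just to copy model shards to each
# GPU over the bus. Mining rigs typically use x1 risers; a proper
# workstation slot gives x16. Figures are approximate PCIe 3.0 usable
# bandwidth (8 GT/s with 128b/130b encoding ~ 0.985 GB/s per lane).
PCIE3_PER_LANE_GBPS = 0.985

def shard_load_time_s(shard_gb: float, lanes: int) -> float:
    """Seconds to copy one model shard to a card over PCIe 3.0."""
    return shard_gb / (PCIE3_PER_LANE_GBPS * lanes)

# Hypothetical ~28 GB quantized model split across four 8 GB cards,
# i.e. roughly a 7 GB shard per card.
shard_gb = 28.0 / 4

print(f"x1 riser (mining rig): {shard_load_time_s(shard_gb, 1):6.1f} s per shard")
print(f"x16 slot (workstation): {shard_load_time_s(shard_gb, 16):6.1f} s per shard")
```

Roughly 7 s versus under half a second per shard, and the same ~16x gap applies to every cross-card transfer during layer-split inference, which is why the cards need rehousing rather than reuse in place.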
It depends on what you do with them. I think they can be very useful. See my post elsewhere in this thread:
https://www.reddit.com/r/LocalLLaMA/comments/183na9z/china_is_retrofitting_consumer_rtx4090s_with_2/kasawk5/