this post was submitted on 18 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


Was wondering if there's any way to use a bunch of old equipment like this to build an at-home crunch center for running your own LLM, and whether it would be worth it.

[–] Murky-Ladder8684@alien.top 1 points 11 months ago

That series of Nvidia GPUs didn't have tensor cores yet; I believe those started with the 20xx (Turing) series on the consumer side. I'm not sure how much that impacts inference vs. training/fine-tuning, so it's worth doing more research. From what I gathered, the answer is "no" unless you use a 10xx card for something like monitor output, TTS, or another smaller co-LLM task that you don't want taking VRAM away from your main LLM GPUs.
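If you want to check a card yourself: tensor cores arrived with CUDA compute capability 7.0 (Volta, then consumer Turing 20xx at 7.5), while Pascal 10xx cards are 6.x. A minimal sketch of that generation check (the `has_tensor_cores` helper is just illustrative, not a real library function; on a machine with PyTorch installed you'd feed it `torch.cuda.get_device_capability()`):

```python
def has_tensor_cores(major: int, minor: int) -> bool:
    """Return True if a CUDA compute capability implies tensor cores.

    Tensor cores were introduced with compute capability 7.0 (Volta);
    consumer RTX 20xx (Turing) cards are 7.5. Pascal GTX 10xx cards
    are 6.1 and lack them entirely.
    """
    return (major, minor) >= (7, 0)


# Known examples (compute capabilities from Nvidia's published tables):
print(has_tensor_cores(6, 1))  # GTX 1080 (Pascal)  -> False
print(has_tensor_cores(7, 5))  # RTX 2080 (Turing)  -> True
print(has_tensor_cores(8, 6))  # RTX 3080 (Ampere)  -> True
```

With PyTorch available, `major, minor = torch.cuda.get_device_capability(0)` gives you the numbers to pass in for the card actually installed.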