This is why you can't find one at your local Best Buy. They are paying a premium for them. But they would indeed be very helpful if I could get my hands on a few for my build.
Phind-CodeLlama 34B is the best model for general programming, and some technical work as well. It's a bad joker, though: it only does serious work. Try quantized models if you don't have access to an A100 80GB or multiple GPUs. A 4-bit quantization can fit in a 24GB card.
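A quick back-of-envelope check on why 4-bit fits, assuming ~34B parameters and counting only the weights (real usage adds KV-cache and activation overhead on top):

```python
# Rough VRAM estimate for a 34B-parameter model at different precisions.
# Decimal GB, weights only -- actual usage is somewhat higher.

PARAMS = 34e9  # Phind-CodeLlama 34B

def weight_gb(bits_per_param: float) -> float:
    """Gigabytes needed just to hold the weights."""
    return PARAMS * bits_per_param / 8 / 1e9

print(f"fp16:  {weight_gb(16):.0f} GB")  # 68 GB -> A100 80GB or multiple GPUs
print(f"8-bit: {weight_gb(8):.0f} GB")   # 34 GB
print(f"4-bit: {weight_gb(4):.0f} GB")   # 17 GB -> fits a 24GB card with headroom
```

So at 4 bits the weights alone take about 17 GB, leaving a few gigabytes on a 24GB card for the KV cache and context.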