--dany--

This is why you can't find any at your local Best Buy. They are paying a premium for them. But they would indeed be very helpful if I could get my hands on a few for my build.

[–] --dany--@alien.top 1 points 1 year ago (1 children)

Phind-CodeLlama 34B is the best model for general programming, and for some techy work as well. But it's a bad joker; it only does serious work. Try quantized models if you don't have access to an A100 80GB or multiple GPUs. A 4-bit quantization can fit in a 24GB card.
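To see why 4-bit fits, here's a back-of-the-envelope VRAM estimate for the weights alone. The 4.5 bits-per-parameter figure is an assumption (typical 4-bit quant formats carry some per-block overhead), and KV cache plus activations need extra headroom on top:

```python
# Rough weight-memory estimate for a 4-bit quantized 34B model.
# bits_per_param = 4.5 is an assumed average including quantization
# overhead; actual usage varies by quant format.
params = 34e9
bits_per_param = 4.5
weight_gb = params * bits_per_param / 8 / 1e9  # bytes -> GB

print(round(weight_gb, 1))  # ~19.1 GB of weights, under a 24 GB card
```

Compare with FP16, where the same model needs roughly 68 GB for weights alone, which is why an A100 80GB or multiple GPUs would otherwise be required.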