lordpuddingcup

joined 1 year ago
[–] lordpuddingcup@alien.top 1 points 11 months ago

You pick the biggest one; it's almost always the best unless it was a truly shitty training run. If a really well-trained 30B also has a 120B version, the 120B will be better, unless by "can run them all" you mean you can run the 7B at full precision but only a Q1_K_M quant of the 120B lol
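For context on the caveat, here's a minimal back-of-envelope sketch (my own illustrative numbers, not from the comment, and ignoring KV cache and runtime overhead): weight memory scales roughly with parameter count times bits per weight, which is why an aggressively quantized 120B can land in a similar memory ballpark to a much smaller model at full precision.

```python
# Rough back-of-envelope: weight footprint ~= params * bits_per_weight / 8.
# Ignores KV cache, activations, and runtime overhead.
def weight_gb(params_billion: float, bits_per_weight: float) -> float:
    return params_billion * bits_per_weight / 8  # GB (decimal, approximate)

# Bit widths below are approximate assumptions for illustration only.
configs = [
    ("7B   @ fp16 (full precision)",   7,   16.0),
    ("30B  @ ~4.8 bpw (Q4_K_M-class)", 30,  4.8),
    ("120B @ ~2 bpw (Q1/Q2-class)",    120, 2.0),
]

for name, params_b, bpw in configs:
    print(f"{name}: ~{weight_gb(params_b, bpw):.0f} GB just for weights")
```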

[–] lordpuddingcup@alien.top 1 points 11 months ago

This is literally hilarious. So what you're saying is that everyone except the board is just gonna move over to MS, the remaining OpenAI hardware will get bought up for pennies on the dollar, and the ~100 that stayed will get kicked out lol

[–] lordpuddingcup@alien.top 1 points 1 year ago

It runs a few models, and if others decide to host models it runs with those too. Just try the chat web app or the dashboard to see what's currently running. The issue is that not enough people are donating compute.