[–] Oswald_Hydrabot@alien.top 1 points 11 months ago

A marketing piece by OpenAI, lying to people to hype their product.

So if I build a 10-trillion-parameter mixture-of-experts model out of fine-tuned variations of the same 300B model, I'm safe, right?
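
For the record, here's a minimal PyTorch sketch of what I mean (toy sizes instead of 300B, and every name here is illustrative): top-1 routing between fine-tuned copies of one base model, where total parameter count scales with the number of experts but each token only ever activates one copy's weights.

```python
# Minimal sketch: a mixture-of-experts assembled from N copies of one base
# model. Total params scale with N; active params per token stay at one copy.
import torch
import torch.nn as nn

class TinyBase(nn.Module):
    """Stand-in for the base model; sized down so this actually runs."""
    def __init__(self, d_model=64):
        super().__init__()
        self.ff = nn.Sequential(nn.Linear(d_model, 4 * d_model),
                                nn.GELU(),
                                nn.Linear(4 * d_model, d_model))

    def forward(self, x):
        return self.ff(x)

class FineTunedMoE(nn.Module):
    """Top-1 routing over num_experts variants of the same base architecture.
    In the scenario above, each expert would be a separately fine-tuned
    checkpoint of one base model; here they are fresh copies."""
    def __init__(self, d_model=64, num_experts=8):
        super().__init__()
        self.experts = nn.ModuleList(TinyBase(d_model) for _ in range(num_experts))
        self.router = nn.Linear(d_model, num_experts)

    def forward(self, x):  # x: (batch, d_model)
        weights = torch.softmax(self.router(x), dim=-1)  # (batch, num_experts)
        top1 = weights.argmax(dim=-1)                    # pick one expert per row
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            mask = top1 == i
            if mask.any():
                out[mask] = expert(x[mask])
        return out

model = FineTunedMoE()
total = sum(p.numel() for p in model.parameters())
active = sum(p.numel() for p in model.experts[0].parameters())
print(f"total params: {total:,}  active per token (one expert): {active:,}")
```

Scale the same arithmetic up and you get a headline parameter count in the trillions while only ever running one 300B-class expert at a time.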

Or how about I train a 4-trillion-parameter model on a new architecture that can utilize distributed compute? If contributions to GPU pools for training are encrypted and decentralized, then good luck enforcing anything.
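
A toy sketch of the core idea, assuming plain synchronous gradient averaging between peers; a real pooled-GPU system (and any encryption layer on top) is far more involved, and everything below is made up for illustration:

```python
# Toy sketch of decentralized data-parallel training: each "peer" computes
# gradients on its own data shard, then all peers average gradients before
# stepping, so every copy of the model stays in sync. No real networking here.
import torch
import torch.nn as nn

torch.manual_seed(0)

num_peers = 4
peers = [nn.Linear(10, 1) for _ in range(num_peers)]
# Start all peers from identical weights, as loading a shared checkpoint would.
for p in peers[1:]:
    p.load_state_dict(peers[0].state_dict())

opts = [torch.optim.SGD(p.parameters(), lr=0.1) for p in peers]
# Each peer holds its own private shard of training data.
shards = [(torch.randn(32, 10), torch.randn(32, 1)) for _ in range(num_peers)]

for step in range(5):
    # 1. Each peer computes local gradients on its own shard.
    for model, (x, y) in zip(peers, shards):
        model.zero_grad()
        nn.functional.mse_loss(model(x), y).backward()
    # 2. All-reduce: average each parameter's gradient across all peers.
    for params in zip(*(m.parameters() for m in peers)):
        avg_grad = torch.stack([p.grad for p in params]).mean(dim=0)
        for p in params:
            p.grad = avg_grad.clone()
    # 3. Every peer applies the same averaged update and stays identical.
    for opt in opts:
        opt.step()
```

Swap the in-process loop for peers talking over the internet and you have the rough shape of the pooled training the comment is pointing at.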

Fuck OpenAI. We will take this fight to the streets and win.