[–] IntolerantModerate@alien.top 1 points 2 years ago

The GPU limitations/requirements to go to the next level may also put a practical ceiling on things. Could you even run a model that was 10x larger than GPT-4 without breaking the bank?
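
Just to make "breaking the bank" concrete, here's a rough back-of-envelope sketch in Python. The parameter counts, per-GPU memory, and overhead factor are all assumptions for illustration (GPT-4's real size isn't public; ~1,800B is just a commonly repeated rumor):

```python
import math

def gpus_needed(params_billion: float, bytes_per_param: float = 2.0,
                gpu_mem_gb: float = 80.0, overhead: float = 1.2) -> int:
    """Rough count of GPUs needed just to hold the weights (fp16/bf16 by default).
    'overhead' loosely covers KV cache, activations, and fragmentation."""
    weight_gb = params_billion * bytes_per_param  # 1e9 params * N bytes = N GB per billion
    return math.ceil(weight_gb * overhead / gpu_mem_gb)

# Hypothetical sizes, purely for illustration.
for label, size_b in [("rumored GPT-4 scale", 1_800), ("10x that", 18_000)]:
    n = gpus_needed(size_b)
    print(f"{label}: ~{size_b}B params -> roughly {n} x 80 GB GPUs just to serve it")
```

Even under those generous assumptions (half-precision weights, no training, no redundancy for throughput), the 10x case lands in the hundreds of high-end GPUs just for inference, before you get to the cluster needed to train it.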