[–] IntolerantModerate@alien.top 1 points 11 months ago

The GPU requirements to get to the next level may also put a practical ceiling on things. Could you even run a model 10x larger than GPT-4 without breaking the bank?
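
A rough back-of-the-envelope sketch of what "breaking the bank" could look like, assuming GPT-4 is around the rumored ~1.8T parameters (unconfirmed) and ~80 GB of usable memory per H100-class GPU; all figures below are illustrative assumptions, not confirmed specs:

```python
# Illustrative estimate of the GPU memory needed just to hold the weights
# of a hypothetical model 10x the rumored GPT-4 size, for inference.
# Every constant here is an assumption for the sake of the sketch.

GPT4_PARAMS = 1.8e12      # rumored GPT-4 parameter count (unconfirmed)
SCALE = 10                # hypothetical 10x larger model
BYTES_PER_PARAM = 2       # fp16/bf16 weights; ignores KV cache and overhead
GPU_MEMORY_GB = 80        # e.g. one 80 GB H100-class accelerator
GPU_PRICE_USD = 30_000    # very rough per-GPU price assumption

weights_gb = GPT4_PARAMS * SCALE * BYTES_PER_PARAM / 1e9
gpus_needed = weights_gb / GPU_MEMORY_GB

print(f"Weights alone: ~{weights_gb / 1e3:.0f} TB")
print(f"GPUs just to hold the weights: ~{gpus_needed:.0f}")
print(f"Hardware cost for those GPUs: ~${gpus_needed * GPU_PRICE_USD / 1e6:.1f}M")
```

Under these assumptions the weights alone come to tens of terabytes, hundreds of GPUs, and tens of millions of dollars before you even count activations, serving overhead, or training.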