[–] PacmanIncarnate@alien.top 1 points 11 months ago (1 children)

A 24 GB GPU is still limited to fitting a 13B fully in VRAM. His PC is a great one: not the highest end, but perfectly fine for running anything up to a 70B in llama.cpp.
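The VRAM arithmetic behind that can be sketched roughly. A minimal back-of-envelope estimate, assuming the usual rule of thumb of params × bytes-per-weight plus ~20% overhead for KV cache and activations (the overhead factor and bits-per-weight figures here are ballpark assumptions, not exact numbers for any specific model):

```python
# Rough VRAM estimate: billions of params x bytes per weight,
# plus ~20% overhead for KV cache/activations (an assumed figure).
def vram_gb(params_b: float, bytes_per_weight: float, overhead: float = 1.2) -> float:
    return params_b * bytes_per_weight * overhead

GPU_VRAM = 24  # GB, e.g. a 3090/4090-class card

for name, params, bpw in [
    ("13B fp16", 13, 2.0),
    ("13B 8-bit", 13, 1.0),
    ("70B 4-bit", 70, 0.56),  # ~4.5 bits/weight, typical of a Q4 quant
]:
    need = vram_gb(params, bpw)
    verdict = "fits" if need <= GPU_VRAM else "needs CPU offload"
    print(f"{name}: ~{need:.1f} GB -> {verdict}")
```

By this estimate a 13B at fp16 overshoots 24 GB, an 8-bit 13B fits comfortably, and a 4-bit 70B still needs llama.cpp to offload some layers to system RAM, which is why that setup handles it.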

[–] PacmanIncarnate@alien.top 1 points 11 months ago

And fast. Not sure they’ll find something better.