this post was submitted on 17 Nov 2023
LocalLLaMA
Community to discuss Llama, the family of large language models created by Meta AI.
I just use Linux. Stop the X server, or don't install it in the first place. (If it's running, press Ctrl+Alt+F1, log in, and run sudo systemctl stop lightdm.) Enjoy nearly the entirety of your VRAM, minus about 1 MB, available to you; from there, run oobabooga or an API server or whatever, and connect to it from your laptop.
As an added benefit, I can leave the GPU, all its noisy fans, and their heat somewhere else in my home. You don't need to connect a display to the Linux box either; just set up openssh-server and SSH keys, and work remotely.
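The workflow above can be sketched as a short command sequence. This is a hedged example, not an exact recipe: the display manager (lightdm here; it might be gdm or sddm on your distro), the webui launcher script name, and the hostname `gpu-box` are assumptions you'd adjust for your own setup.

```shell
# Free up VRAM on the Linux box by stopping the display manager.
# Switch to a TTY first (Ctrl+Alt+F1), log in, then:
sudo systemctl stop lightdm        # or: gdm / sddm, depending on distro

# Confirm the GPU is now (almost) idle before loading a model:
nvidia-smi --query-gpu=memory.used --format=csv

# Start text-generation-webui (oobabooga) bound to all interfaces
# so other machines on the LAN can reach the UI/API:
./start_linux.sh --listen --api    # launcher name assumed; check your install

# From the laptop, no display needed on the box itself; manage it over SSH
# (assumes openssh-server is installed and your key is in authorized_keys):
ssh user@gpu-box
```

Binding with --listen exposes the server on your LAN, so it's worth keeping the box behind your router's firewall rather than exposing it to the internet.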
Clever. I often run SD and textUI in --listen mode, then use them from my iPad.