this post was submitted on 20 Nov 2023

LocalLLaMA


Community to discuss about Llama, the family of large language models created by Meta AI.

founded 10 months ago

I'm still new to this, and I thought 128 GB of CPU RAM would be enough to run a 70B model? I also have an RTX 4090. However, every time I try to run lzlv_Q4_K_M.gguf in Text Generation UI, I get "connection errored out". Is there a setting I should tinker with?
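(Not from the thread, but as a rough sanity check on the hardware question: with a 4090 you'd typically offload only part of a 70B GGUF to VRAM and keep the rest in system RAM. The numbers below are assumptions, not measurements: a 70B Q4_K_M file is roughly 41 GiB, Llama-2-70B has 80 layers, and the headroom reserved for KV cache and CUDA overhead is a guess.)

```python
# Rough sketch: estimate how many layers of a 70B Q4_K_M GGUF
# fit in a 24 GB RTX 4090 for partial GPU offload.
MODEL_SIZE_GIB = 41.0   # approx. size of a 70B Q4_K_M .gguf file (assumption)
N_LAYERS = 80           # Llama-2-70B transformer layer count
VRAM_GIB = 24.0         # RTX 4090 VRAM
RESERVE_GIB = 4.0       # headroom for KV cache / CUDA overhead (assumption)

gib_per_layer = MODEL_SIZE_GIB / N_LAYERS
n_gpu_layers = int((VRAM_GIB - RESERVE_GIB) // gib_per_layer)
print(n_gpu_layers)  # a starting value for the n-gpu-layers setting
```

Something in that ballpark (rather than offloading all layers) is usually what a 4090 + 128 GB RAM setup uses; the remaining layers run on the CPU from system RAM.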

Herr_Drosselmeyer@alien.top 1 points 10 months ago

It should work with those specs. I'm not sure what connection the error is referringring to, though. Could you post a screenshot of the console?