this post was submitted on 01 Dec 2023

LocalLLaMA


Community to discuss about Llama, the family of large language models created by Meta AI.


I have 4 GB of RAM on my computer, so I assume that makes it impossible to run a fast model like that. But will it just run slowly, or will it not run at all?
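As a rough back-of-the-envelope check: the main memory cost is usually the weights themselves, roughly parameter count times bytes per parameter. A minimal sketch of that arithmetic (the parameter counts and quantization levels below are illustrative assumptions, not measurements of any specific model):

```python
def model_memory_gb(num_params: float, bits_per_param: float) -> float:
    """Approximate RAM needed just to hold the weights, in gigabytes."""
    return num_params * bits_per_param / 8 / 1e9

RAM_GB = 4  # the machine from the question

# Hypothetical model sizes for illustration
for name, params, bits in [
    ("7B fp16",  7e9, 16),
    ("7B 4-bit", 7e9, 4),
    ("3B 4-bit", 3e9, 4),
    ("1B 4-bit", 1e9, 4),
]:
    need = model_memory_gb(params, bits)
    verdict = "fits" if need < RAM_GB else "does not fit"
    print(f"{name}: ~{need:.1f} GB of weights -> {verdict} in {RAM_GB} GB RAM")
```

By this estimate a 7B model at fp16 needs far more than 4 GB, while small or heavily quantized models can squeeze in (leaving little room for context and the OS). When a model doesn't fit, loaders typically either refuse to load or fall back to disk swapping, which runs but extremely slowly.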
