this post was submitted on 28 Nov 2023

LocalLLaMA


Community to discuss about Llama, the family of large language models created by Meta AI.


Yes. This has to be the worst RAM you guys have ever seen, but hear me out. Is it possible? I want to run the full 70B model, but that's far out of the question and I'm not even going to bother. Can I at least run the 13B, or failing that the 7B?

[–] Aaaaaaaaaeeeee@alien.top 1 points 9 months ago (1 children)

Cramming Mistral at 2.7 bpw I get 2k context. Are you talking about VRAM, though?

[–] TheHumanFixer@alien.top 1 points 9 months ago

Nope, regular RAM.
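As a rough sanity check for threads like this (not something posted in the thread itself): the weight-only memory footprint of a quantized model is roughly parameters × bits-per-weight ÷ 8 bytes, ignoring KV cache and runtime overhead. A minimal sketch, where the helper name `model_bytes` is my own invention:

```python
def model_bytes(n_params: float, bits_per_weight: float) -> float:
    """Rough weight-only footprint in bytes: params * bpw / 8.

    Assumption: ignores KV cache, activations, and loader overhead,
    so real usage will be somewhat higher.
    """
    return n_params * bits_per_weight / 8


if __name__ == "__main__":
    # Compare a heavy quant (2.7 bpw, as in the comment above)
    # with a more typical ~4.5 bpw quant.
    for name, n in [("7B", 7e9), ("13B", 13e9), ("70B", 70e9)]:
        for bpw in (2.7, 4.5):
            gib = model_bytes(n, bpw) / 2**30
            print(f"{name} at {bpw} bpw: ~{gib:.1f} GiB of RAM for weights")
```

By this estimate a 7B model at 2.7 bpw needs only a couple of GiB for weights, which is why very low-bpw quants are the usual answer when RAM is this tight.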