this post was submitted on 25 Nov 2023
LocalLLaMA
Community to discuss Llama, the family of large language models created by Meta AI.
You want a large amount of VRAM on your video card. Whatever is the largest-VRAM card you can afford is the answer.
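For a rough sense of why, here's a back-of-the-envelope sketch of how much VRAM the weights alone take at different precisions (illustrative figures only; it ignores KV cache, activations, and framework overhead, so real usage is higher):

```python
# Back-of-the-envelope VRAM estimate for just holding an LLM's weights.
# Ignores KV cache, activations, and framework overhead, so real usage is higher.

def estimate_vram_gb(num_params_billion: float, bytes_per_param: float) -> float:
    """Billions of params times bytes per param gives roughly GB of weights."""
    return num_params_billion * bytes_per_param

for label, params_b in [("7B", 7), ("13B", 13), ("70B", 70)]:
    fp16 = estimate_vram_gb(params_b, 2.0)  # 16-bit weights
    q4 = estimate_vram_gb(params_b, 0.5)    # ~4-bit quantized weights
    print(f"{label}: ~{fp16:.0f} GB at fp16, ~{q4:.1f} GB at 4-bit")
```

So even a quantized 70B model wants more VRAM than any single consumer card has, which is why you buy as much as you can afford.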
Is there a list of all GPUs that can be sorted by VRAM?
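If you end up collecting the specs yourself, sorting by VRAM is trivial; a minimal Python sketch with a few example cards (the entries are illustrative, not an exhaustive or current list, so double-check the specs):

```python
# Minimal sketch: sort a hand-picked list of GPUs by VRAM.
# The entries are illustrative examples, not an exhaustive or current list.

gpus = [
    {"name": "RTX 3060 12GB", "vram_gb": 12},
    {"name": "RTX 4060 Ti 16GB", "vram_gb": 16},
    {"name": "RTX 3090", "vram_gb": 24},
    {"name": "RTX 4090", "vram_gb": 24},
]

# Sort descending so the largest-VRAM cards come first.
for gpu in sorted(gpus, key=lambda g: g["vram_gb"], reverse=True):
    print(f'{gpu["vram_gb"]:>3} GB  {gpu["name"]}')
```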