this post was submitted on 25 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


This is why you can't find any in your local Best Buy. They are paying a premium for them. But it would indeed be very helpful if I could get my hands on a few for my build.

[–] nexusjuan@alien.top 1 points 10 months ago (1 children)

I'm running a Tesla M40 12GB and I'm really close to pulling the trigger on a 24GB one. I also have one of the Tesla P4s in my server. With the M40 I can fully offload 13B models to VRAM.
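For reference, full offload like this is what llama.cpp's GPU layer setting does. Here's a minimal sketch using the llama-cpp-python bindings, assuming a build with CUDA support; the model path is hypothetical:

```python
# Minimal sketch: offload every layer of a quantized 13B model to VRAM
# via llama-cpp-python (assumes the package was built with CUDA support).
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-2-13b.Q4_K_M.gguf",  # hypothetical 13B GGUF file
    n_gpu_layers=-1,  # -1 offloads all layers to the GPU
    n_ctx=2048,       # context window size
)

output = llm("Q: How much VRAM does a Tesla M40 have? A:", max_tokens=32)
print(output["choices"][0]["text"])
```

With a 4-bit quantized 13B model, the weights fit comfortably in 12GB, which is consistent with the full offload described above.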

[–] thebliket@alien.top 1 points 10 months ago

How does an M40 compare with an A4000?