this post was submitted on 23 Nov 2023
LocalLLaMA
Community to discuss about Llama, the family of large language models created by Meta AI.
Not sure of the performance of that card, but you can get pre-owned 24GB Nvidia Tesla cards for less.
Used is not the same as new. Also, a P40 can't be used as a regular graphics card; it has no video out. These can.
You can still get display output with a P40 by routing it through an integrated GPU's video out, but that's not a good solution.
The A770 seems like a much better option given how much more performance it offers.
Those extra 8GB on the P40 are nice, though.