this post was submitted on 28 Nov 2023

LocalLLaMA

Community to discuss about Llama, the family of large language models created by Meta AI.

CasimirsBlake@alien.top:

Very useful!

32GB AMD Instinct cards for $500 would be a very compelling option... if the software stack weren't still such a ballache to get working, and if used cards were more commonly available.

Quadro M6000 24GB cards also seem semi-common and reasonably priced if one hunts carefully... but how well do they perform?