This post was submitted on 28 Nov 2023

LocalLLaMA

Community to discuss Llama, the family of large language models created by Meta AI.

CasimirsBlake@alien.top 1 point 11 months ago

Very useful!

32GB AMD Instinct cards for $500 would be a very compelling option... if the software stack weren't still such a ballache to get working, and if they were more common on the used market.

Quadro M6000 24GB cards also seem semi-common and reasonably priced if one hunts carefully... But how well do they perform?

nero10578@alien.top 1 point 11 months ago

Not sure where they got 694 GB/s for the Tesla P40; it only has 347 GB/s of memory bandwidth.
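
A quick back-of-envelope check supports the 347 figure. Here is a minimal sketch in Python, assuming the commonly listed P40 specs (384-bit memory bus, GDDR5 at roughly 7.23 GT/s effective) rather than anything stated in the post:

```python
# Memory bandwidth (GB/s) = bus width (bits) / 8 * effective data rate (GT/s).
bus_width_bits = 384   # Tesla P40 bus width (commonly listed spec, assumed here)
data_rate_gtps = 7.23  # effective GDDR5 transfer rate (commonly listed spec)

bandwidth_gbs = bus_width_bits / 8 * data_rate_gtps
print(f"P40 memory bandwidth ~= {bandwidth_gbs:.0f} GB/s")  # ~= 347 GB/s

# 694 GB/s is exactly double this, which looks like the effective
# (already double-pumped) GDDR5 rate was doubled a second time.
```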

mcmoose1900@alien.top 1 point 11 months ago

I got excited when Gaudi was listed... But it was on sale precisely nowhere, lol.