this post was submitted on 09 Nov 2023
1 points (100.0% liked)

LocalLLaMA


Community to discuss about Llama, the family of large language models created by Meta AI.

founded 1 year ago

Are these worth bothering with? Or money better spent on 2x3090 or A6000?

top 4 comments
[–] a_beautiful_rhind@alien.top 1 points 1 year ago

3090s are faster. P40s are mostly stuck with GGUF. P100s are decent for FP16 ops but you will need twice as many.

All depends on what you want to do. 8 cards are going to use a lot of electricity and make a lot of noise.
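To make the "twice as many cards" and electricity points concrete, here is a minimal back-of-the-envelope sketch. The VRAM and TDP figures are the cards' public spec-sheet values; the model memory footprint is an illustrative assumption (roughly a 70B model at ~4-bit quantization), not a benchmark:

```python
import math

# Spec-sheet VRAM and TDP for the cards discussed in this thread.
cards = {
    "P40":   {"vram_gb": 24, "tdp_w": 250},
    "P100":  {"vram_gb": 16, "tdp_w": 250},
    "3090":  {"vram_gb": 24, "tdp_w": 350},
    "A6000": {"vram_gb": 48, "tdp_w": 300},
}

# Assumed model footprint: ~70B parameters at ~4-bit quantization.
model_vram_gb = 40

for name, c in cards.items():
    n = math.ceil(model_vram_gb / c["vram_gb"])  # cards needed to hold the weights
    print(f"{name}: {n} card(s), ~{n * c['tdp_w']} W peak")
```

Under these assumptions the 16 GB P100 needs three cards where a 24 GB P40 or 3090 needs two, and a single 48 GB A6000 suffices, which is where the power and noise tradeoff comes from.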

[–] IntrovertedFL@alien.top 1 points 1 year ago
[–] AutomaticDriver5882@alien.top 1 points 1 year ago

Don’t get those; the token rate will be crazy slow.

[–] NoWarrenty@alien.top 1 points 1 year ago