this post was submitted on 23 Nov 2023
LocalLLaMA


Community for discussing Llama, the family of large language models created by Meta AI.

Amazon has the Acer A770 on sale for $250. That's a lot of compute with 16GB of VRAM for $250. There is no better value. It does have its challenges. Some things, like MLC Chat, run with no fuss just like on any other card. Other things, like Oob, FastChat and BigDL, need some effort. But support for it is getting better every day. At this price, I'm tempted to get another. I have seen some reports of running multi-GPU setups with the A770.

It also comes with Assassin's Creed Mirage for those people who still use their GPUs to game.

https://www.amazon.com/dp/B0BHKNK84Y

[–] No_Baseball_7130@alien.top 1 points 10 months ago (1 children)

P100s are also an okay-ish choice for super-budget builds (SXM is only $50 but PCIe is ~$150), but they don't output video. The P100 has higher memory bandwidth since it uses HBM instead of GDDR, and it's a lot faster than the P40 at 19.05 TFLOPS for FP16.

[–] fallingdowndizzyvr@alien.top 1 points 10 months ago (1 children)

The MI25 is even faster at 24 TFLOPS for FP16. It's only $70-$90, so you can get two for the price of one P100. And you can activate the mini-DP port on it for video output with a BIOS flash, so you can use it to game with.

[–] No_Baseball_7130@alien.top 1 points 10 months ago

The MI25 is also an option, but most programs are far better optimized for CUDA devices.
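Taking the prices and FP16 figures quoted in this thread at face value, a quick back-of-envelope sketch of throughput per dollar (the $80 MI25 price is an assumed midpoint of the $70-$90 range mentioned above):

```python
# Value comparison using the numbers quoted in this thread:
# P100 (PCIe) ~$150 at 19.05 FP16 TFLOPS; MI25 ~$80 at 24 FP16 TFLOPS.
cards = {
    "P100 (PCIe)": {"price_usd": 150, "fp16_tflops": 19.05},
    "MI25": {"price_usd": 80, "fp16_tflops": 24.0},  # midpoint of $70-$90
}

for name, card in cards.items():
    tflops_per_dollar = card["fp16_tflops"] / card["price_usd"]
    print(f"{name}: {tflops_per_dollar:.3f} FP16 TFLOPS per dollar")
```

On raw paper specs the MI25 comes out well ahead per dollar, which is the trade-off being weighed here against the weaker software support outside CUDA.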