this post was submitted on 23 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


Amazon has the Acer A770 on sale for $250. That's a lot of compute with 16GB of VRAM for $250. There is no better value. It does have its challenges. Some things, like MLC Chat, run with no fuss just like on any other card. Other things need some effort, like Oobabooga, FastChat, and BigDL. But support for it is getting better every day. At this price, I'm tempted to get another. I have seen some reports of running multi-GPU setups with the A770.
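For the no-fuss case, here's a minimal sketch of what MLC Chat's Python API looked like in late 2023 on an Arc card. The model name is just a placeholder for whatever prebuilt MLC model you have locally, and the exact arguments may differ by version:

```python
# Minimal sketch (assumed late-2023 mlc_chat API): run a prebuilt MLC model on Arc via Vulkan.
from mlc_chat import ChatModule

cm = ChatModule(
    model="Llama-2-7b-chat-hf-q4f16_1",  # placeholder model id; use whatever you have built/downloaded
    device="vulkan",                     # Arc is reached through the Vulkan backend here
)
print(cm.generate(prompt="Write a haiku about cheap VRAM."))
```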

It also comes with Assassin's Creed Mirage for those who still use their GPUs to game.

https://www.amazon.com/dp/B0BHKNK84Y

[–] fallingdowndizzyvr@alien.top 1 points 11 months ago

They did. That's why software that uses PyTorch, like FastChat and SD, works very well with Intel Arc. But llama.cpp doesn't use PyTorch.
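To make that concrete: once Intel Extension for PyTorch is installed, PyTorch sees Arc as an "xpu" device. A rough sketch of the general idea, not the exact setup FastChat or SD use:

```python
# Sketch: checking that PyTorch can see an Arc GPU through Intel Extension for PyTorch.
import torch
import intel_extension_for_pytorch as ipex  # registers the "xpu" device with PyTorch

print(torch.xpu.is_available())      # True when the Arc driver/runtime is set up
print(torch.xpu.get_device_name(0))  # should report the A770

x = torch.randn(1024, 1024, device="xpu")
y = x @ x  # matmul runs on the Arc GPU
```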

Here's the base of their software stack: an API they are pushing as a standard, since it supports Nvidia and AMD as well.

https://www.oneapi.io/
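If you want to poke at what oneAPI actually exposes from Python, Intel's dpctl package can enumerate the SYCL devices it sees. A hedged sketch; the attribute names are from memory, so check the oneAPI docs:

```python
# Sketch: list the SYCL devices (Level Zero / OpenCL) that oneAPI can see.
import dpctl

for dev in dpctl.get_devices():
    print(dev.backend, dev.device_type, dev.name)
```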

Intel also has its own package of LLM software.

https://github.com/intel-analytics/BigDL
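For reference, this is roughly BigDL-LLM's documented pattern for loading a Hugging Face model in 4-bit and running it on Arc ("xpu"). The model path is a placeholder and the details vary by version, so treat it as a sketch:

```python
# Sketch: BigDL-LLM drop-in AutoModel, loaded in 4-bit and moved to an Intel Arc GPU.
from bigdl.llm.transformers import AutoModelForCausalLM
from transformers import AutoTokenizer

model_path = "meta-llama/Llama-2-7b-chat-hf"  # placeholder; any HF causal LM

model = AutoModelForCausalLM.from_pretrained(model_path, load_in_4bit=True)
model = model.to("xpu")

tokenizer = AutoTokenizer.from_pretrained(model_path)
inputs = tokenizer("Why is 16GB of VRAM nice to have?", return_tensors="pt").to("xpu")
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```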