this post was submitted on 13 Nov 2023
1 points (100.0% liked)

LocalLLaMA

3 readers
1 user here now

Community to discuss Llama, the family of large language models created by Meta AI.

founded 1 year ago

I'm currently using 1650 4GB, AMD 5600, 32GB RAM.

I got some spare cash to throw to learn more about local llm.

Should I get:

A. 64 GB RAM (2 × 32 GB)
B. RTX 3060 12 GB
C. Intel A770 16 GB

I'm using OpenHermes 2.5 Mistral 7B Q5_K_M GGUF. Performance is OK-ish for SillyTavern with koboldcpp, but when context goes above 3k, it crawls.
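For a rough sense of why long context hurts on a 4 GB card: the KV cache grows linearly with context length and competes with the model weights for VRAM. A minimal sketch, assuming Mistral 7B's published architecture (32 layers, 8 KV heads via GQA, head dimension 128) and an fp16 cache; actual usage varies by backend and cache quantization:

```python
def kv_cache_bytes(n_tokens, n_layers=32, n_kv_heads=8, head_dim=128, bytes_per_el=2):
    """Approximate KV-cache size: one key and one value vector
    per layer, per KV head, per token."""
    per_token = 2 * n_layers * n_kv_heads * head_dim * bytes_per_el  # K and V
    return n_tokens * per_token

# At 3k tokens the cache alone is a few hundred MiB on top of the
# ~5 GB of Q5_K_M weights, which overflows a 4 GB card and forces
# slow CPU offload.
print(kv_cache_bytes(3000) / 2**20)  # ~375 MiB
```

On a 12 GB or 16 GB card the same cache plus weights fits entirely in VRAM, which is why either upgrade option removes the slowdown.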

Please advise which option you think I should take first. Thanks a bunch.

[–] mcmoose1900@alien.top 1 points 1 year ago (2 children)

So, definitely a new GPU. You can't go wrong with either card; both will easily hold a 7B or 13B at long context.

The 3060 will work better right now. It has support for much better backends at the moment and will be way faster.

I'm partial to the A770 because it's stronger on paper, and I believe it's going to get faster imminently with more support from various backends. It should be faster in the longer term. Also, I'm very salty about Nvidia's price gouging and anti-competitiveness.

So... I guess it depends on when your next upgrade will be. I myself am thinking I will replace my 3090 with Intel's next gen GPUs if they're any good (and 24GB+).

[–] yahdahduhe@alien.top 1 points 1 year ago

This is the dilemma for me. The A770 is Intel's first-gen GPU, and while the news lately has been very promising, Battlemage seems to be just around the corner, based on rumours.
