This post was submitted on 25 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


Hey guys, thinking of upgrading my PC. I'm a dev and I want to run my own LLMs, mainly so I can run my own copilot locally instead of relying on outside services. This is what I've got now:

Ryzen 7 3700X, 32 GB RAM, RX 5500 XT

Debating whether I should get a 3950X or a 5800X3D so I can game a bit better as well. As for the GPU, I might just go for the 4090, but if that's overkill please let me know. What do you guys think?
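
For context, the plan is roughly: host the model locally behind an OpenAI-compatible server (llama.cpp's server, LM Studio, etc.) and point the editor/client at it. A minimal sketch of the client side, assuming a local server on port 8080 and a placeholder model name (adjust both to whatever your server actually exposes):

```python
# Minimal sketch: point the OpenAI Python client at a locally hosted,
# OpenAI-compatible server (llama.cpp's server, LM Studio, etc.).
# The base_url, port, and model name are assumptions -- use whatever
# your local server actually exposes.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # local server, no outside service
    api_key="not-needed",                 # most local servers ignore the key
)

response = client.chat.completions.create(
    model="local-model",  # placeholder; use the model id your server reports
    messages=[
        {"role": "system", "content": "You are a coding assistant."},
        {"role": "user", "content": "Write a function that reverses a string."},
    ],
)
print(response.choices[0].message.content)
```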

top 8 comments
[–] FullOf_Bad_Ideas@alien.top 1 points 10 months ago

I upgraded from a GTX 1080 to an RTX 3090 Ti two weeks ago. I think going with an RTX 3090 / 3090 Ti / 4090 would be a good option for you. I don't know how big a difference a stronger CPU would make; exllama v2 seems to have some CPU bottlenecking going on, but I have no idea what is computed on the CPU or why. There were moments during generation where it seemed to be using only one thread and maxing it out, becoming a bottleneck for the GPU. I don't think RAM matters a lot unless you train and merge LoRAs and models.
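
If you want to sanity-check that single-thread bottleneck, one rough way is to watch per-core utilization while a generation is running. A quick sketch (assumes psutil is installed; the 10-second sampling window is arbitrary):

```python
# Sketch: sample per-core CPU utilization while generation is running,
# to see whether one core is pegged at ~100% while the rest sit idle.
# Assumes `pip install psutil`; run it alongside your inference process.
import psutil

for _ in range(10):  # ~10 one-second samples; adjust as needed
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    print(" ".join(f"{p:5.1f}" for p in per_core))
```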

[–] You_Wen_AzzHu@alien.top 1 points 10 months ago (1 children)
The only correct answer. CPU doesn't matter.
[–] SupplyChainNext@alien.top 1 points 10 months ago (1 children)

Meh, I'm running 13Bs on a 13900K, 64 GB DDR5, and a 6900 XT with LM Studio, and it's faster than my office workstation's 12900KS + 3090 Ti. Sometimes RAM and a processor with decent VRAM are enough.

[–] ntn8888@alien.top 1 points 10 months ago

Your comparison proves his point! A 13B will fit snugly in your 6900 XT, so this is a head-on comparison of the cards!
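
Rough numbers, in case it helps: at a ~4-bit quant a 13B is only around 6-7 GB of weights, which leaves plenty of the 6900 XT's 16 GB for KV cache and context. A back-of-envelope sketch (the overhead figure is a guess and depends on context length and runtime):

```python
# Back-of-envelope: does a 13B model fit in a 16 GB card (6900 XT)?
# All figures are approximations; actual usage depends on quant format,
# context length, KV cache size, and runtime overhead.
params_b = 13          # billions of parameters
bytes_per_param = 0.5  # ~4-bit quantization (e.g. Q4) ~= 0.5 bytes per param
overhead_gb = 2.0      # assumed headroom for KV cache + activations

weights_gb = params_b * bytes_per_param   # ~6.5 GB of weights
total_gb = weights_gb + overhead_gb       # ~8.5 GB total
print(f"~{total_gb:.1f} GB needed vs 16 GB available")
```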

[–] ntn8888@alien.top 1 points 10 months ago

Welcome to the rabbit hole 😁. On a serious note, going for the newer generations pays dividends, in my opinion.

[–] OneConfusion3313@alien.top 1 points 9 months ago

Curious, has anyone considered the RTX A5000 / A6000? They're twice the price of a 4090/3090, but the performance improvement should be more than double.

[–] ababana97653@alien.top 1 points 9 months ago (1 children)

You want a lot of VRAM on your video card. Whichever card with the most VRAM you can afford is the answer.

[–] Ok_Brain_2376@alien.top 1 points 9 months ago

Is there a list of all GPUs that can be sorted by VRAM?
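
Not aware of a maintained one, but it's easy enough to keep your own and sort it. A tiny sketch with a few hand-entered cards (VRAM figures from memory; double-check before buying):

```python
# Sketch: a hand-maintained list of cards sorted by VRAM (descending).
# The entries and figures below are examples from memory -- verify
# before making any purchase.
gpus = {
    "RTX A6000": 48,
    "RTX 4090": 24,
    "RTX 3090 Ti": 24,
    "RTX 3090": 24,
    "RTX A5000": 24,
    "RX 6900 XT": 16,
}

for name, vram_gb in sorted(gpus.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {vram_gb} GB")
```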