this post was submitted on 25 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


Hey guys, I'm thinking of upgrading my PC. I'm a dev and I want to run my own LLMs, mainly to run my own copilot locally instead of relying on outside services. This is what I've got now:

Ryzen 7 3700X, 32 GB RAM, 5500 XT

Debating whether I should get a 3950X or a 5800X3D so I can game a bit better as well. As for the GPU, I might just go for the 4090, but if that's overkill please let me know. What do you guys think?

Ok_Brain_2376@alien.top 1 points 11 months ago

Is there a list of all GPUs that can be sorted by VRAM?
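
If you end up keeping such a list yourself, the idea is just a small table of cards sorted by VRAM. A minimal Python sketch, using a few illustrative example entries rather than a real database:

```python
# Minimal sketch: sorting a hand-maintained GPU list by VRAM.
# The entries below are only illustrative examples, not a complete or
# authoritative database -- VRAM figures can vary by card variant.

gpus = [
    {"name": "RTX 4090", "vram_gb": 24},
    {"name": "RTX 3090", "vram_gb": 24},
    {"name": "RTX 3060", "vram_gb": 12},
    {"name": "RX 5500 XT", "vram_gb": 8},  # a 4 GB variant also exists
]

# Sort by VRAM, largest first, so the cards with the most headroom
# for local models show up at the top.
for gpu in sorted(gpus, key=lambda g: g["vram_gb"], reverse=True):
    print(f'{gpu["name"]}: {gpu["vram_gb"]} GB')
```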