this post was submitted on 28 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.

Hello, I'm a student delving into the study of large language models. I recently acquired a new PC equipped with a Core i7 14th Gen processor, RTX 4070 Ti graphics, and 32GB DDR5 RAM. Could you kindly suggest a recommended language model for optimal performance on my machine?

[–] IntelligentStrain409@alien.top 1 points 9 months ago (2 children)

Go buy two RTX 3090s. Your mid-tier GPU only has 12GB of VRAM. I see your passion, but do some research: you need more VRAM.
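
To make the VRAM point concrete, here is a rough back-of-the-envelope estimate of how much memory model weights alone need at different precisions. The numbers are approximations and ignore the KV cache and framework overhead, which add a few more GB on top.

```python
# Rough estimate of how much VRAM the model weights alone need.
# Real usage is higher: add KV cache, activations, and framework overhead.

def weight_vram_gb(n_params_billion: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GiB (1 GiB = 2**30 bytes)."""
    bytes_total = n_params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 2**30

for size_b in (7, 13, 34):
    for bits in (16, 8, 4):
        print(f"{size_b:>2}B model @ {bits:>2}-bit: ~{weight_vram_gb(size_b, bits):5.1f} GiB")

# On a 12 GB card, a 7B model at 8-bit or a 13B model at 4-bit fits with room
# for context; 7B/13B at 16-bit and 34B at any common precision do not.
```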

Thank you for your advice. Unfortunately, I'm unable to purchase two RTX 3090s at the moment. Firstly, my budget is fully utilized with the current components, and secondly, I doubt my system can accommodate two RTX 3090s. Considering these constraints, could you kindly provide a recommendation based on my existing setup?
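
For what it's worth, one common route on a 12 GB card is a 7B or 13B model in a 4-bit GGUF quantization, offloaded to the GPU. Below is a minimal sketch using llama-cpp-python; the model path is a placeholder for whichever GGUF file you download, and `n_gpu_layers=-1` assumes the whole model fits in VRAM.

```python
# Minimal sketch: run a 4-bit GGUF model with llama-cpp-python
# (pip install llama-cpp-python). The model path below is a placeholder --
# point it at any 7B/13B GGUF file you have downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/mistral-7b-instruct.Q4_K_M.gguf",  # hypothetical local file
    n_gpu_layers=-1,  # offload all layers to the GPU; lower this if you run out of VRAM
    n_ctx=4096,       # context window; larger contexts need more VRAM for the KV cache
)

out = llm("Q: What is a language model?\nA:", max_tokens=128, stop=["Q:"])
print(out["choices"][0]["text"])
```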
