this post was submitted on 30 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.

founded 1 year ago

Is the 3060 enough to be used in a small office for document questions and answers? What is the best model to use? The CPU and RAM will be a 4790K with 32 GB.

[–] alyxms@alien.top 1 points 11 months ago

My office PC has an RTX 2060 12G and it runs 13b models at 4-bit no problem.

That's pretty much its limit though. 13b at 4-bit + 4096 context would max out the VRAM, but it is stable.
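A rough back-of-the-envelope sketch of why a 12 GB card sits right at that limit, assuming Llama-2 13B architecture numbers (40 layers, hidden size 5120) and an fp16 KV cache — these figures are assumptions for illustration, not measurements:

```python
# Rough VRAM estimate for a 13B model at 4-bit with a 4096-token context.
# Layer count and hidden size below are assumed Llama-2 13B values.

def estimate_vram_gib(n_params=13e9, bits=4, n_layers=40,
                      hidden=5120, ctx=4096, kv_bytes=2):
    weights = n_params * bits / 8                      # quantized weight bytes
    kv_cache = 2 * kv_bytes * ctx * n_layers * hidden  # K and V, fp16 per token
    return (weights + kv_cache) / 2**30

print(f"{estimate_vram_gib():.1f} GiB")  # weights + KV cache only
```

This comes out around 9 GiB before runtime overhead and activation buffers, which is why 13b at 4-bit with a full 4096 context fills most of a 12 GB card.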

[–] gpt872323@alien.top 1 points 11 months ago