this post was submitted on 21 Nov 2023
LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.

founded 10 months ago

Have any of you tried low- to mid-tier LLMs that run smoothly on the 4060 Ti? Any thoughts?

[–] Pashax22@alien.top 1 points 10 months ago

Yes, of course. I'm running them on a 4070 Ti with only 12GB of VRAM - sometimes I have to accept slower speeds, but I can still run more or less anything I might reasonably want to (and a few things that are distinctly UNreasonable).
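A rough way to predict whether a given model will run comfortably on a 12GB or 16GB card is back-of-envelope arithmetic: quantized weights take roughly (parameter count × bits per weight ÷ 8) bytes, plus some headroom for the KV cache and buffers. The sketch below is a hedged estimate, not a measurement; the 20% overhead factor and the example model sizes are assumptions.

```python
# Rough VRAM-fit check for quantized LLMs.
# Assumption: quantized weight size ~= params * bits_per_weight / 8 bytes,
# and ~20% extra for KV cache and runtime buffers. Real usage varies with
# context length, quantization format, and backend.

def quantized_size_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate loaded size of a quantized model's weights in GB."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

def fits_in_vram(params_billions: float, bits_per_weight: float,
                 vram_gb: float, overhead: float = 1.2) -> bool:
    """True if weights plus estimated overhead fit fully on the GPU."""
    return quantized_size_gb(params_billions, bits_per_weight) * overhead <= vram_gb

# A 13B model at 4-bit (~6.5 GB weights) fits on a 12 GB card;
# a 34B model at 4-bit (~17 GB weights) would need partial CPU offload,
# which is where the "accept slower speeds" trade-off comes in.
print(fits_in_vram(13, 4, 12))  # True
print(fits_in_vram(34, 4, 12))  # False
```

Backends like llama.cpp let you split the difference by offloading only some layers to the GPU, trading tokens-per-second for the ability to run larger models than VRAM alone would allow.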