I'll have a look into it and compare it to LM Studio.
SideShow_Bot
So, in the end, which one would you recommend for someone just beginning to run LLMs locally? I'm on a Windows machine (so Sanctum is out of the question for now). I'm interested in three use cases, so maybe there's a different answer for each of them:
- Python coding questions
- Linux shell questions
- RAG: in particular, I'd like to ask questions and have the model retrieve an answer online, backed by one or more working hyperlinks
IIUC, for coding you suggest deepseek-coder-6.7b-instruct.Q4_K_M.gguf, right? Can I run it with 16 GB of RAM? I'm on an i5 Windows machine, using LM Studio.