this post was submitted on 22 Nov 2023
LocalLLaMA
Community to discuss about Llama, the family of large language models created by Meta AI.
Two ideas:
- use deepseek-coder-1.3b-instruct, not the base model
- check that you're using the correct prompt template for the model
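For reference, the instruct variants of DeepSeek Coder expect an Alpaca-style `### Instruction:` / `### Response:` wrapper around the user's request. A minimal sketch of that formatting — the exact wording of the system line here is an assumption, so check the model card or `tokenizer_config.json` for the authoritative template:

```python
# Sketch of the deepseek-coder-instruct prompt format.
# The system line below is a placeholder, not the official one.

def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in an Alpaca-style instruct template."""
    system = (
        "You are an AI programming assistant. "
        "Answer the user's programming questions."
    )
    return (
        f"{system}\n"
        f"### Instruction:\n"
        f"{instruction}\n"
        f"### Response:\n"
    )

print(build_prompt("Write a Python function that reverses a string."))
```

If the UI has a "prompt template" or "chat template" setting, pasting the wrong wrapper (or none) is a common cause of gibberish replies from instruct models.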
It is the instruct model. You can see underneath the prompt box that it's the deepseek-coder-1.3b-instruct_Q5_K_S model. I used the prompt template included with the model, and it slightly improved the answers.
But if I ask it to write some code, it almost never does, and instead says something gibberish.
Does GPU/CPU quality affect the AI's output? My device is a potato.