I am using kobold.cpp and it can't code anything beyond "hello world." Am I doing something wrong?

https://preview.redd.it/xdo6q7a25z1c1.png?width=1454&format=png&auto=webp&s=30d0eaed2c6d4d95070f2312a4bc3add0dcc2840

[–] vasileer@alien.top 1 points 10 months ago (1 children)

Two ideas:

- use deepseek-coder-1.3b-instruct, not the base model

- check that you use the correct prompting template for the model (see the sketch below)
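For reference, deepseek-coder-instruct expects an Alpaca-style template with `### Instruction:` / `### Response:` markers plus a system preamble. Below is a minimal Python sketch of how to wrap a request in that format; the preamble text is paraphrased and `build_prompt` is just an illustrative helper, so verify the exact wording and markers against the model card before relying on it.

```python
# Minimal sketch of an Alpaca-style instruct prompt, which deepseek-coder-instruct
# roughly follows. The system preamble below is a paraphrase, not the official text.

SYSTEM = (
    "You are an AI programming assistant. "
    "Only answer questions related to computer science."
)

def build_prompt(user_request: str) -> str:
    """Wrap a user request in ### Instruction / ### Response markers."""
    return (
        f"{SYSTEM}\n"
        f"### Instruction:\n{user_request}\n"
        f"### Response:\n"
    )

if __name__ == "__main__":
    # Paste the resulting string into kobold.cpp's prompt box (or set its
    # instruct-mode start/end tags to these markers) instead of sending the raw request.
    print(build_prompt("Write a Python function that reverses a string."))
```

With a 1.3B model especially, getting these markers exactly right tends to make the difference between code and gibberish.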

[–] East-Awareness-249@alien.top 1 points 10 months ago

It is the instruct model. You can see underneath the prompt box that it's the deepseek-coder-1.3b-instruct_Q5_K_s model. I used the prompting template that comes with the model, and it slightly improved the answers.

But if I ask it to write some code, it almost never does and just responds with gibberish.

Does GPU/CPU quality affect the AI's output? My device is a potato.