this post was submitted on 13 Nov 2023
LocalLLaMA
Community for discussing Llama, the family of large language models created by Meta AI.
Easiest way on Windows: koboldcpp with TheBloke/dolphin-2_2-yi-34b-GGUF. Download both, then drag and drop the GGUF file onto koboldcpp.exe.
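If you'd rather script it than drag and drop, you can pass the model on the command line instead. A minimal sketch, assuming a recent koboldcpp build; the exact quantization filename is hypothetical (use whichever .gguf you downloaded), and you should confirm the flags with `koboldcpp.exe --help` on your version:

```shell
# Launch koboldcpp with a GGUF model from the command line
# (same effect as dragging the file onto the exe).
# NOTE: the filename below is a placeholder; substitute the actual
# .gguf you grabbed from the TheBloke/dolphin-2_2-yi-34b-GGUF repo.
koboldcpp.exe --model dolphin-2_2-yi-34b.Q4_K_M.gguf --contextsize 4096
```

Once it starts, it serves a local web UI you can open in your browser.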
I'm getting broken replies in koboldcpp, although the same model runs perfectly in llama.cpp for me. Not sure why; koboldcpp is usually my go-to.
You need to rebuild koboldcpp or download a newer release.