this post was submitted on 30 Nov 2023
LocalLLaMA
Community to discuss Llama, the family of large language models created by Meta AI.
Did you forget to unset the RoPE settings?
CodeLlama requires different RoPE parameters than regular Llama.
Also check your sampler settings.
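For what it's worth, a minimal sketch of what that looks like on the llama.cpp command line. The model path and prompt are placeholders; the base value comes from CodeLlama having been trained with a RoPE frequency base of 1,000,000 rather than Llama 2's 10,000:

```shell
# Hypothetical model path; point this at your own GGUF file.
# CodeLlama uses rope_freq_base = 1000000 (regular Llama 2 uses 10000),
# so pass it explicitly in case an old preset still carries the default:
./main -m ./codellama-7b.Q4_K_M.gguf \
  --rope-freq-base 1000000 \
  --rope-freq-scale 1.0 \
  -p "Write a Python function that reverses a string." \
  -n 256
```

Newer GGUF files usually store the correct base in their metadata, in which case llama.cpp picks it up automatically and you only need the flags to override a wrong value.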
No, I didn't even know RoPE was a thing; I'm reading about it now. If you have any tl;dr, please post it, this stuff seems pretty complicated.
I was loading the model with a plain llama.cpp invocation and didn't know about RoPE. What would change if I left the default values in place?
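Rough intuition for what the default would do, as a sketch (the formula is the standard RoPE frequency definition; the head dimension and bases are illustrative): RoPE rotates each pair of embedding dimensions by an angle proportional to the token position, and the frequency base controls how fast those angles grow. CodeLlama's much larger base makes the rotations slower, which is what lets it handle long contexts; running it with the regular Llama base makes positions "spin" far faster than it was trained for, which tends to produce garbled output.

```python
import math

def rope_freqs(dim, base):
    # Per-pair rotation frequencies used by RoPE: base^(-2i/dim)
    # for each of the dim/2 dimension pairs.
    return [base ** (-2.0 * i / dim) for i in range(dim // 2)]

head_dim = 128                                 # illustrative head size
llama_default = rope_freqs(head_dim, 10_000.0)     # regular Llama 2 base
codellama = rope_freqs(head_dim, 1_000_000.0)      # CodeLlama base

# Rotation angle for dimension pair i at position p is p * freq[i].
# With the default base the angles grow ~2x faster per pair step,
# so at long positions they wrap far sooner than CodeLlama expects.
p = 4096
i = 16
print(p * llama_default[i], p * codellama[i])
```

So with defaults left on, nothing fails loudly: the model loads and runs, but every position beyond short contexts is encoded differently from training, and quality degrades accordingly.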