this post was submitted on 23 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.

founded 10 months ago

I like 7B models, but 13B ones like Orca 2 are better, no? What is the best?

[–] Nice_Squirrel342@alien.top 1 points 10 months ago

I agree, it's my favourite 7B model too. I use it mainly to help me write bot personalities. It's too bad it's not really fine-tuned for roleplay, otherwise it would be a wrecking ball. And yes, 16k context is broken for me too.

In general, I think it would be nice if people tried merging Mistral models more often, as was done with Mistral-11B-CC-Air-RP. Yes, it has serious problems understanding context, and the characters descend into psychosis, but using a less aggressive quantization (like Q5 or Q6) and the Min-P sampling parameter improves the situation a bit. Apparently something just went wrong during the model merge. Otherwise, this model is really the most unique I've tried: characters talk similarly to the early Character AI.

https://huggingface.co/TheBloke/Mistral-11B-CC-Air-RP-GGUF/tree/main?not-for-all-audiences=true
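For anyone unfamiliar with the Min-P parameter mentioned above, here is a minimal standalone sketch of the idea (my own illustrative function names, not any specific library's API): a token is kept only if its probability is at least `min_p` times the probability of the most likely token, so the cutoff scales with how confident the model is.

```python
def min_p_filter(probs, min_p=0.1):
    """Return indices of tokens that survive Min-P filtering.

    probs: list of token probabilities (need not be sorted).
    min_p: fraction of the top token's probability used as the cutoff.
    """
    # The threshold is relative to the single most likely token,
    # so a confident distribution prunes more aggressively.
    threshold = min_p * max(probs)
    return [i for i, p in enumerate(probs) if p >= threshold]

# Toy distribution: top token has p=0.5, so with min_p=0.1 the
# cutoff is 0.05 and the two low-probability tokens are dropped.
probs = [0.5, 0.3, 0.15, 0.04, 0.01]
print(min_p_filter(probs, min_p=0.1))  # → [0, 1, 2]
```

In llama.cpp-based frontends this corresponds to the `min_p` sampler setting; lower values let more unlikely tokens through, which is why raising it can rein in incoherent merges.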