this post was submitted on 29 Nov 2023
LocalLLaMA
Community to discuss Llama, the family of large language models created by Meta AI.
Play with your sampler settings. The impact on creativity can be pretty significant.
See this, for example:
https://preview.redd.it/yg9jg6r4f93c1.png?width=595&format=png&auto=webp&s=f5f38dd788a60439bf83693dd67cbdef25bbe7d2
The important elements are:
- Min P, which sets a minimum probability threshold as a fraction of the top token's probability. Go no lower than about 0.03 if you want coherence at higher temps.
- Temperature, which rescales the distribution: higher values flatten it, making lower-probability tokens relatively more likely (rough sketch after this list).
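For anyone who wants to see the mechanics, here is a minimal Python sketch of temperature scaling followed by Min-P filtering. The function name and the order of the two steps are my own assumptions; real samplers (llama.cpp, for instance) let you reorder the chain.

```python
import numpy as np

def sample_min_p(logits, temperature=1.0, min_p=0.05, rng=None):
    """Sample a token id using temperature scaling followed by Min-P filtering.

    Tokens whose probability falls below min_p * (top probability) are
    discarded; the rest are renormalized and sampled from.
    """
    rng = rng or np.random.default_rng()
    # Temperature scaling: higher temperature flattens the distribution.
    scaled = np.asarray(logits, dtype=np.float64) / max(temperature, 1e-8)
    # Numerically stable softmax.
    scaled -= scaled.max()
    probs = np.exp(scaled)
    probs /= probs.sum()
    # Min-P: keep tokens whose probability is at least min_p * top probability.
    threshold = min_p * probs.max()
    filtered = np.where(probs >= threshold, probs, 0.0)
    filtered /= filtered.sum()
    return int(rng.choice(len(filtered), p=filtered))
```

For example, `sample_min_p(logits, temperature=3.0, min_p=0.05)` mirrors the kind of settings discussed below: because the cutoff scales with the top probability, it adapts as temperature flattens the distribution.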
I agree. I have these for yichat34: `--top-k 0 --min-p 0.05 --top-p 1.0 --color -t 5 --temp 3 --repeat_penalty 1 -c 4096 -i -n -1`
I think the `--min-p` I have is a bit low, so maybe you have min-p back to front? Lower is more precise, I think.
`--top-k 0` is the same as `--top-k 1`, so fully deterministic, no?
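For reference, a rough sketch of how top-k filtering is commonly implemented. In llama.cpp's CLI a top-k of 0 is typically treated as "disabled" (all tokens stay in play) rather than greedy; that convention is assumed here, and the helper name is my own.

```python
import numpy as np

def top_k_filter(probs, k):
    """Keep only the k highest-probability tokens and renormalize.

    Convention assumed here (as in llama.cpp's CLI): k <= 0 disables the
    filter entirely, while k == 1 keeps only the single best token (greedy).
    """
    probs = np.asarray(probs, dtype=np.float64)
    if k <= 0 or k >= len(probs):
        return probs / probs.sum()       # disabled: no tokens are removed
    top = np.argsort(probs)[-k:]         # indices of the k largest probabilities
    filtered = np.zeros_like(probs)
    filtered[top] = probs[top]
    return filtered / filtered.sum()
```

Under that convention, `--top-k 0` with a nonzero temperature is still stochastic; only `--top-k 1` pins the output to the single most likely token.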