Submitted on 29 Nov 2023 to LocalLLaMA, a community for discussing Llama, the family of large language models created by Meta AI.

I have been playing with LLMs for novel writing. So far, all I have been able to use them for is brainstorming: no matter which model I use, the prose feels wooden, dull, and obviously AI.

Is anyone else doing this? Are there particular models that work really well, or any prompts you recommend? Any workflow advice on how to better leverage LLMs would be much appreciated!

[–] kindacognizant@alien.top 1 points 9 months ago (2 children)

Play with your sampler settings. The impact on creativity can be pretty significant.

See this, for example:

https://preview.redd.it/yg9jg6r4f93c1.png?width=595&format=png&auto=webp&s=f5f38dd788a60439bf83693dd67cbdef25bbe7d2

The important elements are:

- Min P, which sets a minimum probability threshold relative to the top token's probability. Don't go lower than about 0.03 if you want to stay coherent at higher temperatures.

- Temperature, which controls how much weight the lower-probability options get: higher values flatten the distribution and make those options more likely to be picked (see the sketch after this list).
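
To make the interaction concrete, here is a rough Python sketch of min-p filtering combined with temperature scaling. The function name, the toy logits, and the order of the two steps are my own illustration; real samplers may chain the steps differently, so treat this as a sketch of the idea rather than the actual llama.cpp code.

```python
import numpy as np

def sample_min_p(logits: np.ndarray, temperature: float = 1.0, min_p: float = 0.05) -> int:
    """Toy sampler: temperature scaling followed by a min-p cutoff."""
    # Temperature scales the logits: higher values flatten the distribution,
    # so lower-probability tokens become relatively more likely.
    scaled = logits / temperature
    probs = np.exp(scaled - np.max(scaled))
    probs /= probs.sum()

    # Min P discards every token whose probability is below
    # min_p * (probability of the most likely token).
    cutoff = min_p * probs.max()
    probs = np.where(probs >= cutoff, probs, 0.0)
    probs /= probs.sum()

    return int(np.random.choice(len(probs), p=probs))

# Toy 5-token vocabulary: at temp 1.5 the tail tokens survive the 0.05
# cutoff far more often than they would at temp 0.7.
logits = np.array([4.0, 3.5, 2.0, 0.5, -1.0])
print(sample_min_p(logits, temperature=1.5, min_p=0.05))
```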

[–] ambient_temp_xeno@alien.top 1 points 9 months ago (1 children)

I agree. These are my settings for yichat34: `--top-k 0 --min-p 0.05 --top-p 1.0 --color -t 5 --temp 3 --repeat_penalty 1 -c 4096 -i -n -1` (a rough breakdown of the sampler flags follows below).

I think the --min-p I have is a bit low, though, so maybe you have min-p back to front? Lower is more precise, I think.
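
For reference, here is one reading of those sampler flags under common llama.cpp conventions. The "disabled" interpretations are assumptions about the usual defaults, not something stated above.

```python
# Annotated reading of the flags above, assuming common llama.cpp semantics:
# values of 0 / 1.0 typically disable a sampler, so only min-p and
# temperature actually shape the output here.
settings = {
    "top_k": 0,             # 0 usually means top-k is disabled (all candidates kept)
    "min_p": 0.05,          # keep tokens with p >= 0.05 * p(most likely token)
    "top_p": 1.0,           # 1.0 keeps 100% of the probability mass, i.e. a no-op
    "temp": 3.0,            # very high temperature: heavily flattened distribution
    "repeat_penalty": 1.0,  # a penalty of 1.0 applies no penalty at all
}
```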

[–] pseudonerv@alien.top 1 points 9 months ago

--top-k 0 is the same as --top-k 1, so fully deterministic, no?
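
For what it's worth, the usual llama.cpp-style convention (as I understand it, so treat this as an assumption rather than a quote from the docs) is that top_k <= 0 disables the filter and keeps every candidate, whereas top_k = 1 keeps only the single most likely token, i.e. greedy decoding. A minimal sketch:

```python
import numpy as np

def top_k_filter(probs: np.ndarray, k: int) -> np.ndarray:
    """Toy top-k filter following the convention that k <= 0 means 'disabled'."""
    if k <= 0 or k >= len(probs):
        return probs                      # disabled: keep every candidate
    cutoff = np.sort(probs)[-k]           # k-th largest probability
    kept = np.where(probs >= cutoff, probs, 0.0)
    return kept / kept.sum()

probs = np.array([0.5, 0.3, 0.2])
print(top_k_filter(probs, 0))   # [0.5 0.3 0.2]  -> unchanged, not deterministic
print(top_k_filter(probs, 1))   # [1.  0.  0. ]  -> greedy
```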