I think Llama 1 had more interesting training data, but it can’t hold a plot very well
LocalLLaMA
Community to discuss Llama, the family of large language models created by Meta AI.
I think it's best to keep temperature low and feed the randomness in manually. For example, generate a list of words, ask the model to make an association for each, and use those as inspiration for characters, plot, whatever. Or make a list of options for each thing that makes sense, have a model generate Python code to randomly select one of each, and then put those random selections in the prompt.
Because in my experience, a temperature much higher than zero mostly just makes the model dumber.
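A minimal sketch of the second approach above: let code supply the randomness instead of the sampling temperature, then paste the picks into the prompt. The categories and entries here are made-up placeholders, not from any particular setup:

```python
import random

# Hypothetical option lists -- categories and entries are just examples.
options = {
    "setting": ["seaside village", "orbital station", "desert caravan"],
    "protagonist": ["retired cartographer", "apprentice smith", "stowaway"],
    "conflict": ["a stolen letter", "a failing harvest", "a rival's wager"],
}

def pick_prompt_seeds(options, seed=None):
    """Randomly select one entry per category to inject into the prompt."""
    rng = random.Random(seed)  # seed it for reproducible picks
    return {category: rng.choice(entries) for category, entries in options.items()}

picks = pick_prompt_seeds(options, seed=42)
prompt = "Write a short story. " + " ".join(
    f"{category.capitalize()}: {choice}." for category, choice in picks.items()
)
print(prompt)
```

The model then generates at low temperature from this prompt, so variety comes from the selections rather than from noisier token sampling.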
That's an interesting idea. In my experience anything <1 works, >1.2 goes wild, and for things we expect to be more deterministic, setting it to 0 is preferred.
What's your best setup and temperature for creative writing?
Same answer actually, for creative writing: 0
Why bother with any of the base models instead of using a writing-specific model? DreamGen Opus 70B is pretty good and was tuned specifically for creative writing.