this post was submitted on 16 Nov 2023
1 point (100.0% liked)

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


Currently we set the temperature manually and keep it fixed for the whole chat. Wouldn't it make more sense to let the model choose a temperature itself depending on the topic?

top 4 comments
[–] Aaaaaaaaaeeeee@alien.top 1 points 1 year ago

"So, in contexts where the top token is 6%, a Min P of 0.1 will only consider tokens that are at least 0.6% probable. But if the top token is 95%, it will only consider tokens at least 9.5% probable."

Isn't there a big post on automatic temperature scaling?

[–] mcmoose1900@alien.top 1 points 1 year ago
[–] _Erilaz@alien.top 1 points 11 months ago

Isn't DynaTemp just that?

[–] a_beautiful_rhind@alien.top 1 points 11 months ago

Automatic temperature does work. It goes great with min_P. I like it more than mirostat and hope everything implements it.
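
For anyone wondering what "automatic temperature" means concretely: DynaTemp-style samplers pick the temperature per step from how uncertain the model already is, so confident steps sample conservatively and uncertain ones sample more freely. A minimal sketch of that idea, assuming raw numpy logits; min_temp, max_temp, and exponent are illustrative parameter names, not any backend's exact API:

```python
import numpy as np

def dynamic_temperature(logits: np.ndarray,
                        min_temp: float = 0.5,
                        max_temp: float = 1.5,
                        exponent: float = 1.0) -> np.ndarray:
    """Scale temperature with the normalized entropy of the next-token
    distribution, then re-apply it to the logits."""
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    # Shannon entropy, normalized by its maximum (a uniform distribution)
    entropy = -np.sum(probs * np.log(probs + 1e-12))
    max_entropy = np.log(len(probs))
    norm_entropy = entropy / max_entropy if max_entropy > 0 else 0.0
    # Low entropy (confident) -> near min_temp; high entropy -> near max_temp
    temp = min_temp + (max_temp - min_temp) * norm_entropy ** exponent
    return logits / temp
```

Running min_P filtering first and then a dynamic temperature like this is the combination being endorsed above.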