this post was submitted on 21 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.

founded 10 months ago
Is this accurate?

lxe@alien.top · 10 months ago

Agreed. Best performance running GPTQs. Missing the HF samplers, but that's OK.

ReturningTarzan@alien.top · 10 months ago

I recently added Mirostat, min-P (the new one), tail-free sampling, and temperature-last as an option. I don't personally put much stock in having an overabundance of sampling parameters, but they are there now, for better or worse. So for the exllamav2 (non-HF) loader in TGW, it can't be long before there's an update to expose those parameters in the UI.
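For anyone curious what min-P and temperature-last actually mean, here's a rough self-contained sketch. This is not exllamav2's actual code; the function names and default values are made up for illustration. Min-P keeps only tokens whose probability is at least `min_p` times the top token's probability, and temperature-last applies temperature after that truncation instead of before it, so the set of surviving tokens doesn't shift as you change temperature:

```python
import math

def softmax(logits):
    """Convert raw logits to probabilities (numerically stable)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def min_p_filter(probs, min_p):
    """Zero out tokens below min_p times the top token's probability."""
    threshold = min_p * max(probs)
    return [p if p >= threshold else 0.0 for p in probs]

def sample_distribution(logits, min_p=0.1, temperature=0.8,
                        temperature_last=True):
    """Return the final sampling distribution over tokens.

    Illustrative sketch only (not exllamav2's implementation).
    With temperature_last=True, min-P truncates the untempered
    distribution first, and temperature only reshapes the survivors.
    """
    if temperature_last:
        probs = softmax(logits)
        kept = min_p_filter(probs, min_p)
        # Applying temperature T to probabilities is p ** (1/T),
        # equivalent to dividing the logits by T before softmax.
        tempered = [p ** (1.0 / temperature) if p > 0 else 0.0
                    for p in kept]
    else:
        # Classic order: temperature first, then truncation.
        probs = softmax([x / temperature for x in logits])
        tempered = min_p_filter(probs, min_p)
    total = sum(tempered)
    return [t / total for t in tempered]
```

For example, `sample_distribution([5.0, 4.0, 0.0])` drops the third token (its probability is far below 10% of the top token's) and renormalizes over the remaining two.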