this post was submitted on 13 Nov 2023
LocalLLaMA
Community for discussing Llama, the family of large language models created by Meta AI.
Best? Goliath-120b. It's good, better than the 70b models I've used. Currently available on KoboldHorde if you want to try it, or there are GGUFs etc if you have the compute to run it locally. If that's just a little too rich for your tastes, then Xwin-70b is probably the go-to at high parameter counts.
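If you do want to run one of the GGUF quants locally, here's a minimal sketch using llama-cpp-python. The filename and sampling settings are placeholders, not anything official for Goliath, and a 120b model needs serious RAM/VRAM even at Q4, so substitute whatever quant actually fits your hardware.

```python
# Minimal sketch: run a local GGUF quant with llama-cpp-python
# (pip install llama-cpp-python). Model filename is hypothetical.
from llama_cpp import Llama

llm = Llama(
    model_path="./goliath-120b.Q4_K_M.gguf",  # placeholder: point at your downloaded quant
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload all layers to GPU if you have the VRAM
)

out = llm(
    "Write a short scene where two rival wizards argue over a teapot.",
    max_tokens=256,
    temperature=0.8,
)
print(out["choices"][0]["text"])
```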
Best with any sort of reasonable hardware requirements? Mlewd-20b and Xwin-Mlewd-13b are both good, and some of the Mistral-7b merges are punching way above their weight. Check out Dolphin-2.2.1-Mistral-7b and compare it with the 7b models from three months back; the difference is striking.
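For the 7b end of the scale, something like the sketch below is enough to get Dolphin-2.2.1-Mistral-7b running on modest hardware. The repo id and quant filename are assumptions on my part, so check the model card on Hugging Face for the exact names before copying this.

```python
# Rough sketch: fetch a 7b GGUF quant from the Hugging Face Hub and run it.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="TheBloke/dolphin-2.2.1-mistral-7B-GGUF",  # assumed repo name
    filename="dolphin-2.2.1-mistral-7b.Q4_K_M.gguf",   # assumed quant filename
)

llm = Llama(model_path=model_path, n_ctx=4096, n_gpu_layers=-1)
out = llm("Summarize why small merged models have improved so fast.", max_tokens=128)
print(out["choices"][0]["text"])
```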