this post was submitted on 17 Nov 2023
LocalLLaMA
Community to discuss Llama, the family of large language models created by Meta AI.
Almost the same syntax as Yi Capybara. Excellent.
I propose all Yi 34B 200K finetunes use Vicuna-ish prompt syntax, so they can ALL be merged into one hellish voltron model.
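To be concrete, here's a minimal sketch of the Vicuna-style format I mean; the exact system prompt wording and spacing vary between finetunes, so treat the text below as an assumption rather than the canonical template:

```python
# Minimal sketch of a Vicuna-style prompt builder. The system prompt text
# and exact spacing differ between finetunes; this only shows the general
# USER:/ASSISTANT: shape that makes the models prompt-compatible for merging.
DEFAULT_SYSTEM = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's questions."
)

def build_prompt(user_message: str, system: str = DEFAULT_SYSTEM) -> str:
    return f"{system} USER: {user_message} ASSISTANT:"

print(build_prompt("Summarize the last 100K tokens of this chat log."))
```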
The deed is done:
https://huggingface.co/brucethemoose/Capybara-Tess-Yi-34B-200K
Seems coherent in transformers; I'm gonna quant it to exl2 and test it out.
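For anyone else who wants to poke at it, something like this is enough for a quick coherence check in transformers. The 4-bit bitsandbytes config is just an assumption to keep VRAM manageable for a 34B model, so adjust to taste:

```python
# Rough sketch: load the merged model in transformers and run a quick
# coherence check. 4-bit quantization via bitsandbytes is assumed here
# only to fit the 34B weights into a single consumer GPU.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "brucethemoose/Capybara-Tess-Yi-34B-200K"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # add trust_remote_code=True if the checkpoint requires it
    quantization_config=BitsAndBytesConfig(
        load_in_4bit=True,
        bnb_4bit_compute_dtype=torch.float16,
    ),
)

# Vicuna-ish prompt, per the syntax discussed above.
prompt = "USER: Give me a one-sentence summary of the Yi 34B 200K models. ASSISTANT:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```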
Just wanted to come back and let you know I started using this last night, and it's fantastic. I haven't put it through much testing yet, but on initial use I'm very impressed by this model as a general-purpose AI assistant. It keeps the assistant's more informal speech patterns while also answering questions well and keeping up with a large context. Those are three checkboxes I've never been able to check at once. This praise won't get much visibility since it's an older thread, but I wanted to let you know at least.