Sorry, I was playing around with it for the last day… So far I prefer it over 34B Dolphin Yi (GGUF Q4_K_M)… As for context size, I only used 8K and it was pretty good at recalling things far back. It might be able to do 12K, but I haven't tried it.
Majestical-psyche
He was probably repositioned somewhere else within Microsoft.
Now we need someone to de-neuter this and make it less biased. I wonder how both the 7B and 13B compare with Mistral. So far it's just hype until I see other people's results and tests.