this post was submitted on 15 Nov 2023
LocalLLaMA
you are viewing a single comment's thread
Is this a scam or what? None of the models above are from NurtureAI:
- zephyr-beta is trained by HuggingFace and is 32K by default
- neural-chat is from Intel
- synthia is from migtissera
Original links:
https://huggingface.co/HuggingFaceH4/zephyr-7b-beta
https://huggingface.co/Intel/neural-chat-7b-v3-1
https://huggingface.co/migtissera/SynthIA-7B-v2.0
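If you want to verify the advertised context lengths yourself, here's a minimal sketch using transformers' AutoConfig against the original repos linked above (it fetches only config.json, no weights; all three originals are Mistral-7B fine-tunes, so the expected value is 32768, assuming their configs match the base model's):

```python
from transformers import AutoConfig

# Inspect only config.json (no weights download) to see each repo's
# advertised maximum context length and sliding attention window.
for repo in [
    "HuggingFaceH4/zephyr-7b-beta",
    "Intel/neural-chat-7b-v3-1",
    "migtissera/SynthIA-7B-v2.0",
]:
    cfg = AutoConfig.from_pretrained(repo)
    print(repo, cfg.max_position_embeddings, getattr(cfg, "sliding_window", None))
```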
NurtureAI extended the context size to 16k
the context was already 32K
https://preview.redd.it/5jl7c7a53i0c1.png?width=958&format=png&auto=webp&s=ae51ae2b52717bb5ab14bed76580e7e0a45075ed
So assuming this release does anything at all, the only thing I can think of is that instead of the "hidden size" being 4k, giving a 4k sliding window into the 32k context, it would be a hidden size of 16k, giving a 16k window into the 32k context.
However, that's just speculation on my part, because otherwise the release means nothing... which would be weird.
That's not what hidden size does.
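To unpack that reply: hidden_size is an embedding width measured in features, not a token count, so it has nothing to do with the sliding window or the context length. A minimal sketch inspecting the original config (the expected values assume zephyr-7b-beta inherits Mistral-7B-v0.1's settings):

```python
from transformers import AutoConfig

cfg = AutoConfig.from_pretrained("HuggingFaceH4/zephyr-7b-beta")

# hidden_size is the width of each token's hidden-state vector (a model
# dimension), not a number of tokens; it says nothing about how much
# context the model can attend to.
print(cfg.hidden_size)              # expected: 4096

# Context behaviour comes from separate config fields:
print(cfg.sliding_window)           # expected: 4096  (sliding attention window, in tokens)
print(cfg.max_position_embeddings)  # expected: 32768 (maximum context length, in tokens)
```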