LLMs on neuroengine.ai should support well over 400 words; I don't know the exact limit.
LocalLLaMA
Community to discuss Llama, the family of large language models created by Meta AI.
I'm not sure what the limit is on Text Generation UI, which is fully local.
I don't think infermatic.ai has a limit either.
Technically the Horde doesn't have a limit, but most hosts are running 4K-8K context models.
Hey, I'm one of the maintainers of chat.lmsys.org. We previously set this limit to avoid heavy compute, but we are considering increasing it. How long is your input, typically?
And is there a plan to provide paid APIs for the available models that we can use programmatically, like OpenAI API?
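If such a paid API were offered in the OpenAI-compatible style the comment above mentions, the request shape is well known. A minimal sketch of building that payload, assuming the standard chat-completions format (the model name here is just a placeholder, not a confirmed offering):

```python
import json

def build_chat_request(model, user_message, max_tokens=512):
    """Build an OpenAI-style chat-completions request payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "max_tokens": max_tokens,
    }

# Placeholder model name for illustration only.
payload = build_chat_request("vicuna-13b", "Hello!")
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to the service's endpoint with an API key, exactly as with the OpenAI API.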
Hey, thank you for replying! 400 words per message is enough for most uses, but sometimes I need longer messages, around 600 words. chat.lmsys.org has many great models that are updated all the time, so it would be great to be able to use them with longer messages. Thanks!
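Until the limit is raised, one workaround for messages like the 600-word case above is to split the text into chunks under the cap. A rough sketch (counting whitespace-separated words, which is only an approximation of how a site might measure length):

```python
def split_by_word_limit(text, limit=400):
    """Split text into chunks of at most `limit` whitespace-separated words."""
    words = text.split()
    return [" ".join(words[i:i + limit]) for i in range(0, len(words), limit)]

# A 600-word message under a 400-word cap splits into two chunks (400 + 200).
chunks = split_by_word_limit("word " * 600, limit=400)
```

Note this breaks at arbitrary word boundaries; splitting at sentence or paragraph boundaries would preserve more context for the model.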
Hey, I have a question, please. Are you using the original models or quantized versions of them?
All are original, with no quantization at all, for fair comparison.
Perfect!