this post was submitted on 22 Nov 2023

Machine Learning


I'm looking for insights and advice on extending the context window of LLMs (specifically Mistral).

Whether you're a researcher, developer, or enthusiast in the field, I'd love to hear about your experiences and recommendations. Are there any specific techniques, methodologies, or tools you've found effective in extending the context window for LLMs?

Additionally, if you've encountered challenges in this area, how did you overcome them? Any resources, papers, or community discussions you can point me to would be greatly appreciated.

Thenameis_Meme@alien.top 1 points 11 months ago

Have you considered utilizing sliding window techniques to expand the effective context window? It's a commonly used approach that can process inputs longer than the model's native window without overwhelming computational resources (Mistral 7B itself uses sliding-window attention internally). Additionally, leveraging hierarchical approaches (e.g., summarizing chunks, then summarizing the summaries) or incorporating external knowledge sources via retrieval could also help. Good luck with your exploration!
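To illustrate the sliding-window idea at the input level: a minimal sketch that splits a long token sequence into overlapping chunks so each fits the model's native window while consecutive chunks share some context. The function name, window size, and overlap below are illustrative, not from any particular library.

```python
def sliding_windows(tokens, window_size=4096, overlap=512):
    """Split a long token sequence into overlapping windows.

    Each window shares `overlap` tokens with the previous one so the
    model keeps some context across chunk boundaries. Values here are
    placeholders; tune them to your model's actual context length.
    """
    if window_size <= overlap:
        raise ValueError("window_size must be larger than overlap")
    step = window_size - overlap  # how far the window advances each time
    windows = []
    for start in range(0, len(tokens), step):
        windows.append(tokens[start:start + window_size])
        if start + window_size >= len(tokens):
            break  # last window already covers the end of the sequence
    return windows

# Example: a 10,000-token input split into three overlapping 4k windows
chunks = sliding_windows(list(range(10_000)), window_size=4096, overlap=512)
```

Each chunk is then fed to the model separately (optionally carrying a summary of earlier chunks along), trading a single long pass for several bounded ones.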