I might be wrong, but this sounds a lot like a liquid neural network, able to adapt and change on demand.
Is….. that a thing? I need that in my life.
Maybe some kind of MemGPT..
Do you have a specific use case or need in mind? If you want it to remember things, you wouldn't necessarily 'feed it into an LLM', but if you want it to produce output more like how you'd speak, then fine-tuning would probably be appropriate.
Depending on what you wanna do, it will have different design requirements.
In general, I'd ask what's the desired goal first.
You will need to feed the model the conversation log every time you query it, so you'd be limited by the model's context length.
With a 100k-context model you'd be able to keep a chat log of roughly 70,000-100,000 words, which is about the length of a normal book.
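To make that concrete, here's a minimal sketch of what "feed it the conversation log" could look like. The count_tokens() placeholder and the plain prompt string are my own assumptions; a real setup would use the model's tokenizer and chat template.

```python
# Minimal sketch: prepend as much of the chat log as fits in the context window.
MAX_CONTEXT_TOKENS = 100_000   # e.g. a 100k-context model
RESERVED_FOR_REPLY = 1_000     # leave room for the model's answer

history = []  # list of (role, text) tuples, oldest first

def count_tokens(text: str) -> int:
    # Placeholder: a real implementation would use the model's tokenizer.
    return len(text.split())

def build_prompt(history, user_message):
    """Build a prompt containing as many recent turns as the budget allows."""
    budget = MAX_CONTEXT_TOKENS - RESERVED_FOR_REPLY - count_tokens(user_message)
    kept = []
    # Walk backwards so the most recent turns survive when space runs out.
    for role, text in reversed(history):
        cost = count_tokens(text)
        if cost > budget:
            break
        kept.append(f"{role}: {text}")
        budget -= cost
    kept.reverse()
    return "\n".join(kept + [f"user: {user_message}"])
```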
I am working on this exact product, and the way I'm approaching it is a database with different levels of abstraction for each day.
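For illustration, here's a rough sketch of what "different levels of abstraction per day" might look like with SQLite. The table names and the summarize() stand-in are my own assumptions, not the poster's actual schema.

```python
# Sketch: raw interactions plus per-day summaries at several abstraction levels.
import sqlite3
from datetime import date

db = sqlite3.connect("memory.db")
db.execute("""CREATE TABLE IF NOT EXISTS interactions (
    ts TEXT, role TEXT, text TEXT)""")
db.execute("""CREATE TABLE IF NOT EXISTS daily_summaries (
    day TEXT, level INTEGER, summary TEXT,
    PRIMARY KEY (day, level))""")

def summarize(texts, level):
    # Placeholder: call the LLM to compress the day's text; higher level = shorter.
    return " ".join(texts)[: 500 // level]

def roll_up(day: date):
    """Build progressively more abstract summaries for one day."""
    rows = db.execute(
        "SELECT text FROM interactions WHERE ts LIKE ?", (f"{day}%",)
    ).fetchall()
    texts = [r[0] for r in rows]
    for level in (1, 2, 3):  # 1 = detailed recap, 3 = one-line gist
        db.execute(
            "INSERT OR REPLACE INTO daily_summaries VALUES (?, ?, ?)",
            (day.isoformat(), level, summarize(texts, level)),
        )
    db.commit()
```

At query time you'd pull the coarse summaries for older days and the detailed ones for recent days, so the memory you inject stays within the context budget.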
Couldn't you just timestamp each interaction (input and output)?
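If you went that route, logging a timestamp with each turn could be as simple as the snippet below; log_interaction is a hypothetical helper reusing the interactions table from the sketch above.

```python
# Sketch: store each user/assistant exchange with a UTC timestamp.
from datetime import datetime, timezone

def log_interaction(db, user_input: str, model_output: str):
    ts = datetime.now(timezone.utc).isoformat()
    db.execute("INSERT INTO interactions VALUES (?, ?, ?)", (ts, "user", user_input))
    db.execute("INSERT INTO interactions VALUES (?, ?, ?)", (ts, "assistant", model_output))
    db.commit()
```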