
LocalLLaMA

Community to discuss Llama, the family of large language models created by Meta AI.


I'm curious whether there's an ideal setup or pipeline for getting an LLM to listen and "learn" from you if you just feed it info every day, like a personal diary. I'd be interested to see how the model recalls or processes details of my life. Would you just use a web UI like oobabooga to feed it info and adapt the model?

[–] Severin_Suveren@alien.top 1 points 1 year ago

You will need to feed the model the conversation log every time you query it, so you'd be limited by the model's context length.

With a 100k-context model you'd be able to keep a chat log of roughly 70,000-100,000 words, which is about the length of a normal book.
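
To make that concrete, here's a minimal sketch of that pattern: keep the whole diary/chat log, resend it with every query, and drop the oldest entries once it no longer fits the context window. It assumes an OpenAI-compatible local chat endpoint (for example, the API oobabooga's text-generation-webui can expose); the URL, the characters-per-token estimate, and the helper names are placeholders, not exact values.

    # Sketch: re-send the full diary log with every query, trimmed to the context budget.
    import requests

    API_URL = "http://localhost:5000/v1/chat/completions"  # assumed local endpoint
    CONTEXT_TOKENS = 100_000      # model's context window
    RESERVED_FOR_REPLY = 1_000    # leave room for the model's answer

    def approx_tokens(text: str) -> int:
        # Rough heuristic: ~4 characters per token for English text.
        return len(text) // 4

    def trim_log(log: list[dict]) -> None:
        # Drop the oldest entries until the log fits in the context budget.
        budget = CONTEXT_TOKENS - RESERVED_FOR_REPLY
        while log and sum(approx_tokens(m["content"]) for m in log) > budget:
            log.pop(0)

    def ask(log: list[dict], user_message: str) -> str:
        log.append({"role": "user", "content": user_message})
        trim_log(log)
        resp = requests.post(API_URL, json={"messages": log, "max_tokens": RESERVED_FOR_REPLY})
        reply = resp.json()["choices"][0]["message"]["content"]
        log.append({"role": "assistant", "content": reply})
        return reply

    # Usage: append diary entries day by day, then ask about them.
    diary: list[dict] = []
    ask(diary, "Diary, 01 Nov: started experimenting with local LLMs today.")
    print(ask(diary, "What did I do on 01 Nov?"))

Once the log outgrows the window, older entries silently fall out of the prompt, which is exactly the limitation described above; anything beyond that needs summarization, retrieval, or fine-tuning rather than raw context.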