this post was submitted on 13 Nov 2023
LocalLLaMA
Community to discuss about Llama, the family of large language models created by Meta AI.
you are viewing a single comment's thread
Quick answer: No.
Longer answer: It depends. Passing the whole thing in as context won't work; it's far too much data for the context window, among other problems. Instead, you could use a model that generates SQL to query your database based on the user's input, then either return the results directly or have a second model (e.g. a quantized 7B) interpret them.
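As a rough sketch of that pipeline: the step that actually calls a local model is stubbed out below (the `generate_sql` function and its hard-coded query are placeholders, not a real model integration), but the surrounding flow, generate SQL from the question, run it, hand back the rows, is the shape of it.

```python
import sqlite3

def generate_sql(question: str) -> str:
    # Placeholder for a local model (e.g. a quantized 7B) prompted to
    # translate the question into SQL against a known schema. Hard-coded
    # here to keep the sketch self-contained.
    return "SELECT title FROM notes WHERE title LIKE '%meeting%'"

def answer(question: str, conn: sqlite3.Connection) -> list:
    # Generate a query from the question, execute it, return the rows.
    # The rows (not the raw question) are what you'd feed to a second
    # model for interpretation, if you go that route.
    sql = generate_sql(question)
    return conn.execute(sql).fetchall()

# Toy notes database standing in for your real one.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE notes (id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany(
    "INSERT INTO notes (title) VALUES (?)",
    [("meeting notes 2023-11-13",), ("grocery list",)],
)

rows = answer("Which notes mention a meeting?", conn)
print(rows)
```

The upside of this over stuffing context is that the database does the retrieval, so only the matching rows ever reach a model.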
But generally, I see the idea of the 'AI assistant' come up here regularly, and the real question is whether you want to rely on an LLM that just makes things up when accessing your notes. I guess that depends on how important the subject is.