this post was submitted on 11 Nov 2023

LocalLLaMA

Community to discuss about Llama, the family of large language models created by Meta AI.

I want to ask for advice on two related topics:

  1. If I have a corpus of many documents embedded in a vector store, how can I dynamically select a subset of them (by metadata, for example) and perform retrieval only on that subset for answer generation? (Roughly what I mean is sketched after this list.)

  2. I want LLaMA to be able to say "I don't know" when the retrieved context cannot answer the question. From what I have seen, this behavior is not yet stable.
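
For point 1, this is roughly what I am trying to do, sketched against Chroma's `where` metadata filter. The collection name, metadata keys, and filter values are made up for illustration; other stores and LangChain retrievers appear to expose similar filter arguments.

```python
import chromadb

# Toy in-memory store; in practice this would be an existing persisted collection.
client = chromadb.Client()
collection = client.create_collection("docs")

# Each chunk carries metadata (illustrative keys and values) used later for filtering.
collection.add(
    ids=["a1", "a2", "b1"],
    documents=[
        "Llama 2 was released by Meta AI in July 2023.",
        "The 70B variant is the largest Llama 2 model.",
        "Unrelated text about databases.",
    ],
    metadatas=[
        {"source": "llama_notes", "year": 2023},
        {"source": "llama_notes", "year": 2023},
        {"source": "misc", "year": 2022},
    ],
)

# Retrieval restricted to a metadata-defined subset: only chunks whose
# "source" field matches are considered for nearest-neighbour search.
results = collection.query(
    query_texts=["When was Llama 2 released?"],
    n_results=2,
    where={"source": "llama_notes"},
)
print(results["documents"][0])
```

With LangChain on top of Chroma, the equivalent appears to be something like `vectorstore.as_retriever(search_kwargs={"filter": {"source": "llama_notes"}})`, though the exact filter syntax depends on the underlying store.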

Thank you so much!

[–] vec1nu@alien.top 1 points 1 year ago

Use something like lmql, guidance, or guardrails to get the model to say it doesn't know. I've also had some success with the airoboros fine-tuned models, which have this behaviour defined in the dataset using a specific prompt.
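
A library-free fallback in the same spirit, in case it helps: not lmql or guidance, just a strict prompt plus a retrieval-distance cutoff. The `generate` callable and the cutoff value are assumptions for the sketch; the collection is a Chroma-style one as in the original question.

```python
REFUSAL = "I don't know."
DISTANCE_CUTOFF = 0.35  # assumed value; tune it for your embedding model


def answer(question, collection, generate):
    """Answer from retrieved context, or refuse.

    `collection` is a Chroma-style collection; `generate` is a hypothetical
    stand-in for whatever local inference call you use (llama.cpp, transformers, ...).
    """
    hits = collection.query(query_texts=[question], n_results=3)
    docs = hits["documents"][0]
    distances = hits["distances"][0]

    # If nothing retrieved is close enough, refuse before the model is even called.
    if not docs or min(distances) > DISTANCE_CUTOFF:
        return REFUSAL

    context = "\n\n".join(docs)
    prompt = (
        "Answer the question using only the context below. "
        f'If the context does not contain the answer, reply exactly "{REFUSAL}"\n\n'
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )
    return generate(prompt)
```

The cutoff keeps clearly off-topic questions from ever reaching the model, and pinning the exact refusal string in the prompt makes the behaviour easier to test for.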