this post was submitted on 12 Nov 2023
LocalLLaMA
Community to discuss about Llama, the family of large language models created by Meta AI.
Maybe it's overkill, idk, but if you want higher accuracy, it's an option
You can just list examples from your dataset and let the LLM complete the last one.
Example:
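Something along these lines, say. This is a hypothetical sketch of the pattern: the task (ticket triage), the labels, and the example texts below are all made up for illustration, and you'd swap in items from your own dataset.

```python
# Hypothetical sketch of a few-shot prompt: list labeled examples from
# your dataset, then leave the final label blank for the model to complete.
# Task, labels, and texts here are invented for illustration.

def build_few_shot_prompt(examples, query):
    """Concatenate (text, label) pairs, ending with the unlabeled query."""
    parts = []
    for text, label in examples:
        parts.append(f"Text: {text}\nLabel: {label}\n")
    # The last "Label:" is left empty -- the model's completion is the answer.
    parts.append(f"Text: {query}\nLabel:")
    return "\n".join(parts)

examples = [
    ("The checkout page keeps crashing", "bug report"),
    ("Please add dark mode", "feature request"),
    ("How do I reset my password?", "question"),
]

prompt = build_few_shot_prompt(examples, "The app freezes when I upload a photo")
print(prompt)
```

You'd send `prompt` to whatever model you're running and read the label off the first line of the completion (a low temperature and a stop sequence like `"\n"` help keep the output to just the label).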
I don't know about the task you have in mind specifically, but you can do just about anything with a 13B Llama model. Picking a fine-tune doesn't matter much if you use examples instead of instructions. 7B Mistral seems to do fine with this approach (even GPT-2 can do some classification), but in-context learning is remarkably better at 13B, picking up a lot more nuance.
+1, when in doubt, LLM it out.
You could also ask for explanations, so when it gets something wrong you can work on modifying your prompts/examples to get better performance.
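One way to do that is to give each few-shot example a short rationale alongside its label, so the model produces one too. A hypothetical sketch (the field names, task, and texts are made up, not from the thread):

```python
# Hypothetical sketch: attach a short "Explanation:" to each labeled
# example so the model justifies its answer, which makes wrong outputs
# easier to debug. All field names and examples are invented.

def build_prompt_with_explanations(examples, query):
    """Each example carries text, label, and a one-line rationale;
    the final item is left for the model to label and explain."""
    parts = []
    for text, label, why in examples:
        parts.append(f"Text: {text}\nLabel: {label}\nExplanation: {why}\n")
    parts.append(f"Text: {query}\nLabel:")
    return "\n".join(parts)

examples = [
    ("The checkout page keeps crashing", "bug report",
     "Describes broken behavior in an existing feature."),
    ("Please add dark mode", "feature request",
     "Asks for new functionality that doesn't exist yet."),
]

print(build_prompt_with_explanations(examples, "How do I reset my password?"))
```

When the model mislabels something, the explanation usually shows which cue it latched onto, which tells you what kind of example to add to the prompt.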
Potentially you wouldn't want to do this if: