Use the proper tools for the job. Guidance (reborn just last week) and LMQL are two frameworks that can constrain any local model to output a valid JSON object.
this post was submitted on 20 Nov 2023
LocalLLaMA
Community to discuss about Llama, the family of large language models created by Meta AI.
llama.cpp supports BNF-style grammars (its GBNF format). You give the model runner a grammar describing the exact shape of the output, where the data goes in it, and even which fixed list of values you expect, and it will only produce output matching that grammar.
Useful for generating JSON and for document classification.
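As a sketch, a grammar for the classification case might look like the following (GBNF syntax; the `label` field name and value list are made up for the example):

```gbnf
# Force output to be a small JSON object whose "label" value
# must come from a fixed set of allowed strings.
root  ::= "{" ws "\"label\":" ws label ws "}"
label ::= "\"positive\"" | "\"negative\"" | "\"neutral\""
ws    ::= [ \t\n]*
```

Saved to a file, this can be passed to llama.cpp with the `--grammar-file` flag, after which the sampler rejects any token that would break the grammar, so the output is always one of the three allowed JSON objects.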