this post was submitted on 20 Nov 2023

LocalLLaMA


Community to discuss about Llama, the family of large language models created by Meta AI.


Does anyone know how function calling works under the hood?

top 2 comments

Use the proper tool for the job. guidance (relaunched just last week) and LMQL are two frameworks that can "enforce" a JSON-object output from any local model.
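Under the hood, frameworks like these interleave text the model is forced to emit with choice points where the model may only pick from options that keep the output valid. A minimal, runnable sketch of that idea (not the actual guidance or LMQL API; `mock_model_scores` stands in for a real LLM and all names here are illustrative):

```python
import random

def mock_model_scores(prefix, candidates):
    # A real model would return a logit per candidate token;
    # random scores keep this sketch self-contained and runnable.
    return {tok: random.random() for tok in candidates}

def constrained_generate(template):
    """Fill a template where plain strings are emitted verbatim and
    lists are choice points. The model is only ever asked to choose
    among options that keep the output well-formed, so the final
    string is valid by construction."""
    out = []
    for part in template:
        if isinstance(part, str):
            out.append(part)  # forced text: no model call needed
        else:
            scores = mock_model_scores("".join(out), part)
            out.append(max(part, key=scores.get))  # best *allowed* option
    return "".join(out)

# Force a JSON object; only the field value is left to the "model".
template = ['{"sentiment": "', ["positive", "negative"], '"}']
print(constrained_generate(template))
```

Whatever the mock scores are, the result always parses as JSON with one of the two permitted values.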

[–] tothatl@alien.top 1 points 2 years ago

llama.cpp supports GBNF grammars (a BNF-style format). You basically tell the model runner the exact format of the output, where the data goes in it, and even which list of values you expect, and it will only produce output in that format.

Useful for generating JSON and for document classification.
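For illustration, a minimal grammar in llama.cpp's GBNF format that constrains output to a tiny JSON object (the rule bodies and the sentiment example are mine, not from the thread):

```
# Generation starts from the `root` rule; literals are quoted,
# alternatives are separated by |.
root ::= "{\"sentiment\": " value "}"
value ::= "\"positive\"" | "\"negative\""
```

Passed to the runner (e.g. via `--grammar-file`), any token that would step outside the grammar is masked out at sampling time, so only matching text can ever be generated.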