this post was submitted on 20 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


Does anyone know how function calling works under the hood?

[–] tothatl@alien.top 1 points 2 years ago

llama.cpp supports BNF-style grammars (its GBNF format). You give the model runner a grammar describing the exact shape of the output, including where the data goes and even the list of values you expect, and it will only produce tokens that match that grammar.

Useful for generating JSON and for document classification.
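As a sketch, a minimal grammar for the classification case might look like this in llama.cpp's GBNF notation (the field name and labels here are just illustrative, not from the original comment):

```
# Constrain output to a tiny JSON object with one field
# whose value must be one of three labels.
root      ::= "{" ws "\"sentiment\"" ws ":" ws sentiment ws "}"
sentiment ::= "\"positive\"" | "\"negative\"" | "\"neutral\""
ws        ::= [ \t\n]*
```

A grammar like this can be passed to the runner (e.g. via `--grammar-file` on llama.cpp's CLI), and sampling is then restricted so the model can only emit strings the grammar accepts.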