this post was submitted on 31 Oct 2023
LocalLLaMA


Community for discussing Llama, the family of large language models created by Meta AI.

founded 1 year ago

So I'm looking for references on how to do function calling using Dolphin or Mistral models.

With my current prompt, I can sometimes get it to choose an appropriate command for the task, but it often adds multiple commands in one response. About half the time it produces the correct command and parameters in JSON format as requested. Sometimes it makes up commands it wants to use that don't exist in the command list.

I'm just looking for hints at a more concrete prompt that will make these models effective in function calling.

Should I try whatever format OpenAI uses, seeing as these smaller models are usually trained on synthetic data produced by OpenAI models?
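For reference, the OpenAI-style shape looks roughly like this: you pass a list of function definitions (JSON Schema for the parameters) alongside the chat messages, and the model is expected to reply with a single call. This is an illustrative sketch only; `get_weather` and its `city` parameter are made-up examples, not anything from this thread, and in the real OpenAI API the `arguments` field comes back as a JSON-encoded string rather than a nested object:

```
// function definitions sent with the request
{
  "functions": [
    {
      "name": "get_weather",
      "description": "Get the current weather for a city",
      "parameters": {
        "type": "object",
        "properties": {
          "city": { "type": "string" }
        },
        "required": ["city"]
      }
    }
  ]
}

// desired model output: one call, nothing else
{ "name": "get_weather", "arguments": { "city": "Paris" } }
```

Putting the same schema-style definitions in your system prompt, plus an explicit instruction like "respond with exactly one JSON object and no other text", tends to help smaller models stay on format, since that's the shape of the synthetic data many of them saw.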

Any guidance is appreciated 👍

[–] 1EvilSexyGenius@alien.top 1 points 1 year ago

Not a bad idea but I'm not coding in Python.

Because I hate myself, I'm writing in C# 🫠

Also, I want to use as few libraries as possible.

As a last resort I may use LangChain, or just read its source to see how it forces a model into function calling, if that's even possible.
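On actually *forcing* the output: if the model is served through llama.cpp (or a wrapper around it), grammar-constrained sampling with a GBNF grammar can restrict generation to valid JSON whose command name comes only from your whitelist, with no extra client-side library, which would also work from C#. A rough sketch, assuming hypothetical command names (`search`, `open_file`, `write_file`) standing in for your real command list:

```
root ::= "{\"command\": " cmd ", \"args\": " args "}"
cmd  ::= "\"search\"" | "\"open_file\"" | "\"write_file\""
args ::= "{" [^}]* "}"
```

With a grammar like this, "made-up commands" and multi-command responses become impossible at the sampler level rather than something the prompt has to talk the model out of; validating the command name is a grammar rule, not a hope.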