this post was submitted on 16 Nov 2023

LocalLLaMA


Community to discuss about Llama, the family of large language models created by Meta AI.

founded 10 months ago

Here's how I managed to bootstrap generalized Tree-of-Thought capability in my AIs.

This was the secret sauce to SynthIA.

Generate your dataset with this, plus the Orca system prompts.
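The generation loop the post describes can be sketched as follows. This is a minimal, hypothetical sketch, not the author's actual pipeline: the real Tree-of-Thought system prompt is the one in the screenshot below (not reproduced here, so a placeholder stands in), `generate` is any callable wrapping your teacher model (e.g. the GPT-4 API), and the JSONL field names are assumptions.

```python
import json

# Placeholder only -- the actual Tree-of-Thought system prompt is in the
# screenshot linked in the post and is not reproduced here.
TOT_SYSTEM_PROMPT = "<Tree-of-Thought system prompt from the screenshot>"

def build_records(user_prompts, generate):
    """Pair each user prompt with the ToT system prompt and a completion.

    `generate` is any callable taking (system, user) and returning the
    teacher model's response, e.g. a thin wrapper around the GPT-4 API.
    """
    records = []
    for prompt in user_prompts:
        response = generate(TOT_SYSTEM_PROMPT, prompt)
        records.append({
            "system": TOT_SYSTEM_PROMPT,   # field names are assumptions
            "instruction": prompt,
            "output": response,
        })
    return records

def write_jsonl(records, path):
    """Dump one JSON record per line, the usual fine-tuning dataset format."""
    with open(path, "w") as f:
        for rec in records:
            f.write(json.dumps(rec) + "\n")

if __name__ == "__main__":
    # Stub generator in place of a real API call, just to show the shape.
    prompts = ["Why is the sky blue?", "Explain binary search."]
    stub = lambda system, user: f"[reasoned answer to: {user}]"
    write_jsonl(build_records(prompts, stub), "tot_dataset.jsonl")
```

Swapping in the Orca system prompts alongside the ToT one, as the post suggests, just means varying the `system` field per record.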

Open Source FTW. LFG!

https://preview.redd.it/45uyzynlen0c1.png?width=1744&format=png&auto=webp&s=694e69603c0656efbbea9a9e8b18d02a10c8633e

[–] YourTechBud@alien.top 1 points 10 months ago

This seems really exciting. I'm kinda new to this, so sorry for asking such a noob question.

Is this a system message for the SynthIA model, or is this a prompt for GPT-4 to generate the dataset? If the latter, how do you generate a "generalized" dataset? Is it by passing in different user prompts? And if so, how do you decide which user prompts to provide?

And is it right that you then use that dataset to fine-tune the model? So you could build a dataset in a particular domain to improve the resulting model's reasoning capability in that domain?