Here’s how I managed to bootstrap generalized Tree-of-Thought capability in my AIs.
This was the secret sauce to SynthIA.
Generate your dataset with this, plus the Orca system prompts.
Open Source FTW. LFG!
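For anyone who wants to try this, here is a rough sketch of what the generation loop could look like. The prompt strings, the record format, and the `complete_fn` hook below are placeholders I made up for illustration, not the author's actual setup; substitute the real ToT prompt and Orca system prompts from the post.

```python
import json
import random

# Placeholder Tree-of-Thought system prompt -- swap in the actual prompt
# shared in the post. The Orca-style prompts here are paraphrased examples.
TOT_SYSTEM_PROMPT = (
    "Elaborate on the topic using a Tree of Thoughts: backtrack and explore "
    "alternative branches as needed to construct a clear, cohesive answer."
)
ORCA_STYLE_PROMPTS = [
    "You are a helpful assistant. Think step by step and justify your answer.",
    "First give a short answer, then explain the detailed reasoning behind it.",
]

def generate_record(user_prompt, complete_fn, rng=random):
    """Pair a user prompt with either the ToT prompt or an Orca-style
    system prompt, call a pluggable completion function (your teacher
    model), and return one instruction-tuning record."""
    system = rng.choice([TOT_SYSTEM_PROMPT] + ORCA_STYLE_PROMPTS)
    return {
        "system": system,
        "instruction": user_prompt,
        "response": complete_fn(system, user_prompt),
    }

def build_dataset(user_prompts, complete_fn, path="dataset.jsonl", rng=random):
    """Write one JSONL record per user prompt."""
    with open(path, "w") as f:
        for prompt in user_prompts:
            f.write(json.dumps(generate_record(prompt, complete_fn, rng)) + "\n")
```

`complete_fn` would wrap whichever teacher model you use (GPT-4 or otherwise); keeping it as a parameter means the sketch runs without any API keys, and the resulting JSONL can feed a standard fine-tuning pipeline.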
Can someone explain what is meant by this?

> Generate your dataset with this, plus the Orca system prompts.
What model was used with that prompt to bootstrap the data for a training set? Did you then take all that data and use it to fine-tune the same model that bootstrapped the initial dataset?
Aaaah, you are the guy who proposed HelixNet; the names sometimes blur on Reddit. But reading your prompt, it is clear you are very smart and not afraid to explore new approaches.
I know my praise seems generic and there are a lot of great people around, but I really think you are among the top people.
Did you use GPT-4 to generate the dataset, or other models?
Do you have an alternative version for chain-of-thought?
This seems really exciting. I’m kinda new to this, so sorry for asking such a noob question.
Is this a system message for the SynthIA model, or is this a prompt for GPT-4 to generate the dataset? If it's the latter, how do you generate a "generalized" dataset? Is it by passing in different user prompts? And if so, how do you decide which user prompts to provide?
And is it right that you then use that dataset to fine-tune the model? So you could build a dataset in a particular domain to improve the resulting model's in-domain reasoning capability?