ithkuil@alien.top to LocalLLaMA@poweruser.forum • Is LLaMA-1-65B or LLaMA-2-70B more creative at storytelling?
Same answer actually, for creative writing.
I think it’s best to keep the temperature low and feed the randomness in manually. For example, generate a list of words, ask it to make an association for each, and use those as inspiration for characters, plot, whatever. Or make a list of options for each element where that makes sense, have a model generate Python code to randomly select one of each, and then put those random selections in the prompt (sketch below).
Because in my experience, setting the temperature much higher than zero mostly just makes the model dumber.
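Here’s a minimal sketch of that second approach. The categories and options are just placeholders; swap in whatever makes sense for your story:

```python
import random

# Hypothetical option lists -- substitute your own categories and values.
options = {
    "protagonist": ["a retired cartographer", "a smuggler", "a court musician"],
    "setting": ["a flooded city", "a generation ship", "a border town in winter"],
    "conflict": ["a forged will", "a vanished sibling", "a debt that can't be paid"],
}

# Pick one option per category; the randomness lives here instead of in the
# model's sampling temperature.
choices = {category: random.choice(values) for category, values in options.items()}

prompt = (
    "Write a short story outline using these elements:\n"
    + "\n".join(f"- {category}: {value}" for category, value in choices.items())
)

print(prompt)  # paste this into the model, keeping the temperature low
```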
Are they working on Mixture of Experts?
https://github.com/XueFuzhao/OpenMoE
Mixture of Experts may help get more performance out of smaller models.
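The idea is that an MoE layer routes each token through only a few "expert" sub-networks, so most parameters sit idle for any given token and compute stays close to that of a smaller dense model. A rough, illustrative top-k routing sketch in PyTorch (class names and sizes are made up here, not OpenMoE's actual code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMoELayer(nn.Module):
    """Illustrative top-k mixture-of-experts layer (not OpenMoE's implementation)."""

    def __init__(self, d_model=64, n_experts=4, top_k=2):
        super().__init__()
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        )
        self.router = nn.Linear(d_model, n_experts)  # scores each expert per token
        self.top_k = top_k

    def forward(self, x):                        # x: (tokens, d_model)
        scores = self.router(x)                  # (tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)     # normalize over the chosen experts
        out = torch.zeros_like(x)
        # Each token only runs through its top-k experts, so per-token compute
        # scales with top_k rather than with the total number of experts.
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

layer = TinyMoELayer()
print(layer(torch.randn(8, 64)).shape)  # torch.Size([8, 64])
```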
Is it possible to do this in a way that allows the model to choose whether to write normal text or to call one or more functions?