  • It’s definitely not useless, it just doesn’t follow instructions as literally as big models do. Instead of asking it to write eloquently, deliver the instructions in the style and context of what you want, and it’ll convey the meaning much better.

    Bad: “Write like you’re in a wild west adventure.”

    Good: “Yee-haw, partner! Saddle up for a rip-roarin’, gunslingin’ escapade through the untamed heart of the Wild West!”

    You also can’t force it to do whatever you want; that depends on the training data. If I want it to output a short story, requesting a scenario or a book summary instead will sometimes give wildly different results.

    I think prompt engineering is really only needed for smaller models (rough sketch of the bad-vs-good idea below).
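
    Just to make it concrete, here’s a rough sketch assuming llama-cpp-python and some small local GGUF model; the model path, prompts, and sampling settings are placeholders, not anything specific:

    ```python
    # Rough sketch, not a benchmark: compare the two prompting styles on a
    # small local model via llama-cpp-python. The model path is a placeholder.
    from llama_cpp import Llama

    llm = Llama(model_path="models/small-model.Q4_K_M.gguf")  # hypothetical file

    # Naming the style (small models often ignore this):
    bad = "Write like you're in a wild west adventure. Describe the saloon."

    # Demonstrating the style in the prompt itself (the tone tends to carry over):
    good = ("Yee-haw, partner! Saddle up for a rip-roarin', gunslingin' escapade "
            "through the untamed heart of the Wild West! Describe the saloon.")

    for prompt in (bad, good):
        out = llm(prompt, max_tokens=128, temperature=0.8)
        print(out["choices"][0]["text"].strip(), "\n---")
    ```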