• arekku255@alien.top · 1 year ago

      At a high level, merging lets you get a lot of training done cheaply.

      If you have one model finetuned on set A and another finetuned on set B, merging them lets you very cheaply produce a model that behaves as if it had been trained on both set A and set B — no extra training runs required.
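      A minimal sketch of the simplest merge strategy — linear weight averaging, as in "model soups" — assuming both checkpoints share the same architecture. The function name and the toy scalar "weights" are illustrative; in practice the values would be tensors loaded from two finetuned checkpoints:

      ```python
      def merge_state_dicts(sd_a, sd_b, alpha=0.5):
          """Linearly interpolate two checkpoints with identical keys and shapes.

          alpha=0.5 is a plain average; other values bias toward one finetune.
          """
          assert sd_a.keys() == sd_b.keys(), "models must share an architecture"
          return {k: alpha * sd_a[k] + (1 - alpha) * sd_b[k] for k in sd_a}

      # Toy "weights": stand-ins for tensors from finetunes on set A and set B.
      model_a = {"layer.weight": 1.0, "layer.bias": 0.0}
      model_b = {"layer.weight": 3.0, "layer.bias": 2.0}
      merged = merge_state_dicts(model_a, model_b, alpha=0.5)
      ```

      Real merging tools (e.g. mergekit) offer fancier schemes like SLERP or TIES, but they build on this same idea of combining parameters element-wise.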

      It is the go-to answer to “I can’t afford to finetune a model”.