I keep seeing people posting about how the new Phind is the most amazing thing on the planet, and I kept thinking “We already have Phind… see, I have the gguf right here!”

I finally looked at the Phind blog post about their newest model, and it says their current model is v7.

https://www.phind.com/blog/phind-model-beats-gpt4-fast

:O Hugging Face only goes up to v2.

I can’t tell whether Phind is a proprietary model whose makers just happened to give us an older version, whether newer versions will be coming out, or what. Does anyone happen to know?

  • m98789@alien.topB
    1 year ago

    V7 is proprietary.

    This is the current business model / strategy:

    • Release an open model that performs well
    • Let the community embrace it and get hype
    • Find investors and sell the story of being the next OpenAI
    • Get huge valuation and raise a ton of cash
    • Go closed source due to “competitive reasons”
    • Cash out some founder equity on next round with “dumb money” investors (pyramid scheme)
    • Party!
    • SomeOddCodeGuy@alien.topOPB
      1 year ago

      Ok, this makes sense.

      I think the thing that had confused me for a long time (and maybe I was the only one) was that everyone kept saying “I prefer Phind!” for coding, and I legitimately thought they meant the v2 on Hugging Face.

      I always had better luck with CodeFuse-CodeLlama and couldn’t figure out why no one ever talked about it and everyone just talked about Phind. Now I realize I was off in left field using v2 all by myself while everyone else was on a website using Phind v6 lol

      I feel pretty silly now =D

    • peterwu00@alien.topB
      1 year ago

      I have a question regarding the Meta license clause below. Does it imply that we can use the open-source Llama 2 model as a foundation model, train it on additional data, and keep the retrained model proprietary?

      “v. You will not use the Llama Materials or any output or results of the Llama Materials to improve any other large language model (excluding Llama 2 or derivative works thereof).”