Basically, that’s my question. The caveat is that I’d like to avoid a Mac Mini, and I’m wondering whether any of Minisforum’s mini PCs can handle running LLMs.

  • Scary-Knowledgable@alien.topB · 1 year ago

    “Access to powerful, open-source LLMs has also inspired a community devoted to refining the accuracy of these models, as well as reducing the computation required to run them. This vibrant community is active on the Hugging Face Open LLM Leaderboard, which is updated often with the latest top-performing models.”

    That’s a nice indirect shout-out.