Title says it all. Why spend so much effort finetuning and serving models locally when any closed-source model will do the same for cheaper in the long run? Is it a philosophical argument (as in freedom vs. free beer), or are there practical cases where a local model does better?

Where I'm coming from: I need a copilot, primarily for code but maybe for automating personal tasks as well, and I'm wondering whether to put down the $20/mo for GPT-4 or roll my own personal assistant and run it locally (I have an M2 Max, so compute wouldn't be a huge issue).

  • Only-Letterhead-3411@alien.top · 1 year ago
    • Local AI belongs to you; GPT-4 doesn't. You are simply buying permission to use it for a limited time, and the AI company can take it away from you at any time, for any reason they like. You can only lose your local AI if someone physically removes it from your PC and you can no longer download it.
    • GPT-4 is censored and biased. Local AI has uncensored options.
    • AI companies can monitor, log, and use your data to train their AI. With local AI, you own your privacy.
    • GPT-4 requires an internet connection; local AI doesn't.
    • GPT-4 is subscription-based and costs money to use. Local AI is free to use.
    • allinasecond@alien.top · 1 year ago

      Are there any good tutorials on where to start? I'm a FW engineer with an M1 MacBook; I don't know much about AI or LLMs.

      • sarl__cagan@alien.top · 1 year ago

        If you're comfortable just using the command line, ollama is great and easy to use.
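For reference, the typical ollama workflow on a Mac is only a few commands. This is a sketch, not a full walkthrough; it assumes Homebrew is installed and that the model name is available in the ollama library:

```shell
# Install ollama (macOS); alternatively download the app from ollama.com
brew install ollama

# Fetch a small quantized model (a 7B model is roughly 4 GB on disk)
ollama pull mistral

# Start an interactive chat session in the terminal
ollama run mistral
```

These commands require ollama and a network connection to pull the model, so they are shown as a usage sketch rather than a runnable script.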

        Otherwise, you could download the LMStudio app on your Mac, download a model using the search feature, and start chatting. Models from TheBloke are good. You will probably need to try a few models (GGML format, most likely). Mistral 7B or Llama 2 7B is a good starting place, IMO.
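LMStudio can also expose a local OpenAI-compatible server (check the app's local server tab for the exact port; `localhost:1234` and the model name below are assumptions for illustration). A minimal sketch of building a chat request for such an endpoint:

```python
import json

def build_chat_request(model: str, prompt: str, temperature: float = 0.7) -> dict:
    """Construct an OpenAI-style chat-completion request body.

    The same JSON shape works against any OpenAI-compatible local server,
    which is what makes local apps like this a drop-in for scripted use.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

# Hypothetical model name -- use whatever model you loaded in the app.
payload = build_chat_request("mistral-7b-instruct", "Explain list comprehensions.")
body = json.dumps(payload)
# POST this body to e.g. http://localhost:1234/v1/chat/completions
```

Only the request construction runs here; actually sending it requires the local server to be up.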

      • jarec707@alien.top · 1 year ago

        GPT4all may be the easiest on-ramp for your Mac. 7B models run fine on an 8 GB system, although they take up much of the memory.
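The "7B fits in 8 GB" claim can be sanity-checked with back-of-the-envelope arithmetic: quantized weights take roughly (parameter count × bits per weight / 8) bytes, plus some overhead for the runtime and KV cache. The overhead figure below is a rough assumption, not a measurement:

```python
def model_memory_gib(params_billions: float,
                     bits_per_weight: float,
                     overhead_gib: float = 1.0) -> float:
    """Approximate RAM (GiB) to run a quantized model.

    params_billions: parameter count in billions (7 for a 7B model)
    bits_per_weight: effective bits after quantization (~4-5 for Q4 formats)
    overhead_gib:    assumed flat allowance for context/KV cache and runtime
    """
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes / (1024 ** 3) + overhead_gib

# A 7B model at ~4.5 effective bits per weight:
estimate = model_memory_gib(7, 4.5)
print(f"{estimate:.1f} GiB")  # roughly 4-5 GiB -- tight but workable in 8 GB
```

At 16-bit (unquantized) the same model would need ~13 GiB for weights alone, which is why quantized formats are the practical choice on an 8 GB machine.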