Hello,

With the new GPT-4 and its 128k context window, my question is: is there still an advantage to running a local LLM?

  • Cost Considerations: Is there a cost benefit to running a local LLM compared to using OpenAI's API?

  • Specialized Training: Is it possible to train the model for specific tasks, similar to ‘Code Llama - Python’, but perhaps for areas like ‘Code Llama - Unreal Engine’? (A rough sketch of what I mean follows below.)
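
To make that second point concrete, here is roughly what I imagine such specialized training would look like: a minimal LoRA fine-tuning sketch using Hugging Face transformers/peft on top of Code Llama. The dataset file, its field name, and all hyperparameters are placeholders I made up to illustrate the idea, not a recipe I've verified.

```python
# Hypothetical sketch: LoRA fine-tuning of Code Llama on a domain dataset
# (e.g. Unreal Engine C++ snippets). File name, field name, and
# hyperparameters are illustrative placeholders.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

base = "codellama/CodeLlama-7b-Python-hf"
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token  # Llama tokenizers have no pad token
model = AutoModelForCausalLM.from_pretrained(base, device_map="auto")

# Wrap the base model with small trainable LoRA adapters instead of
# updating all 7B parameters.
model = get_peft_model(
    model,
    LoraConfig(r=16, lora_alpha=32,
               target_modules=["q_proj", "v_proj"],
               task_type="CAUSAL_LM"),
)

# "unreal_snippets.jsonl" with a "text" field is a made-up dataset.
data = load_dataset("json", data_files="unreal_snippets.jsonl")["train"]
data = data.map(
    lambda x: tokenizer(x["text"], truncation=True, max_length=1024),
    remove_columns=data.column_names,
)

Trainer(
    model=model,
    args=TrainingArguments(output_dir="codellama-unreal-lora",
                           per_device_train_batch_size=1,
                           gradient_accumulation_steps=8,
                           num_train_epochs=1,
                           learning_rate=2e-4),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
).train()
```

The appeal of LoRA here, as I understand it, is that only the small adapter matrices are trained, so something like this could fit on a single consumer GPU rather than requiring a full retrain of the 7B model.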

I understand that for some applications, avoiding the content restrictions of OpenAI might be a plus. However, when it comes to using an LLM as a coding assistant, are there any specific advantages to running it locally?

  • krazzmann@alien.top · 1 year ago

    Azure lets you opt out of content logging for its OpenAI services; you have to apply for that option. This is what most larger companies, including my employer, are doing.