Ok-Goal@alien.top to LocalLLaMA@poweruser.forum • What’s recommended hosting for open source LLMs?
1 year ago
In our internal lab, we’re using https://ollama.ai/ with https://github.com/ollama-webui/ollama-webui to host LLMs locally; the docker compose setup provided by the ollama-webui team worked like a charm for us.
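In case it helps, here’s a minimal sketch of what such a compose file could look like. The image tags, ports, volume names, and the `OLLAMA_API_BASE_URL` variable are assumptions based on the two projects’ docs, not the exact file the ollama-webui team ships — check their repo for the real one:

```yaml
# Hypothetical docker-compose sketch for Ollama + ollama-webui.
# Image tags, ports, and env var names are assumptions; see each
# project's README for the file they actually provide.
services:
  ollama:
    image: ollama/ollama            # official Ollama image
    volumes:
      - ollama-data:/root/.ollama   # persist pulled models across restarts
    ports:
      - "11434:11434"               # Ollama's default API port

  ollama-webui:
    image: ghcr.io/ollama-webui/ollama-webui:main
    depends_on:
      - ollama
    environment:
      # point the UI at the ollama service on the compose network
      - OLLAMA_API_BASE_URL=http://ollama:11434/api
    ports:
      - "3000:8080"                 # browse the UI at http://localhost:3000
    restart: unless-stopped

volumes:
  ollama-data:
```

With something like this, `docker compose up -d` brings both services up, and you can pull models either from the web UI or with `docker compose exec ollama ollama pull <model>`.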
At our lab, we’re on the latest version of ollama-webui, and it already seems to have OpenAI API support, among many other new features (and an updated UI, which imo is a lot better). You might want to update to the latest version!