apepkuss@alien.top to LocalLLaMA@poweruser.forum • What’s recommended hosting for open source LLMs? • English • 10 months ago
WebAssembly-based open source LLM inference (API service and local hosting): https://github.com/second-state/llama-utils