Void_0000@alien.top to LocalLLaMA@poweruser.forum • A non-profit to develop OS LLMs was announced, live from ai-pulse
So, how long until they sell out to Microsoft? ^(/s)
How hard can it be?
Seriously though, what makes it require more VRAM than regular inference? You’re still loading the same model, aren’t you?
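If the question is about fine-tuning rather than plain inference, a rough back-of-envelope sketch shows why the footprint balloons: inference only keeps the weights resident, while training also has to hold gradients, optimizer states, and activations. This assumes full fine-tuning with Adam in mixed precision; the 7B parameter count is just an example, and activation memory is left out entirely:

```python
# Rough VRAM estimate: inference vs. full fine-tuning with Adam
# (hypothetical 7B-parameter model; activation memory ignored for simplicity).

def inference_gib(n_params: float, bytes_per_weight: int = 2) -> float:
    """fp16 weights only."""
    return n_params * bytes_per_weight / 2**30

def training_gib(n_params: float) -> float:
    """fp16 weights + fp16 grads + fp32 master weights + Adam moment estimates."""
    bytes_per_param = 2 + 2 + 4 + 4 + 4   # weights, grads, master copy, m, v
    return n_params * bytes_per_param / 2**30

n = 7e9
print(f"inference: ~{inference_gib(n):.0f} GiB")   # ~13 GiB
print(f"training:  ~{training_gib(n):.0f} GiB")    # ~104 GiB, before activations
```

Same model, but the optimizer bookkeeping alone is several times the size of the weights, which is why "regular inference" fits where training doesn't.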
I self-hosted SearXNG, but after I was done I realised that it defeats most of the privacy benefit: if I'm the only one using it, then I might as well just be using the search engines themselves directly.
So now I also have firefox running in a docker container, searching random junk on searxng every couple of minutes.
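A minimal sketch of that noise-traffic idea, as a looping script instead of a whole browser container (the instance URL, word list, and interval here are placeholders, not my actual setup):

```python
# Hypothetical noise generator: fires a random query at a SearXNG
# instance every few minutes so real searches blend into junk traffic.
import random
import time
import urllib.parse
import urllib.request

SEARX_URL = "http://localhost:8080/search"      # placeholder instance URL
WORDS = ["teapot", "gradient", "lighthouse", "quartz", "marathon", "orbit"]

while True:
    query = " ".join(random.sample(WORDS, k=2))
    url = f"{SEARX_URL}?{urllib.parse.urlencode({'q': query})}"
    try:
        urllib.request.urlopen(url, timeout=10).read()
    except OSError:
        pass                                     # ignore network hiccups
    time.sleep(random.uniform(120, 300))         # every couple of minutes
```

The Firefox-in-Docker approach has the advantage of looking like real browser traffic; a bare script like this only pads the query log on the upstream engines.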