ExLlamaV2: The Fastest Library to Run LLMs (towardsdatascience.com)
alchemist1e9@alien.top to LocalLLaMA@poweruser.forum · English · 1 year ago
ModeradorDoFariaLima@alien.top · 1 year ago
Too bad that Windows support for it was lacking (at least, the last time I checked). It needs a separate dependency to work properly, and that dependency was Linux-only.
ViennaFox@alien.top · 1 year ago
It works fine for me. I am also using a 3090 and text-gen-webui, like Liquiddandruff.
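For anyone who wants to try the library directly rather than through text-generation-webui, here is a minimal sketch of loading a quantized EXL2 model with the exllamav2 Python package. The model path and sampling values are placeholders, and the class and method names follow the project's example scripts from around the time of this thread, so check the current exllamav2 examples if the API has since changed.

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Placeholder path to a directory containing an EXL2-quantized model
model_dir = "/models/my-model-exl2"

config = ExLlamaV2Config()
config.model_dir = model_dir
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # cache is allocated as layers load
model.load_autosplit(cache)               # splits weights across available GPUs (works on a single 3090 too)

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

# Basic sampling settings; tune to taste
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
settings.top_p = 0.9

output = generator.generate_simple("ExLlamaV2 is", settings, 128)
print(output)
```

This is roughly what text-generation-webui does under the hood when the ExLlamaV2 loader is selected, so if the snippet runs, the webui path should work as well.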