parasocks@alien.top to LocalLLaMA@poweruser.forum · 1 year ago
Anyone spend a bunch of $$ on a computer for LLM and regret it?
(22 comments)
iwishilistened@alien.top · 1 year ago
I was building an app and then realized it was cheaper to just call the inference API for Llama on Azure, lol. Put my local Llama on hold for now.