paradigm11235@alien.top to LocalLLaMA@poweruser.forum · 1 year ago
When training an LLM how do you decide to use a 7b, 30b, 120b, etc model (assuming you can run them all)?
I’m glad I goofed in my question, because your response was super helpful, but I now realize I was missing the terminology when I posted. I was talking about fine-tuning an existing model with a specific goal in mind (re: poetry).
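For what it's worth, here is a minimal sketch of what that kind of goal-specific fine-tune can look like using Hugging Face transformers with a LoRA adapter via peft, which is the usual way to fine-tune a 7B-class model on a single consumer GPU since the frozen base weights never need gradients. The base model name, the poems.txt corpus path, and all hyperparameters below are placeholder assumptions, not anything from this thread:

```python
# Sketch: LoRA fine-tune of an existing causal LM on a small poetry corpus.
# Assumes: pip install torch transformers datasets peft
import torch
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)
from peft import LoraConfig, get_peft_model

BASE = "mistralai/Mistral-7B-v0.1"  # placeholder: any causal LM base works
tokenizer = AutoTokenizer.from_pretrained(BASE)
tokenizer.pad_token = tokenizer.eos_token  # causal LMs often ship without one

model = AutoModelForCausalLM.from_pretrained(BASE, torch_dtype=torch.bfloat16)

# LoRA freezes the base weights and trains small low-rank adapter matrices,
# so only a fraction of a percent of the parameters ever get gradients.
lora = LoraConfig(r=16, lora_alpha=32, lora_dropout=0.05,
                  target_modules=["q_proj", "v_proj"], task_type="CAUSAL_LM")
model = get_peft_model(model, lora)

# poems.txt is a hypothetical one-example-per-line text file of poetry.
dataset = load_dataset("text", data_files="poems.txt")["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

dataset = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="poetry-lora", num_train_epochs=3,
                           per_device_train_batch_size=1,
                           gradient_accumulation_steps=8,
                           learning_rate=2e-4, logging_steps=10),
    train_dataset=dataset,
    # mlm=False gives standard next-token (causal LM) labels.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("poetry-lora")  # saves only the small adapter weights
```

The saved adapter is a few hundred megabytes rather than a full model copy, and you can load it back onto the same base with `peft.PeftModel.from_pretrained` at inference time.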