perlthoughts@alien.top to LocalLLaMA@poweruser.forum · English · 1 year ago
New Model: openchat 3.5 with 16k context (huggingface.co)
rkzed@alien.top · 1 year ago
I'm confused by their prompt format. Do we really need to use their library to try the model?
perlthoughts@alien.top (OP) · 1 year ago
Nah, you can use llama.cpp or whatever you like. TheBloke already has multiple GGUF versions up.
involviert@alien.top · 1 year ago
They were talking about the prompt format. Their library presumably translates the OpenAI API-style messages into the actual prompt format internally, and that format is not documented at all.
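For what the comment above describes, here is a minimal sketch of flattening an OpenAI-style message list into a single prompt string. The "GPT4 Correct User"/"GPT4 Correct Assistant" markers and the `<|end_of_turn|>` token are taken from the model's Hugging Face card; verify them against the card or the tokenizer's chat template before relying on them, since the library's internal behavior is what's undocumented here.

```python
# Sketch: flatten OpenAI-style chat messages into OpenChat 3.5's prompt
# template. Template markers assumed from the Hugging Face model card.

def openchat_prompt(messages):
    """messages: list of {"role": "user"|"assistant", "content": str}."""
    role_map = {
        "user": "GPT4 Correct User",
        "assistant": "GPT4 Correct Assistant",
    }
    parts = []
    for m in messages:
        parts.append(f"{role_map[m['role']]}: {m['content']}<|end_of_turn|>")
    # Trailing assistant header cues the model to generate its reply.
    parts.append("GPT4 Correct Assistant:")
    return "".join(parts)

print(openchat_prompt([{"role": "user", "content": "Hello"}]))
# -> GPT4 Correct User: Hello<|end_of_turn|>GPT4 Correct Assistant:
```

A string built this way can be passed as the raw prompt to llama.cpp against one of the GGUF builds mentioned above.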
Dear_noobs@alien.top · 1 year ago
I came across this yesterday: one interface that lets you jump between all of them. Find the model you want to try, click Download, then chat with it…