Mbando@alien.top to LocalLLaMA@poweruser.forum • Running full Falcon-180B under budget constraint
1 · 1 year ago

Super interested in this; really exciting!
We’ve been fine-tuning models for specific applications like RAG and structured data extraction. Falcon-7B has been really good for training: it both shifts to understand the target domain’s use of language from the training data and picks up instructions really well. Going to try Mistral-7B soon for a comparison.
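For anyone curious what that kind of fine-tune looks like in practice, here’s a rough LoRA sketch with Hugging Face transformers + peft. To be clear, this isn’t our exact setup: the dataset file (`domain_corpus.jsonl`), hyperparameters, and output directory are all placeholders.

```python
import torch
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from peft import LoraConfig, get_peft_model

model_name = "tiiuae/falcon-7b"  # the base model discussed above
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token

model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# LoRA freezes the base weights and trains small adapter matrices,
# which is what makes a 7B fine-tune feasible on a single GPU.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["query_key_value"],  # Falcon's fused attention projection
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)

# "domain_corpus.jsonl" is a placeholder for your in-domain training text,
# one {"text": ...} record per line.
dataset = load_dataset("json", data_files="domain_corpus.jsonl", split="train")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=1024)

dataset = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="falcon7b-domain-lora",
        per_device_train_batch_size=2,
        gradient_accumulation_steps=8,
        num_train_epochs=3,
        learning_rate=2e-4,
        logging_steps=10,
    ),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```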
Fine-tuned Falcon-7B is pretty powerful. Within domain, and in a RAG stack, it outperforms GPT-3.5.
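And a bare-bones version of the kind of RAG stack I mean: embed in-domain passages, retrieve the nearest ones for a query, and prompt the model with them. The embedder, corpus, and query here are illustrative, not what we actually run.

```python
import numpy as np
from sentence_transformers import SentenceTransformer
from transformers import pipeline

# Placeholder in-domain passages; in practice these come from your document store.
docs = [
    "Domain passage one ...",
    "Domain passage two ...",
]

# Any sentence-embedding model works here; this one is just small and common.
embedder = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
doc_vecs = embedder.encode(docs, normalize_embeddings=True)

def retrieve(query, k=2):
    # With normalized vectors, cosine similarity is just a dot product.
    q = embedder.encode([query], normalize_embeddings=True)[0]
    top = np.argsort(doc_vecs @ q)[::-1][:k]
    return [docs[i] for i in top]

# Swap in your fine-tuned checkpoint; the base model is shown as a stand-in.
generator = pipeline("text-generation", model="tiiuae/falcon-7b")

query = "What does the policy say about data retention?"
context = "\n".join(retrieve(query))
prompt = (
    "Answer using only the context below.\n\n"
    f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
)
print(generator(prompt, max_new_tokens=200)[0]["generated_text"])
```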