I’m pretty new to this entire field of LLMs. I’ve played around with a few of the models in the oobabooga UI and have been eyeing some of the other GUI options on GitHub as well.
Recently, I’ve stumbled upon a lot of terms like “LangChain” or “RAG” that seem super interesting. As far as I understand, you can ingest data (text, files, etc.) into one of your LLMs to analyze it, summarize it, and so on. However, I’m not quite sure how to do that. Would this be possible inside the oobabooga UI (which I’ve liked the most up until now)? Are there some resources/projects you could point me towards? And how does all of that play with the limited context window of a local LLM?
If you just want to try it out, install privateGPT on your local PC/Mac; no GPU required.
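To answer the context-window question: RAG doesn’t put your whole corpus into the model. It retrieves only the chunks relevant to your question and stuffs just those into the prompt. Here’s a minimal stdlib-only sketch of that idea — real pipelines (LangChain, privateGPT) use embedding models and a vector store instead of the toy bag-of-words similarity used here, and all the document text is made up for illustration:

```python
# Toy sketch of the RAG (retrieval-augmented generation) control flow:
# 1) split documents into chunks,
# 2) retrieve the chunks most similar to the question,
# 3) build a prompt containing only those chunks,
# so the limited context window holds relevant text, not the whole corpus.
# Similarity here is plain bag-of-words cosine; real systems use embeddings.
import math
import re
from collections import Counter

def tokenize(text: str) -> Counter:
    """Lowercase word counts; stands in for a real embedding."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question: str, chunks: list[str], k: int = 2) -> list[str]:
    """Return the k chunks most similar to the question."""
    q = tokenize(question)
    return sorted(chunks, key=lambda c: cosine(q, tokenize(c)), reverse=True)[:k]

def build_prompt(question: str, chunks: list[str]) -> str:
    """Stuff only the retrieved chunks into the prompt."""
    context = "\n".join(retrieve(question, chunks))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

# Hypothetical example documents:
chunks = [
    "The quarterly report shows revenue grew 12 percent year over year.",
    "Our office cafeteria serves lunch from noon until two.",
    "Revenue growth was driven mainly by the new subscription tier.",
]
print(build_prompt("Why did revenue grow?", chunks))
```

The prompt that comes out contains only the two revenue-related chunks; the cafeteria chunk never reaches the model. That’s the whole trick for working around a small context window — the retriever decides what the LLM gets to see.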