I’m still learning about LLMs. I’ve seen that GPT-4 can read websites or uploaded files. Is there a way to tell a local LLM, via the usual frontends, to read a file and learn from it so it can give an informed response, or summarize it, for example?
Also, can we feed it a website to get information from?
I have 2x3090s on the way so I can load bigger models with large contexts, as I guess that would be necessary to read files.
Have you tried superbooga? It's an extension for text-generation-webui; you have to enable it in the webui's extensions settings.
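To give a sense of what extensions like superbooga do under the hood, here's a minimal sketch of the retrieval idea: split the file into chunks, score each chunk against the question, and prepend the best matches to the prompt before it goes to the model. This is an illustrative simplification, not superbooga's actual code; real extensions use vector embeddings and a database like ChromaDB instead of the naive word-overlap scoring used here, and the document text is made up for the example.

```python
def chunk_text(text, size=200):
    """Split text into chunks of roughly `size` characters, on word boundaries."""
    chunks, current, length = [], [], 0
    for word in text.split():
        current.append(word)
        length += len(word) + 1
        if length >= size:
            chunks.append(" ".join(current))
            current, length = [], 0
    if current:
        chunks.append(" ".join(current))
    return chunks

def retrieve(chunks, question, k=2):
    """Rank chunks by shared words with the question (a stand-in for
    embedding similarity in a real retrieval extension)."""
    q_words = set(question.lower().split())
    ranked = sorted(chunks,
                    key=lambda c: len(q_words & set(c.lower().split())),
                    reverse=True)
    return ranked[:k]

def build_prompt(context_chunks, question):
    """Prepend the retrieved chunks to the question, so the model answers
    from the document instead of from memory."""
    context = "\n".join(context_chunks)
    return (f"Use the context to answer.\n\nContext:\n{context}\n\n"
            f"Question: {question}\nAnswer:")

# Toy "document" standing in for a file you'd load from disk.
document = (
    "The 3090 has 24 GB of VRAM. Two of them give 48 GB total, enough for "
    "70B models at 4-bit quantization. Context length is limited by the "
    "model, not just VRAM, though longer contexts do use more memory."
)

question = "How much VRAM do two 3090s have?"
prompt = build_prompt(retrieve(chunk_text(document, size=120), question),
                      question)
print(prompt)
```

The same pattern works for a website: fetch the page, strip it to plain text, and feed that text through the same chunk/retrieve/prompt steps.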