Tell ChatGPT you want to use Flask with the OpenAI API to present a chat interface on a webpage, and to give you the requisite Python and HTML code for it. You can then tinker with replacing the API with a locally hosted model; you’ll probably want something like Llama 2.
ChatGPT is not it!
I’m not saying to use ChatGPT for the final product, just to help you set up the code. Once you have something working, replacing the API with another model becomes a considerably easier problem.
Flask is a Python backend framework for browser applications, and most language models are native to Python, so if you want to display language model output in a browser, Flask is a great starting point. I was suggesting having ChatGPT write your initial code as a starting point, not as the ultimate finished product, because that would solve exactly the problem you mentioned in your initial post: displaying LLM output in HTML.
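And because the models are Python-native, swapping the hosted API for a local model is mostly a matter of replacing one function. A rough sketch using the Hugging Face transformers library (an assumption on my part; the tiny gpt2 checkpoint stands in here so it runs on modest hardware, while a Llama 2 chat checkpoint would slot in the same way but needs far more memory and license acceptance):

```python
from transformers import pipeline

# Small stand-in model for the sketch; substitute a Llama 2 chat checkpoint
# for real use (it loads the same way through pipeline()).
generator = pipeline("text-generation", model="gpt2")

def generate_reply(message):
    # Plain text continuation; proper chat models apply a chat template
    # to the conversation instead of continuing raw text.
    out = generator(message, max_new_tokens=50, do_sample=False)
    return out[0]["generated_text"]
```

Drop a function like this into the Flask app in place of the API call and the rest of the page works unchanged.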
I’ve actually done exactly this before for my own tinkering projects, so I know for sure it can be done.