this post was submitted on 26 Nov 2023
Machine Learning
Tell ChatGPT you want to use Flask with the OpenAI API to present a chat interface on a webpage, and to give you the requisite Python and HTML code for it. You can then tinker with replacing the API with locally hosted models; you'll probably need something like Llama 2.
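A minimal sketch of the kind of code such a prompt might produce (assuming the `flask` and `openai` packages are installed and an `OPENAI_API_KEY` environment variable is set; the model name and HTML are illustrative, not prescribed by the thread):

```python
# Minimal Flask chat sketch: serves a bare-bones chat page and relays
# messages to the OpenAI chat completions API. Assumes `pip install flask openai`;
# the model name and page markup are illustrative placeholders.
from flask import Flask, request, jsonify

try:
    from openai import OpenAI
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
except Exception:      # keeps the page servable even without API credentials
    client = None

app = Flask(__name__)

PAGE = """<!doctype html>
<title>Chat</title>
<div id="log"></div>
<input id="msg"><button onclick="send()">Send</button>
<script>
async function send() {
  const msg = document.getElementById('msg').value;
  const r = await fetch('/chat', {method: 'POST',
      headers: {'Content-Type': 'application/json'},
      body: JSON.stringify({message: msg})});
  const data = await r.json();
  document.getElementById('log').innerHTML +=
      '<p><b>You:</b> ' + msg + '</p><p><b>Bot:</b> ' + data.reply + '</p>';
}
</script>"""

@app.route("/")
def index():
    # Serve the chat page itself.
    return PAGE

@app.route("/chat", methods=["POST"])
def chat():
    # Relay the user's message to the chat model and return its reply.
    user_msg = request.get_json()["message"]
    if client is None:
        return jsonify({"reply": "(no API client configured)"})
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative; any chat model works here
        messages=[{"role": "user", "content": user_msg}],
    )
    return jsonify({"reply": resp.choices[0].message.content})

# To serve locally: app.run(debug=True)
```

Everything interesting happens in the `/chat` route, which is exactly the piece you would later repoint at a locally hosted model.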
ChatGPT is not it!
I'm not saying to use ChatGPT for the final product, just to help you set up the code. Once you have something working, replacing the API with another model becomes a considerably easier problem.
Flask is a Python web framework for serving browser applications, and most language-model tooling is native to Python. If you want to display language model output in a browser, Flask is a great starting point. I was suggesting having ChatGPT write your initial code as a starting point, not as the ultimate finished product, because that would solve specifically the problem you mentioned in your initial post: displaying LLM output in HTML.
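On the "replace the API with another model" step: many local runners (llama.cpp's server, Ollama, vLLM) expose an OpenAI-compatible endpoint, so a Flask app like the one described would only need a different client configuration, not a rewrite. A hedged sketch, where the URL and dummy key are assumptions about a typical local setup:

```python
# Sketch: the only change needed to point the OpenAI client at a locally
# hosted model is the client configuration. The URL below assumes a local
# OpenAI-compatible server (e.g. llama.cpp's server on its default port);
# adjust to match whatever runner you use.

def client_config(local: bool) -> dict:
    """Build keyword arguments for openai.OpenAI()."""
    if local:
        return {
            "base_url": "http://localhost:8080/v1",  # hypothetical local server
            "api_key": "not-needed",  # local servers typically ignore the key
        }
    return {}  # defaults: the hosted API plus OPENAI_API_KEY from the env

# Usage (with the openai package installed):
# client = OpenAI(**client_config(local=True))
```

The rest of the app, routes, HTML, and the `chat.completions.create(...)` call, stays the same, which is why getting a working OpenAI-backed version first makes the local swap a much smaller problem.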
I've actually done exactly this before for my own tinkering projects, so I know for sure it can be done.