this post was submitted on 26 Nov 2023

LocalLLaMA

Community to discuss about Llama, the family of large language models created by Meta AI.


So I have text-generation-webui by oobabooga running in one place, and Stable Diffusion in another tab. I'm looking for ways to expose these projects' APIs and combine them, so the setup behaves like GPT-4 with tools: the language model calls the other models' APIs whenever it needs to.
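
A minimal sketch of how the chaining could look, assuming text-generation-webui was started with its API enabled (`--api`, OpenAI-compatible endpoint) and the Stable Diffusion tab is the AUTOMATIC1111 webui started with `--api`. The ports, payload fields, and helper names (`ask_llm`, `generate_image`) are my assumptions, so adjust them to your setup:

```python
import base64
import requests

TEXTGEN_URL = "http://127.0.0.1:5000/v1/chat/completions"  # text-generation-webui (--api)
SD_URL = "http://127.0.0.1:7860/sdapi/v1/txt2img"          # AUTOMATIC1111 webui (--api)

def ask_llm(prompt: str) -> str:
    """Send a chat request to the local LLM and return its reply text."""
    resp = requests.post(
        TEXTGEN_URL,
        json={"messages": [{"role": "user", "content": prompt}], "max_tokens": 200},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

def generate_image(prompt: str, path: str = "out.png") -> str:
    """Ask Stable Diffusion for an image and save the first result to disk."""
    resp = requests.post(
        SD_URL,
        json={"prompt": prompt, "steps": 20, "width": 512, "height": 512},
        timeout=300,
    )
    resp.raise_for_status()
    with open(path, "wb") as f:
        f.write(base64.b64decode(resp.json()["images"][0]))
    return path

# Let the LLM write the image prompt, then hand it to Stable Diffusion.
image_prompt = ask_llm("Write a short Stable Diffusion prompt for a snowy cabin at dusk.")
print("Image saved to", generate_image(image_prompt))
```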

I'm also looking for a way to execute the code that the text generation produces, then feed the results back so the model can decide what to do next (I know the risks, but yeah).
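
A hedged sketch of that execute-and-feed-back loop, reusing the `ask_llm` helper from the sketch above. It simply runs the model's Python in a subprocess and returns stdout/stderr to the next prompt; this is not sandboxed, so a container or similar isolation would be safer:

```python
import subprocess
import sys
import tempfile

def run_generated_code(code: str, timeout: int = 30) -> str:
    """Write the model's Python code to a temp file, run it, capture its output."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    proc = subprocess.run(
        [sys.executable, path], capture_output=True, text=True, timeout=timeout
    )
    return proc.stdout + proc.stderr

# One iteration of the loop: ask for code, run it, show the model the result.
code = ask_llm("Write Python that prints the first 10 Fibonacci numbers. Code only, no markdown.")
result = run_generated_code(code)
followup = ask_llm(f"The code produced this output:\n{result}\nWhat should we do next?")
print(followup)
```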

[–] Starkboy@alien.top 1 points 9 months ago

Thanks for your answer! I get it. These projects do give me some ideas. I didn't know such things were called 'agents' in this space.