Posted 09 Nov 2023 in LocalLLaMA, a community to discuss Llama, the family of large language models created by Meta AI.

Hey guys,

is anyone using AutoGen with a local model to do multi-agent stuff? I've used it with the OpenAI API, but now I'm wondering whether their new Assistants feature makes it obsolete. If I could run it all locally, that would be a big new value add...
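
For anyone landing here looking for the mechanics: AutoGen can talk to any OpenAI-compatible endpoint, so a local server (LM Studio, text-generation-webui's OpenAI extension, etc.) can stand in for the OpenAI API. Below is a minimal sketch, assuming LM Studio's built-in server on its default port 1234; the model name and dummy API key are placeholders, not values from this thread.

```python
# Minimal sketch: point AutoGen's llm_config at a local OpenAI-compatible server.
# Assumes LM Studio's server at http://localhost:1234/v1; the model name and
# api_key are placeholders (most local servers ignore them but require the fields).
import autogen

config_list = [
    {
        "model": "local-model",
        "base_url": "http://localhost:1234/v1",  # older pyautogen versions used "api_base" instead
        "api_key": "not-needed",
    }
]

llm_config = {"config_list": config_list, "temperature": 0.2}
```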

top 3 comments
WaterdanceAC@alien.top (1 year ago)

FYI - I've seen a couple of Medium articles about doing this.

MGazer@alien.top (1 year ago)

Well, I did get AutoGen working with local models through LM Studio. It works, but I haven't found a good use case for it. I wanted to set up something like ChatDev (which I also got working through LM Studio), but it just doesn't write any code. I guess for now I'm waiting for a model that will do what I want and thinking over what I can actually do with it.
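
For reference, here is a rough sketch of the kind of two-agent coding loop described above (an AssistantAgent that writes code and a UserProxyAgent that executes it), reusing a local config_list like the one sketched earlier in the thread; the task prompt and work_dir are made up for illustration.

```python
# Rough sketch of a two-agent loop against a local model, reusing the
# config_list from the earlier sketch. Task prompt and work_dir are illustrative.
from autogen import AssistantAgent, UserProxyAgent

assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})

user_proxy = UserProxyAgent(
    "user_proxy",
    human_input_mode="NEVER",  # run the loop without asking for human input
    code_execution_config={"work_dir": "coding", "use_docker": False},
)

user_proxy.initiate_chat(
    assistant,
    message="Write a Python script that prints the first 10 Fibonacci numbers.",
)
```

Whether the local model actually produces runnable code is the open question here; weaker models often stall at exactly this step, which matches the ChatDev experience above.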

SomeOddCodeGuy@alien.top (1 year ago)

I just saw a video about this come up on my "home" page on YouTube the other day. This guy does lots of stuff with oobabooga.

Disclaimer: I haven't watched this all the way through, but it really looks to be what you need.

https://www.youtube.com/watch?v=FHXmiAvloUg