laterral

joined 1 year ago
[–] laterral@alien.top 1 points 11 months ago

Can any of the small ones do this well?

[–] laterral@alien.top 1 points 11 months ago

Nice!! Is it safe? I tried to install it on my Mac and it kept asking me for permissions that have nothing to do with it (e.g. access to Photos, etc.) - just found it strange

 

I'm only now wrapping my head around this - I know there's no option in the LM Studio UI, but is there any way to ingest documents once LM Studio has loaded the model?

Also, is there any alternative stack with a UI that covers not just chat but also document ingestion for local models?
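
To make it concrete, this is roughly what I'd like to be able to do even without a UI - a sketch assuming LM Studio's local server is running on its default port (1234, I believe) with its OpenAI-compatible API; the file name and model string are just placeholders:

    # Sketch: "ingest" a document by stuffing it into the prompt and asking
    # questions against whatever model LM Studio currently has loaded, via
    # its OpenAI-compatible local server.
    # Assumptions: server at localhost:1234, "notes.txt" is a placeholder.
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")

    with open("notes.txt") as f:
        document = f.read()

    response = client.chat.completions.create(
        model="local-model",  # LM Studio serves whichever model is loaded
        messages=[
            {"role": "system", "content": f"Answer using only this document:\n\n{document}"},
            {"role": "user", "content": "What are the key points?"},
        ],
    )
    print(response.choices[0].message.content)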

[–] laterral@alien.top 1 points 11 months ago (1 children)

How do you find the right template?
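
For context, this is the kind of template I mean - my understanding of the Llama-2 chat layout, just as an illustration (other model families use different markers, which is presumably why the template has to match the model):

    # Rough illustration of a prompt template (the Llama-2 chat layout, as I
    # understand it). Other models (ChatML, Alpaca, ...) use different markers.
    def llama2_chat_prompt(system: str, user: str) -> str:
        return (
            "[INST] <<SYS>>\n"
            f"{system}\n"
            "<</SYS>>\n\n"
            f"{user} [/INST]"
        )

    print(llama2_chat_prompt("You are a helpful assistant.", "Hello!"))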

 

Hi team

I'm new to this and installed LM Studio (I'm on an M1 Pro with 16GB of RAM). I'm looking for a model and I get a lot of options - which one should I go for, and why? (per the screenshot below)

Also, can you help me understand the capabilities of this machine, and share some of the models you'd recommend for your own use cases or just for fun? (I've put a rough sizing calculation below the screenshot.)

Thank you!!!

https://preview.redd.it/2jim35m0432c1.png?width=1650&format=png&auto=webp&s=970cecd07f537c5352428436a0f9f1840bf562f2
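
Here's that rough sizing calculation - just a rule of thumb I pieced together, not exact: the download is roughly parameters * bits-per-weight / 8 in GB, and that plus some headroom for context has to fit in RAM:

    # Back-of-envelope sizing (rough rule of thumb, not exact): quantized
    # model size in GB is about parameters (billions) * bits per weight / 8,
    # plus overhead for context.
    def approx_size_gb(params_billion: float, bits_per_weight: float) -> float:
        return params_billion * bits_per_weight / 8

    for params in (7, 13, 34):
        for bits in (4, 5, 8):
            print(f"{params}B at {bits}-bit ~= {approx_size_gb(params, bits):.1f} GB")

    # On 16 GB of shared memory, 7B fits comfortably and 13B at 4-5 bit looks
    # like the practical ceiling - if my arithmetic above is right.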

 

Hi team

There are a lot of components out there that come together in different configurations to conjure up AIs. Things like:

Xb model Y, fine tuning, hallucinations, Llama, Ollama, LangChain, LocalGPT, AutoGPT, PrivateGPT.

All come up frequently.

Is there a good guide to build up my understanding and vocabulary? Ideally with some diagrams to help me see how each component functions and fits together. (I've also sketched, below, my rough mental model of how a couple of these pieces connect.)

I'm grateful for your help!!
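
Here's that sketch - my (possibly wrong) mental model of just two of the pieces: Ollama runs a local model behind an HTTP API, and tools like LangChain or PrivateGPT are essentially layers that build prompts and call an API like it. It assumes Ollama is running on its default port with a model already pulled; the model name is a placeholder:

    # Minimal example of two of the pieces above fitting together: Ollama
    # serves a local model over HTTP, and this script plays the role that
    # LangChain / PrivateGPT etc. fill (build a prompt, call the API).
    # Assumes Ollama is running on its default port with "llama2" pulled.
    import requests

    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama2", "prompt": "In one sentence, what is fine-tuning?", "stream": False},
        timeout=120,
    )
    print(resp.json()["response"])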

 

Not sure I'm even asking the right questions (my Google searches are definitely coming up empty).

What I want is ideally an intuitive way to chain containers together (e.g. one container reads inputs from a Telegram bot -> sends the link to another that downloads the YouTube video -> then, depending on the size, another moves the file around).

There must be solutions to orchestrate all this - can you point me in the right direction, please?
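
To make the example concrete, here's a single-script sketch of that flow before splitting it into containers. Assumptions I've made up for illustration: the bot token lives in TELEGRAM_TOKEN, yt-dlp is installed, and the directories and 500 MB threshold are placeholders:

    # Single-script sketch of the pipeline above: poll a Telegram bot for
    # YouTube links, download each video with yt-dlp, then move the file
    # to a different directory depending on its size.
    # Assumptions: TELEGRAM_TOKEN is set, yt-dlp is on PATH, and the paths
    # and 500 MB threshold below are placeholders.
    import os
    import shutil
    import subprocess
    import time

    import requests

    API = f"https://api.telegram.org/bot{os.environ['TELEGRAM_TOKEN']}"
    SMALL_DIR, LARGE_DIR = "/data/small", "/data/large"
    SIZE_THRESHOLD = 500 * 1024 * 1024  # 500 MB

    def poll_updates(offset=None):
        # Long-poll the Telegram Bot API for new messages.
        params = {"timeout": 30}
        if offset is not None:
            params["offset"] = offset
        return requests.get(f"{API}/getUpdates", params=params, timeout=60).json()["result"]

    def download(url, out_dir="/tmp/downloads"):
        # Download with yt-dlp and have it print the final file path back.
        os.makedirs(out_dir, exist_ok=True)
        result = subprocess.run(
            ["yt-dlp", "--no-simulate", "--print", "after_move:filepath", "-P", out_dir, url],
            capture_output=True, text=True, check=True,
        )
        return result.stdout.strip()

    def route(path):
        # Move the file to one place or another based on its size.
        dest = LARGE_DIR if os.path.getsize(path) > SIZE_THRESHOLD else SMALL_DIR
        os.makedirs(dest, exist_ok=True)
        shutil.move(path, dest)

    offset = None
    while True:
        for update in poll_updates(offset):
            offset = update["update_id"] + 1
            text = update.get("message", {}).get("text", "")
            if "youtube.com" in text or "youtu.be" in text:
                route(download(text))
        time.sleep(1)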