this post was submitted on 10 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


I'm pretty new to the entire field of LLMs. I've played around with a few models in the oobabooga UI and have been eyeing some of the other GUI options on GitHub as well.

Recently, I've stumbled upon terms like "LangChain" and "RAG" that seem super interesting. As far as I understand, these let you ingest your own data (text, files, etc.) so an LLM can analyze or summarize it. However, I'm not quite sure how to do that. Would this be possible inside the oobabooga UI (which I've liked the most so far)? Are there resources or projects you could point me towards? And how does all of that work with the limited context window of a local LLM?
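For context, the core idea behind RAG is simple: split your documents into chunks, find the chunks most relevant to the question, and put only those into the prompt, which is how it copes with a small context window. Here's a toy sketch of that retrieval step; real pipelines (LangChain, privateGPT, etc.) use a proper embedding model and a vector store, but this fakes the embedding with word counts so it runs stand-alone:

```python
# Toy sketch of the retrieval step behind RAG (retrieval-augmented generation).
# Real setups use an embedding model + vector database; the bag-of-words
# "embedding" below is a stand-in so the example is self-contained.
from collections import Counter
import math

def embed(text):
    """Fake embedding: word-count vector (stand-in for a real embedding model)."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, chunks, k=2):
    """Return the k chunks most similar to the query."""
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

# Split documents into chunks small enough to fit the context window,
# then stuff only the most relevant ones into the prompt.
chunks = [
    "Llama is a family of large language models released by Meta AI.",
    "The context window limits how many tokens the model can attend to.",
    "Bananas are rich in potassium.",
]
top = retrieve("What limits how much text a local LLM can read?", chunks, k=1)
prompt = "Answer using this context:\n" + "\n".join(top) + "\n\nQuestion: ..."
```

The answer to the context-window question falls out of this: the model never sees the whole corpus, only the top-k retrieved chunks plus the question.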

[–] Chaosdrifer@alien.top 1 points 10 months ago

If you just want to try it out, install privateGPT on your local PC or Mac; no GPU required.