this post was submitted on 27 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.

founded 1 year ago

Hello All,

I’m a hobbyist who is interested in AI. I have a bunch of text containing messages from a popular instant messaging platform for gamers. There are thousands of messages in this data, and spread throughout it is very niche and valuable information, but there are so many messages that it’s hard to search through them with the platform itself.

My idea was that if I could use a language model such as ChatGPT to parse, analyze, and search this data for me, I could find answers to my questions quickly. ChatGPT is too expensive for this much data, which is why I am looking at processing it locally. Is something like this possible? Ideally I would ask it a question and it would answer based on what it learned from my text data. I have just a standard gaming PC with an RTX 3080 Ti, and I don’t need this solution to be fast.
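For what it's worth, the usual approach to this is not to train a model on the messages but to retrieve the handful of relevant ones and paste them into the model's prompt as context (retrieval-augmented generation). A minimal sketch of the retrieval half, using a plain bag-of-words cosine similarity as a stand-in for a real embedding model — the sample messages and the query here are made up for illustration:

```python
import math
from collections import Counter

# Hypothetical sample of exported chat messages; real data would be
# thousands of lines loaded from the platform's export file.
messages = [
    "To fix the crash on level 3, delete the shader cache folder.",
    "Anyone up for a raid tonight?",
    "The shader cache lives under AppData/Local on Windows.",
    "gg everyone, great match",
]

def tokenize(text):
    # Lowercase and strip trailing punctuation from each word.
    return [w.strip(".,!?").lower() for w in text.split()]

def cosine(a, b):
    # Cosine similarity between two bag-of-words Counters.
    common = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in common)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def top_k(query, docs, k=2):
    # Score every message against the query and keep the best k matches.
    q = Counter(tokenize(query))
    scored = [(cosine(q, Counter(tokenize(d))), d) for d in docs]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [d for score, d in scored[:k] if score > 0]

hits = top_k("how do I fix the shader crash?", messages)
# The retrieved messages would then be pasted into a local model's
# prompt as context, instead of fine-tuning on the whole history.
```

A real setup would swap the word-overlap scoring for sentence embeddings, but the overall shape — index once, retrieve per question, answer from retrieved context — stays the same.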

top 2 comments
[–] Red_Redditor_Reddit@alien.top 1 points 11 months ago

I don't think this is going to help you. I'm not an expert, just an amateur, so take what I say with a grain of salt. The context window is rather small, like a page or two of text. After training, the model can only work with information that fits inside that context window. It might be able to process the messages individually, but it's not going to act as a sort of search engine over thousands of messages.
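The individual-processing idea the comment mentions can be sketched as simple chunking: pack consecutive messages into batches that each fit the context window, then run the model over one batch at a time. This assumes the messages are available as a list of strings, and the 200-word budget is an arbitrary stand-in for a real token limit:

```python
def chunk_messages(messages, max_words=200):
    """Greedily pack consecutive messages into chunks that stay under
    a word budget, so each chunk can be fed to the model in one pass."""
    chunks, current, count = [], [], 0
    for msg in messages:
        n = len(msg.split())
        if current and count + n > max_words:
            # Current chunk is full; start a new one.
            chunks.append("\n".join(current))
            current, count = [], 0
        current.append(msg)
        count += n
    if current:
        chunks.append("\n".join(current))
    return chunks

# 500 two-word messages pack into chunks of 100 messages each.
batches = chunk_messages(["hello world"] * 500, max_words=200)
```

Each batch would then be summarized or searched separately, with the per-batch results combined afterward — slower than one pass, but it sidesteps the window limit.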

[–] Scary-Knowledgable@alien.top 1 points 11 months ago