this post was submitted on 27 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


Hello All,

I’m a hobbyist who is interested in AI. I have a large amount of text containing messages from a popular instant messaging platform for gamers. There are thousands of messages in this data, and spread throughout it is very niche and valuable information, but there are so many messages that it’s hard to search through using the platform itself.

My idea was that if I could use a language model such as ChatGPT to parse, analyze, and search this data for me, I could find answers to my questions quickly. ChatGPT is too expensive for this much data, which is why I am looking at processing it locally. Is something like this possible? Ideally, I would be able to ask it a question and have it answer based on what it learned from my text data. I have just a standard gaming PC with an RTX 3080 Ti, and I don’t need this solution to be fast.
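
For reference, one common way to do this on local hardware is retrieval-augmented generation (RAG): embed every message once, retrieve the ones most relevant to a question, and have a local model answer from just those. Below is a minimal sketch, assuming the messages have been exported to a plain-text file and a quantized GGUF model has been downloaded; the filenames (`messages.txt`, `model.gguf`), the embedding model choice, and the example question are all placeholders. It uses sentence-transformers for the embeddings and llama-cpp-python to run the model on the GPU.

```python
# Minimal retrieval-augmented Q&A over exported chat messages (sketch).
# Assumes: messages exported to "messages.txt" (one message per line) and a
# quantized GGUF instruct model saved as "model.gguf" -- both are placeholders.

import numpy as np
from sentence_transformers import SentenceTransformer
from llama_cpp import Llama

# 1. Load the exported messages.
with open("messages.txt", encoding="utf-8") as f:
    messages = [line.strip() for line in f if line.strip()]

# 2. Embed every message once; these vectors can be cached to disk for reuse.
embedder = SentenceTransformer("all-MiniLM-L6-v2")
message_vecs = embedder.encode(messages, normalize_embeddings=True)

# 3. Load a quantized local model. n_gpu_layers=-1 offloads all layers to the GPU.
llm = Llama(model_path="model.gguf", n_ctx=4096, n_gpu_layers=-1, verbose=False)

def answer(question: str, top_k: int = 20) -> str:
    # Retrieve the most similar messages; embeddings are normalized,
    # so a dot product gives cosine similarity.
    q_vec = embedder.encode([question], normalize_embeddings=True)[0]
    scores = message_vecs @ q_vec
    top_idx = np.argsort(scores)[::-1][:top_k]
    context = "\n".join(messages[i] for i in top_idx)

    prompt = (
        "Answer the question using only the chat messages below.\n\n"
        f"Messages:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
    out = llm(prompt, max_tokens=300, temperature=0.2)
    return out["choices"][0]["text"].strip()

# Example (hypothetical question):
print(answer("What settings did people recommend for the game?"))
```

A 7B model quantized to 4 bits fits comfortably in the 3080 Ti's 12 GB of VRAM, and since speed isn't a concern, a larger model can be run by lowering `n_gpu_layers` so part of it stays in system RAM.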

Scary-Knowledgable@alien.top 1 points 9 months ago