Moist_Influence1022

joined 10 months ago
[–] Moist_Influence1022@alien.top 1 points 10 months ago

As someone who spends a lot of time in a chair in front of a PC, both as a hobby and for work, I treated myself to an early Christmas present: a dual-3090 machine.

I used to game a lot, but those days are over. It's still nice to be able to play the latest games on maximum graphics, but it's also great to have the capability to play around with the big boy LLMs out there.

Right now I'm experimenting with a ton of stuff, trying different frameworks like AutoGen and MemGPT. I can tinker around without that nagging thought in the back of my mind saying, 'Man, you're wasting money' or 'Be more efficient,' and so on, if you know what I mean. If it were just for the sake of trying LLMs, it definitely wouldn't be worth it; I would stick to cloud solutions.


Hello, I'm still new to this, but I want to focus on using RAG with a vector DB to store all my personal and work-related data.

I'm seeking a better understanding of how things work.

I'm interested in covering multiple domains, such as "Sales," "Marketing," and "Security."

I plan to use an embedding model to create embeddings and store them in a vector database. When I interact with my LLM, the system should retrieve relevant data based on my prompt and feed it into the LLM query.

For instance:

"What's the command for xyz?" or "Create me a good offer for xyz."

As I understand it, the backend runs a semantic search for "Create me a good offer," finds the most similar entries (nearest neighbors), and supplies them to the LLM as context alongside my prompt. The system prompt for the LLM is then built from that retrieved information so it can deliver the best possible answer.
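To check whether I'm picturing this right, here's roughly the flow I mean in code. This is just a sketch: the embedding model, the example notes, and the `retrieve` helper are made up for illustration, and in a real setup the vectors would sit in a vector DB instead of a NumPy array.

```python
# Sketch: embed notes once, then at question time embed the prompt,
# find the nearest notes by cosine similarity, and paste them into
# the prompt that actually goes to the LLM.
from sentence_transformers import SentenceTransformer
import numpy as np

model = SentenceTransformer("all-MiniLM-L6-v2")  # example embedding model

notes = [
    "nmap -sV <host> scans a host and reports service versions.",          # security
    "Standard discount for annual contracts is 15%, 20% above 50 seats.",  # sales
    "Q3 campaign targets mid-size logistics companies via LinkedIn.",      # marketing
]

# "Ingest" step: embed everything once up front.
note_vectors = model.encode(notes, normalize_embeddings=True)

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k notes most similar to the question."""
    q = model.encode([question], normalize_embeddings=True)[0]
    scores = note_vectors @ q  # dot product == cosine similarity on normalized vectors
    best = np.argsort(scores)[::-1][:k]
    return [notes[i] for i in best]

question = "Create me a good offer for a 60-seat annual contract."
context = "\n".join(retrieve(question))

# This combined prompt is what would be handed to the local LLM.
prompt = f"Use only the context below to answer.\n\nContext:\n{context}\n\nQuestion: {question}"
print(prompt)
```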

Now, the big question: when creating the dataset to store in the vector DB, should I label the entries with tags like [M] or [S] for sales? That way, when I type my prompt and add the label [S], the semantic search can more accurately determine where to look.
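Something like this is what I have in mind, using Chroma purely as an example store (the collection name, tags, and notes are invented): instead of baking [S] or [M] into the text itself, each entry would carry a `domain` tag as metadata, and a prompt labeled [S] would turn into a filter on the sales domain before the nearest-neighbor search runs.

```python
# Sketch of the labeling idea: store each note with a "domain" tag as metadata,
# then restrict the similarity search to that domain at query time.
import chromadb

client = chromadb.Client()  # in-memory instance, just for illustration
collection = client.create_collection(name="personal_kb")

# Chroma embeds the documents with its default embedding model here.
collection.add(
    ids=["sec-1", "sales-1", "sales-2", "mkt-1"],
    documents=[
        "nmap -sV <host> scans a host and reports service versions.",
        "Standard discount for annual contracts is 15%, 20% above 50 seats.",
        "Renewal offers should include a free onboarding workshop.",
        "Q3 campaign targets mid-size logistics companies via LinkedIn.",
    ],
    metadatas=[
        {"domain": "security"},
        {"domain": "sales"},
        {"domain": "sales"},
        {"domain": "marketing"},
    ],
)

# A prompt tagged [S] would translate into where={"domain": "sales"},
# so only sales notes are even considered by the similarity search.
results = collection.query(
    query_texts=["Create me a good offer for a 60-seat annual contract."],
    n_results=2,
    where={"domain": "sales"},
)
print(results["documents"])
```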

Does this approach make sense, or could it lead to more problems than it solves?

I mean, I asked GPT-4, but that's not the same as asking someone who might have deeper knowledge about this.

Thanks!

[–] Moist_Influence1022@alien.top 1 points 10 months ago

"Hey psshhh, AI is Bad and Evil so please regulate the fuck out if it, so we, Big Tech Corps can gain as much power as possible"

[–] Moist_Influence1022@alien.top 1 points 10 months ago

Since you said you also want to use it for gaming, stick with consumer boards.

Dual GPU: a Z790 board. Memory: max 96GB (2x 48GB sticks).

I hear a lot about instability issues when four DDR5 sticks are used, so 128GB of DDR5 is not a great option right now.

PSU: 1600W, maybe 1200W if you don't get an i9-13900K, but go 1600W to be safe.

That's basically my setup, but with two used 3090s.

Everything else is a gamble because we don't know what will change in the future.