this post was submitted on 28 Nov 2023

LocalLLaMA


Community to discuss about Llama, the family of large language models created by Meta AI.


Sorry for the noob question. I’m building out a new server and, as I love playing with new tech, I thought I would throw in a GPU so I can learn to integrate AI with things like PrivateGPT, document generation, meeting transcription, maybe some integrations with Obsidian, or even Home Assistant for automation. I like the idea of it being able to crawl all my information and offer suggestions, rather than me having to copy and paste snippets as I do now with ChatGPT. I’m a solo IT consultant by trade, so I’m really hoping it will help me augment my work.
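The “crawl all my information” idea can be sketched in a few lines. This is a minimal, assumption-laden illustration: it indexes plain-text notes in a folder and finds the most relevant one by keyword overlap, where a real setup (e.g. PrivateGPT) would use an embedding model and feed the retrieved snippet to a local LLM. The folder layout and file extensions are hypothetical.

```python
# Minimal sketch of local document retrieval: index plain-text notes
# and pull the most relevant file for a question, instead of
# copy-pasting snippets into a chat window by hand.
# Real tools use embeddings; keyword overlap just shows the shape.
import re
from pathlib import Path

def tokenize(text):
    # Lowercase words/numbers as a crude token set.
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def build_index(folder):
    # Map each .txt/.md file under the folder to its token set.
    index = {}
    for path in Path(folder).rglob("*"):
        if path.suffix in {".txt", ".md"}:
            index[path] = tokenize(path.read_text(errors="ignore"))
    return index

def best_match(index, question):
    # Rank documents by how many tokens they share with the question.
    q = tokenize(question)
    scored = [(len(q & toks), path) for path, toks in index.items()]
    scored.sort(reverse=True, key=lambda t: t[0])
    return scored[0][1] if scored and scored[0][0] > 0 else None
```

In practice you would hand the winning file’s text to the model as context; that retrieval-then-generate loop is the core of the PrivateGPT-style workflow.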

Budget isn’t super important; it’s more that it’s fit for purpose. But to stop people suggesting a £30,000 GPU, I’ll cap it at ~£1,000!

Thanks!

[–] idarryl@alien.top 1 points 11 months ago

Home brewing is exactly the way I’m going. A 3U chassis is on the table; I’m looking at the Sliger chassis at the moment, but cooling the space is a big consideration.

To be honest, I had only considered one GPU. I live in the UK, where electricity prices are high (and houses are small), so multiple GPUs just don’t seem appropriate.
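The electricity concern can be sanity-checked with quick arithmetic. The wattage, usage hours, and tariff below are assumptions for illustration, not figures from the thread; plug in your own card’s draw and your actual unit rate.

```python
# Back-of-envelope running cost for one GPU under load.
# All inputs are assumptions: adjust for your hardware and tariff.
watts = 300            # assumed draw of a single GPU under inference load
hours_per_day = 8      # assumed daily usage
price_per_kwh = 0.28   # assumed UK tariff, GBP per kWh

kwh_per_day = watts / 1000 * hours_per_day        # 2.4 kWh/day
cost_per_month = kwh_per_day * price_per_kwh * 30
print(f"~£{cost_per_month:.2f}/month")            # ~£20.16/month
```

Doubling up on GPUs roughly doubles this figure (plus extra heat to get out of a small room), which is why a single efficient card is often the pragmatic choice here.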