Sorry for the noob question. I’m building out a new server and, as I love playing with new tech, I thought I would throw in a GPU so I can try to learn to integrate AI with things like PrivateGPT, document generation, meeting transcription, maybe some integrations with Obsidian, or even Home Assistant for automation. I like the idea of it being able to crawl all my information and offer suggestions, rather than me having to copy and paste snippets as I do now with ChatGPT. I’m a solo IT consultant by trade, so I’m really hoping it will help me augment my work.
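For context, this is the kind of integration I have in mind: a local model served over an HTTP API that my own scripts (Obsidian, Home Assistant, etc.) can call. A minimal sketch, assuming an Ollama instance on its default port with some model already pulled (the model name here is just an example):

```python
# Minimal sketch: query a locally hosted LLM via Ollama's REST API.
# Assumes Ollama is running on its default port (11434) and a model
# (e.g. "llama2") has already been pulled — both are assumptions here.
import json
import urllib.request

def ask_local_llm(prompt: str, model: str = "llama2") -> str:
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # With stream=False the server returns one JSON object whose
        # "response" field holds the full completion.
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # e.g. summarise a meeting transcript or an Obsidian note passed in as text
    print(ask_local_llm("Summarise the following notes: ..."))
```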

Budget isn’t super important; it’s more that it’s fit for purpose. But to stop people suggesting a £30,000 GPU, I’ll cap it at ~£1,000!

Thanks!

[–] theyreplayingyou@alien.top 1 points 11 months ago (1 children)

That depends. When you say you are building out a new server, are we talking a proper 1U or 2U Dell, HPE, etc. type server? If so, you'll have to contend with the GPU footprint: for example, my 1U servers can only take up to two half-height, half-length GPUs, and they can only be powered through the PCIe slot, so I'm limited to 75 W.

In my 2U servers I can get the "GPU enablement kit", which is essentially smaller form factor heatsinks for the CPUs and some long 8-pin power cables running from the mobo to the PCIe riser, allowing many more options. But there are still problems to address: heat, power draw (CPUs are limited to 130 W TDP, I believe), and the server firmware complaining about the GPU / forcing the system fans to run at an obnoxious level, etc.

If you are homebrewing a 3U, a tower, or using consumer parts, then things change quite a bit.

[–] idarryl@alien.top 1 points 11 months ago

Home brewing is exactly the way I’m going. A 3U is on the table; I’m looking at the Sliger chassis at the moment, but cooling in that space is a big consideration.

To be honest, I had only considered one GPU. I live in the UK and electricity prices are high (and houses are small), so multiple GPUs just don’t seem appropriate.