this post was submitted on 28 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.

Sorry for the noob question. I’m building out a new server and, as I love playing with new tech, I thought I would throw in a GPU so I can try to learn to integrate AI with things like PrivateGPT, document generation, meeting transcription, maybe some integrations with Obsidian, or even Home Assistant for automation. I like the idea of it being able to crawl all my information and offer suggestions, rather than me having to copy and paste snippets as I do now with ChatGPT. I’m a solo IT consultant by trade, so I’m really hoping it will help me augment my work.

Budget isn’t super important; it’s more that it’s fit for purpose. But to stop people suggesting a £30,000 GPU, I’ll cap it at ~£1000!

Thanks!

Prudent-Artichoke-19@alien.top · 1 point · 9 months ago

You can cluster three 16GB Arc A770 GPUs. That's 48GB, and they're modern cards.

idarryl@alien.top · 1 point · 9 months ago

3x anything is not an option. I’m one and done. 👍🏻

Prudent-Artichoke-19@alien.top · 1 point · 9 months ago

It just sucks because the sweet spot is 48GB, but a single 48GB card costs at least $3k USD.

At 1k you'll be stuck at 24GB for a single card.
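To see why the 24GB vs. 48GB distinction matters, here's a rough back-of-the-envelope VRAM estimate. This is a sketch, not a precise rule: the `vram_gb` helper and the 1.2× overhead factor are my own assumptions (real usage also depends on context length, KV cache, and runtime), but it illustrates why ~4-bit 70B-class models want the 48GB sweet spot while 30B-class models fit on a single 24GB card.

```python
def vram_gb(params_b: float, bits_per_weight: float, overhead: float = 1.2) -> float:
    """Rough GPU memory estimate in GB: weight storage plus a fudge
    factor for KV cache and runtime overhead (assumed 1.2x here)."""
    return params_b * (bits_per_weight / 8) * overhead

# A 70B model at 4-bit quantization: roughly 42 GB -> needs ~48GB of VRAM
print(round(vram_gb(70, 4), 1))

# A 33B model at 4-bit quantization: roughly 20 GB -> fits on a 24GB card
print(round(vram_gb(33, 4), 1))
```

The weights-only part of the estimate is just parameters × bytes per weight; everything beyond that is workload-dependent, so treat the output as a floor, not a guarantee.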