this post was submitted on 16 Apr 2026
35 points (97.3% liked)
This does seem tenuous, yes.
If they want to experiment with AI and try to find a use case for it, a bit of R&D every now and then can be good, right? But the way to do that would be to temporarily rent a VPS with a GPU from AWS or wherever. That way, if it looks good, they'll know roughly what size of hardware to buy (way more than a 5070 Ti, most likely). And if it doesn't pan out, all you've lost is a couple of hundred $ on one month's rental, and you can shut it down and move on.
Or just use the OpenAI API; that way you don't need to figure out how to run a model at all and can concentrate on the data integration, to see whether that part is even viable.
That's what I was thinking too. I'm pretty sure using the API for those models, or a rented VM with a GPU like you said, would be the only viable options.
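The trade-off above can be sketched with some back-of-envelope arithmetic. All the prices here are illustrative assumptions, not real quotes: check current cloud and API pricing before deciding anything.

```python
# Rough cost comparison: renting a GPU VPS for a month vs. paying
# per-token for a hosted API. Rates below are hypothetical placeholders.

def vps_monthly_cost(hourly_rate: float, hours: float = 730) -> float:
    """Cost of renting a GPU VPS for one month (~730 hours)."""
    return hourly_rate * hours

def api_monthly_cost(tokens_per_month: float, price_per_million: float) -> float:
    """Cost of an API-based experiment at a given token volume."""
    return tokens_per_month / 1_000_000 * price_per_million

# Assumption: ~$0.50/hr for a mid-range GPU instance (hypothetical rate).
gpu_rental = vps_monthly_cost(0.50)

# Assumption: 20M tokens/month at $2 per million tokens (hypothetical rate).
api_cost = api_monthly_cost(20_000_000, 2.00)

print(f"GPU VPS for one month: ~${gpu_rental:.0f}")
print(f"API usage for one month: ~${api_cost:.0f}")
```

Under these made-up numbers the API experiment is far cheaper, which matches the point above: if the data-integration side turns out not to be viable, you find that out for tens of dollars instead of hundreds, and without ever buying hardware.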