What GPUs are you putting in there? I’ve got an R720 and built a cluster with P4s to test out.
this post was submitted on 18 Nov 2023
Homelab
Currently P40s, P100s, and MI25s, each in pairs.
On one box I had 2x P4 and 2x P40, but realized the mismatched VRAM was messing with loading LLMs, so I yanked the P4s. May put them in an R620.
Also had an M40 but yanked it due to slow speeds versus power consumed. At 24 cents a kWh, a pair of GPUs running 24/7 adds about 300–350 W on top of the base 100–140 W, or roughly $53 a month. The price spread between an M40 and a P100 pays for itself in about 2 months.
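For anyone wanting to sanity-check that math on their own rate, here's a quick back-of-envelope sketch (the 325 W figure is just the midpoint of my 300–350 W range, and it assumes a true 24/7 duty cycle):

```python
# Rough monthly cost of the *added* GPU draw, on top of the server's base load.
RATE_PER_KWH = 0.24   # my electricity rate, $/kWh -- plug in your own
added_watts = 325     # midpoint of the 300-350 W a GPU pair adds
hours_per_month = 24 * 30

monthly_cost = (added_watts / 1000) * hours_per_month * RATE_PER_KWH
print(f"~${monthly_cost:.0f}/month")  # ~$56 at the midpoint, ~$52 at 300 W
```

At 300 W it works out to about $52/month, which is where my ~$53 figure comes from.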
Other GPUs owned: a K40 and a Grid K1. Neither has CUDA support in current PyTorch releases, and compiling from source is a pain.
Obtained some P102-100s on the uber-cheap. Supposedly 1080 Ti equivalent but only 5GB of VRAM. Haven’t tested them yet: the card dimensions and power-connector location don’t seem to jibe with the Dell R7x0 chassis.