seanpmassey

joined 11 months ago
[–] seanpmassey@alien.top 1 points 11 months ago

It's all of those. I'm not voting because I can't decide which reason is the biggest, and there's no "two or more of the above" option to pick.

The only one I don't think applies to me is "Better Services," since many of the open-source/self-hosted solutions aren't necessarily better than commercial options. Or they're missing SSO support in the free/freemium/open-source tier.

[–] seanpmassey@alien.top 1 points 11 months ago

Point of pedantry: the Nano uses a Tegra X1 as its SoC. It has a Maxwell-generation GPU, not Kepler.

The new Jetson Orin Nano uses an Ampere GPU.

[–] seanpmassey@alien.top 1 points 11 months ago (1 children)

It depends.

What is your budget? And what hardware/hypervisor do you have?

And what specifically are you looking to do with “generative AI?” Ugh…I hate that term.

There are two key things to keep in mind about rack-mount GPUs. First, you need servers that were specifically built at the factory to host GPUs. Almost all of NVIDIA’s server-grade GPUs are passively cooled, so the server needs a fan configuration that can cool them. Second, aside from the lowest-end inference cards (P4/T4/A2/L4, all over $1000 per card), which draw less than the 75 watts provided by the PCIe slot, these GPUs require at least 150 watts, dedicated power connectors, and higher-wattage power supplies.

And most of the drivers and Docker/Kubernetes plugins for these GPUs are locked behind NVIDIA licensing.
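If you do end up with one of these cards in a container host, a minimal sketch of wiring it into Docker looks like the following. This assumes the NVIDIA driver and the NVIDIA Container Toolkit are already installed on the host, and that you have a CUDA-capable GPU; the exact CUDA image tag is just an example:

```shell
# Register the NVIDIA runtime with Docker and restart the daemon
# (nvidia-ctk ships with the NVIDIA Container Toolkit).
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker

# Run nvidia-smi inside a CUDA base image to confirm the
# container can actually see the GPU.
docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi
```

The container toolkit itself is freely available; it's the vGPU/virtual workstation drivers (for slicing a card between VMs, as in VDI setups) that sit behind NVIDIA's licensing.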

You’d want something that is at least Pascal-generation, but Turing or newer cards are better.

Your better bet is to get a rack-mount workstation (which is basically a server anyway) and stick a higher-end Quadro or GeForce 30x0 card in there.

Edit: I never answered what I have - an R730 factory built for GPUs with a pair of Tesla P4 cards. I originally built it to play with GPUs for VDI.

[–] seanpmassey@alien.top 1 points 11 months ago

Oh…wow. That’s quite the loaded question. How much time do we have? ::checks watch::

The short answer is that almost every technical skill I’ve learned or improved (and some non-technical ones like public speaking as well) has been a result of my home lab. I just needed the right push/motivation/use case to dive into it.

The first iteration of my home lab started 20 years ago while I was in college. I started my lab because I wanted more hands-on experience, and my curiosity pushed me forward from there.

So…it really depends on what skills you want to develop and where you want to start your career. IT is a very large area.

The best thing you can do is find problems you have and use your lab to design and implement a solution.

In general, I would say the following:

  1. Troubleshooting - Build things in your lab just to break them. Learn how to figure out what you broke and how to solve the problem.
  2. Networking - Build a network. Understand how applications and services talk to each other. Learn a little about TCP/IP and basic routing. It doesn’t need to be complex (unless you want to go for your CCIE).
  3. Virtualization - Build out a small virtual environment. Use it to run a few applications or services for personal use. This is also good because you can put multiple services on the same piece of hardware.
  4. Share what you’re doing - A big part of IT is communication skills. Once you start doing something interesting, share it. Blog. Find user groups for the technology you’re interested in and talk about how you use your lab to learn it. Good communication skills will get you further than good technical skills.