this post was submitted on 16 Nov 2023 to the Machine Learning community

In my master's degree I always ran many computations, as did all my peers

The reality is that more of us than not are using huge HPC clusters / cloud computing for many hours on each project

The industry is just GPUs going BRRR

I’m wondering if this has potential implications for ML in society as AI/ML becomes more mainstream

I could see this narrative being played up easily in legacy media

PS - yeah, while there are researchers trying to make things more efficient, the general trend is that we are using more GPU hours per year in order to keep innovating at the forefront of artificial intelligence

[–] LanchestersLaw@alien.top 1 points 10 months ago

Training models uses lots of energy, but so does every other human activity. That holds even for large companies like Google.

Google used 15,439 GWh in 2020, which is about 55.6 million GJ. Average per-capita US energy consumption is 311 GJ per year, so Google's usage works out to the energy equivalent of roughly 178,715 people. Google had about 135,300 employees in 2020, so that is barely above average energy usage per person, and probably below average given the income Google makes. Those 135,300 employees probably easily exceed the company's energy usage with their normal household consumption.
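
A quick back-of-the-envelope check of those figures (just a sketch; the 15,439 GWh, 311 GJ, and 135,300-employee numbers are taken from the paragraph above, and 1 GWh = 3,600 GJ is the standard conversion):

```python
# Rough sanity check of the energy comparison above.
GWH_TO_GJ = 3_600                    # 1 GWh = 3.6 TJ = 3,600 GJ

google_energy_gwh = 15_439           # Google's reported 2020 energy use
us_per_capita_gj = 311               # average US per-capita energy use per year
google_employees_2020 = 135_300

google_energy_gj = google_energy_gwh * GWH_TO_GJ           # ~55.6 million GJ
people_equivalent = google_energy_gj / us_per_capita_gj    # ~178,715 people
per_employee_gj = google_energy_gj / google_employees_2020 # ~411 GJ

print(f"Google 2020 energy: {google_energy_gj:,.0f} GJ")
print(f"Equivalent to ~{people_equivalent:,.0f} average US residents")
print(f"Per employee: {per_employee_gj:,.0f} GJ vs {us_per_capita_gj} GJ per-capita average")
```

Running it reproduces the ~178,715-person figure above (and a per-employee figure of roughly 411 GJ against the 311 GJ per-capita average).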