During my master's degree I ran a lot of heavy computations, as did all my peers.
The reality is that more of us than not are using huge HPC clusters / cloud computing for many hours on each project.
The industry is just GPUs going BRRR
I'm wondering whether this has implications for how society views ML as AI/ML becomes more mainstream.
I could see this narrative being easily spun in legacy media.
P.S. Yes, there are researchers trying to make things more efficient, but the general trend is that we use more GPU hours per year in order to keep innovating at the forefront of artificial intelligence.
Training models uses lots of energy, but so does every other human activity, even at large companies like Google.
Google used 15,439 GWh in 2020. Average per-capita US energy consumption is 311 GJ, so Google's usage comes out to the energy equivalent of about 178,715 people. Google had 135,300 employees in 2020: barely above-average energy use per employee, and probably below average relative to the income Google generates. Those 135,300 employees probably exceed the company's energy usage with their normal household consumption alone.
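A quick sanity check of that arithmetic (using the figures as stated in the comment, which I haven't independently verified):

```python
# Back-of-envelope check: Google's 2020 energy use vs. US per-capita consumption.
GWH_TO_J = 3.6e12                       # 1 GWh = 3.6e12 joules
google_energy_j = 15_439 * GWH_TO_J     # 15,439 GWh, as stated above
per_capita_j = 311e9                    # 311 GJ average US per-capita consumption

people_equivalent = google_energy_j / per_capita_j
print(round(people_equivalent))         # ~178,715 people

employees = 135_300
print(people_equivalent / employees)    # ~1.32x per-capita average per employee
```

So the company's total footprint is only about 1.3 times the per-capita average multiplied across its headcount, which supports the "barely above average" claim.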