In my master's degree I ran a lot of heavy computations, as did all my peers.
The reality is that more of us than not are using huge HPC clusters / cloud computing for many hours on each project.
The industry is just GPUs going BRRR
I'm wondering whether this has implications for how ML is perceived as AI/ML becomes more mainstream.
I could see this narrative being easily played up in legacy media.
P.S. Yeah, while there are researchers trying to make things more efficient, the general trend is that we are using more GPU hours per year to keep innovating at the frontier of artificial intelligence.
It's an interesting question. IMO, if a narrative like this were pushed by the media, it would be unlikely to gain much traction. Energy usage, whether for AI development or for cooking Hot Pockets, isn't inherently bad for the environment. Most methods of power generation have some CO2 impact, but overall environmental impact is more a question of grid efficiency and sustainable generation capacity than of some arbitrary increase in energy demand on the grid.
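To put that last point in rough numbers: here's a minimal back-of-envelope sketch of why the same GPU workload can have very different footprints depending on the grid it draws from. All figures (GPU hours, power draw, grid carbon intensities) are hypothetical placeholders, not measured data.

```python
# Back-of-envelope sketch: emissions depend on grid carbon intensity,
# not on energy demand alone. All numbers below are hypothetical.

def training_emissions_kg(gpu_hours: float,
                          watts_per_gpu: float,
                          grid_kg_co2_per_kwh: float) -> float:
    """Estimate CO2 (kg) for a run: energy (kWh) x grid carbon intensity."""
    energy_kwh = gpu_hours * watts_per_gpu / 1000
    return energy_kwh * grid_kg_co2_per_kwh

# Same workload (~3,000 kWh), two different grids:
run = dict(gpu_hours=10_000, watts_per_gpu=300)
print(training_emissions_kg(**run, grid_kg_co2_per_kwh=0.7))   # coal-heavy grid
print(training_emissions_kg(**run, grid_kg_co2_per_kwh=0.02))  # mostly renewables
```

Same energy demand, orders of magnitude apart in emissions, which is why the sustainability of the generation mix matters more than raw GPU hours.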