During my master's degree I constantly ran heavy computations, as did all my peers.

The reality is that more of us than not are using huge HPC clusters / cloud computing for many hours on each project.

The industry is just GPUs going BRRR

I'm wondering whether this has implications for how society views ML as AI/ML becomes more mainstream.

I could see this narrative being easily played in legacy media

PS: yes, there are researchers trying to make things more efficient, but the general trend is that we use more GPU hours every year in order to keep innovating at the forefront of artificial intelligence.

  • the_fart_king_farts@alien.topB
    2 years ago

I think server clusters already account for around 2-3% of energy usage. That's roughly on the level of the airline industry.

Analogue chips will probably help keep this from accelerating more than absolutely necessary over the next decade or two.