During my master's degree I constantly ran compute-heavy jobs, as did all my peers

The reality is that more of us than not are using huge HPC clusters or cloud computing for many hours on each project

The industry is just GPUs going BRRR

I’m wondering whether this has implications for how ML is perceived in society as AI/ML becomes more mainstream

I could see this narrative being played up easily in legacy media

PS - yeah, while there are researchers trying to make things more efficient, the general trend is that we are using more GPU hours per year in order to keep innovating at the forefront of artificial intelligence

  • CatalyticDragon@alien.topB · 1 year ago

    No reason to assume so. The largest players in AI/cloud, Google and Microsoft, are firmly on track to become carbon neutral and are making significant investments in renewable energy.

    Using energy isn’t the same thing as creating emissions – it depends on your source.
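
    A minimal back-of-the-envelope sketch of that point. The GPU-hour count, per-GPU wattage, and grid-intensity figures below are rough illustrative assumptions (approximate lifecycle estimates), not measurements from any real cluster:

    ```python
    # Same energy use, very different emissions depending on the grid mix.
    # Intensity figures are rough lifecycle estimates in gCO2e per kWh,
    # for illustration only.
    GRID_INTENSITY_G_PER_KWH = {
        "coal": 820,
        "natural gas": 490,
        "solar": 40,
        "wind": 11,
    }

    def training_emissions_kg(gpu_hours: float, watts_per_gpu: float, source: str) -> float:
        """Estimate kg of CO2e for a training run powered by a single source."""
        energy_kwh = gpu_hours * watts_per_gpu / 1000.0      # Wh -> kWh
        return energy_kwh * GRID_INTENSITY_G_PER_KWH[source] / 1000.0  # g -> kg

    # Hypothetical run: 10,000 GPU-hours at ~400 W per GPU
    for source in GRID_INTENSITY_G_PER_KWH:
        print(f"{source:12s} {training_emissions_kg(10_000, 400, source):8.0f} kg CO2e")
    ```

    The spread between the first and last lines of that output is the whole point: the GPU hours are identical, only the energy source changes.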

    Machine learning can also streamline many energy-intensive operations. One recent example is DeepMind generating an accurate 10-day weather forecast in under a minute, a task that used to take hours of computation.

    Or significantly speeding up drug discovery and materials research, cutting out lengthy rounds of experimentation.