These simple changes could make AI research more energy efficient

Since the first paper examining the technology’s impact on the environment was published three years ago, researchers have embarked on a campaign to self-report the energy consumed and the emissions produced by their work. Having accurate numbers is an important step toward making a difference, but actually collecting those numbers can be a challenge.

“You can’t improve what you can’t measure,” said Jesse Dodge, a research scientist at the Allen Institute for Artificial Intelligence in Seattle. “For us, if we want to make progress in reducing emissions, the first step is measuring well.”

To that end, the Allen Institute recently partnered with Microsoft, the artificial intelligence company Hugging Face, and three universities to create a tool that measures the electricity consumption of any machine learning program running on Azure, Microsoft’s cloud service. With it, Azure users building new models can see the total electricity consumed by graphics processing units (GPUs), the computer chips designed to run computations in parallel, at each stage of a project, from selecting a model to training it and putting it to use. Microsoft is the first major cloud provider to give users access to information about the energy impact of their machine learning programs.

While tools already exist to measure the energy use and emissions of machine learning algorithms running on local servers, those tools don’t work when researchers use cloud services offered by companies like Microsoft, Amazon, and Google. These services do not give users direct visibility into the GPU, CPU, and memory resources their activities consume, and existing tools such as Carbontracker, experiment-impact-tracker, EnergyVis, and CodeCarbon require those values to provide accurate estimates.

The new Azure tool, which debuted in October, currently reports energy use, not emissions. So Dodge and other researchers figured out how to map energy use to emissions, and they presented a companion paper on that work at FAccT, a major computer science conference, in late June. The researchers used a service called WattTime to estimate the emissions associated with the zip codes of cloud servers running 11 machine learning models.
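The core of such a mapping is simple arithmetic: multiply measured energy use by the carbon intensity of the grid serving the server’s location. The sketch below illustrates that idea only; the function name, region labels, and all intensity figures are hypothetical and are not taken from the paper or from WattTime.

```python
# Minimal sketch of mapping measured energy use to emissions.
# Assumption: we know the grid carbon intensity (grams of CO2 emitted
# per kWh generated) for the region hosting the cloud server.
# All numbers below are illustrative, not real measurements.

def estimate_emissions_kg(energy_kwh: float, intensity_g_per_kwh: float) -> float:
    """Return estimated CO2 emissions in kilograms for a given energy use."""
    return energy_kwh * intensity_g_per_kwh / 1000.0

# Hypothetical regional carbon intensities in g CO2 per kWh.
REGIONS = {
    "hydro-heavy grid": 30.0,
    "mixed grid": 400.0,
    "coal-heavy grid": 800.0,
}

# Illustrative energy total for one training run, as a cloud tool
# like the Azure one might report it.
training_energy_kwh = 1000.0

for region, intensity in REGIONS.items():
    kg = estimate_emissions_kg(training_energy_kwh, intensity)
    print(f"{region}: {kg:.0f} kg CO2")
```

The spread between the hypothetical regions shows why server location matters: the same training run emits far less on a low-carbon grid.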

They found that emissions could be significantly reduced if researchers used servers in specific geographic locations and at specific times of day. Emissions from training small machine learning models can be cut by up to 80% if training starts when more renewable electricity is available on the grid, while emissions from large machine learning models can be cut by over 20% if training is paused when renewable electricity is scarce and restarted when it is more abundant.
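The two scheduling ideas above can be sketched in a few lines, assuming an hourly carbon-intensity forecast for the grid. Everything here is a toy illustration: the forecast values, function names, and threshold are invented for the example and do not reproduce the study’s method.

```python
# Toy sketch of the two scheduling strategies described above, given a
# hypothetical 24-hour forecast of grid carbon intensity (g CO2/kWh).

def best_start_hour(forecast: list, duration: int) -> int:
    """Flexible start: pick the start hour whose contiguous run of
    `duration` hours has the lowest total carbon intensity."""
    totals = [sum(forecast[h:h + duration])
              for h in range(len(forecast) - duration + 1)]
    return totals.index(min(totals))

def run_hours(forecast: list, threshold: float) -> list:
    """Pause and resume: run only during hours whose intensity is
    below the threshold; pause the job otherwise."""
    return [h for h, g in enumerate(forecast) if g < threshold]

# Made-up forecast: the grid is cleanest midday, when solar peaks.
FORECAST = [500, 480, 470, 460, 450, 420, 380, 320, 260, 200,
            160, 140, 130, 140, 170, 220, 300, 380, 440, 470,
            490, 500, 510, 505]

print(best_start_hour(FORECAST, duration=4))     # short job: shift its start
print(run_hours(FORECAST, threshold=250))        # long job: pause/resume
```

The short job simply moves into the cleanest four-hour window, while the long job runs only during the below-threshold hours, mirroring the small-model and large-model strategies the study compared.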
