
These simple changes could make AI research much more energy efficient


Since the first paper studying the technology's impact on the environment was published three years ago, a movement has grown among researchers to self-report the energy consumed and the emissions generated by their work. Having accurate numbers is an important step toward making changes, but actually gathering those numbers can be challenging.

“The first step for us, if we want to make progress in reducing emissions, is that we have to get a good measurement,” says Jesse Dodge, a research scientist at the Allen Institute for AI in Seattle.

To that end, the Allen Institute recently partnered with Microsoft, the AI company Hugging Face, and three universities to create a tool that measures the electricity usage of any machine learning program running on Azure, Microsoft's cloud service. With it, Azure users building new models can see the total electricity consumed by graphics processing units (GPUs), the computer chips that specialize in running calculations in parallel, during every phase of their project, from selecting a model to training it and putting it to use. Microsoft is the first major cloud provider to give users access to information about the energy impact of their machine learning programs.

While tools exist to measure the energy use and emissions of machine learning algorithms running on local servers, those tools break down when researchers use cloud services provided by companies like Microsoft, Amazon, and Google. Those services do not give users direct visibility into the GPU, CPU, and memory resources their jobs consume, and existing tools such as Carbontracker, experiment-impact-tracker, EnergyVis, and CodeCarbon need those values to produce accurate estimates.
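At their core, tools like these sample the power draw of the hardware at regular intervals and integrate it into energy consumed. The sketch below (with hypothetical numbers, and a hypothetical `energy_kwh` helper) illustrates that idea; on real local hardware the samples would come from interfaces like NVIDIA's NVML, which is exactly what cloud providers generally do not expose to users.

```python
# Sketch of how energy-measurement tools work: sample power draw (watts)
# at fixed intervals, then integrate the samples into energy (kWh).

def energy_kwh(power_samples_watts, interval_s):
    """Integrate power samples (W), taken every interval_s seconds, into kWh."""
    joules = sum(power_samples_watts) * interval_s  # energy = power x time
    return joules / 3.6e6  # 1 kWh = 3.6 million joules

# Hypothetical trace: a GPU drawing ~300 W, sampled once per second for a minute.
samples = [300.0] * 60
print(energy_kwh(samples, 1.0))  # 300 W for 60 s = 18,000 J = 0.005 kWh
```

Without access to the underlying power counters on a cloud VM, there is nothing to feed into such an integration, which is why a provider-side tool like Azure's matters.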

The new Azure tool, which launched in October, reports energy use but not emissions. So Dodge and other researchers figured out how to map energy use to emissions, and they presented a companion paper on that work at FAccT, a major computer science conference, in late June. The researchers used a service called WattTime to estimate emissions based on the zip codes of the cloud servers running 11 machine learning models.

They found that emissions could be significantly reduced if researchers used servers in specific geographic locations and at certain times of day. Emissions from training small machine learning models could be cut by up to 80% if training starts when more renewable electricity is on the grid, while emissions from large models could be cut by more than 20% if the training job is paused when renewable electricity is scarce and restarted when it is more abundant.
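The reasoning behind time-shifting can be sketched with a toy calculation: total emissions are the sum, over each hour a job runs, of the energy used that hour times the grid's carbon intensity that hour. The numbers and the intensity profile below are made up for illustration (real hourly intensity data would come from a service like WattTime), but they show how starting the same job at a cleaner hour shrinks its footprint.

```python
# Toy model: emissions = sum of (energy per hour x grid carbon intensity
# that hour). All figures are hypothetical.

def emissions_g(energy_per_hour_kwh, intensity_by_hour, start_hour):
    """Total grams of CO2 for a job consuming a given energy each hour it runs."""
    total = 0.0
    for i, kwh in enumerate(energy_per_hour_kwh):
        hour = (start_hour + i) % 24
        total += kwh * intensity_by_hour[hour]
    return total

# Hypothetical daily profile: cleaner grid midday (solar), dirtier overnight.
intensity = [500] * 8 + [200] * 8 + [500] * 8  # g CO2/kWh for hours 0-23
job = [2.0] * 6                                # a 6-hour job using 2 kWh/hour

print(emissions_g(job, intensity, start_hour=0))  # overnight start: 6000 g
print(emissions_g(job, intensity, start_hour=9))  # midday start: 2400 g
```

The same logic extends to pausing: skipping the dirty hours of the profile and resuming during the clean ones lowers the sum, which is the strategy the paper reports for large models.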


