How Green AI Addresses Impact on Climate Change
The development of computationally intensive technologies such as machine learning carries a high carbon footprint and contributes to climate change. Alongside this rapid growth, however, an expanding portfolio of green AI tools and techniques is emerging to help cut carbon emissions and put development on a more sustainable path.
The environmental costs are high, according to research published last month by Microsoft and the Allen Institute for Artificial Intelligence, along with co-authors from Hebrew University, Carnegie Mellon University and the AI community Hugging Face. Will Buchanan, product manager for Azure Machine Learning at Microsoft, a member of the Green Software Foundation and a co-author of the study, said the study's data show that a single training run of a 6 billion-parameter ML model (a large language model) produces as much carbon dioxide as burning all the coal in a large rail car.
Forrester Research analyst Abhijit Sunil said that in the past, code optimization was reserved for embedded systems constrained by limited resources, such as cell phones, refrigerators or satellites. Emerging technologies such as AI and ML, however, are not subject to these limitations, he said.
“When we have seemingly unlimited resources, the priority is to write as much code as possible,” Sunil said.
Green artificial intelligence, the process of making artificial intelligence development more sustainable, is emerging as a possible solution to the problem of algorithmic power consumption. "This is all about reducing the hidden costs of technology development itself," Buchanan said.
Abhishek Gupta, founder and principal researcher of the Montreal AI Ethics Institute and chair of the Green Software Foundation's standards working group, said the starting point for any developer is to ask whether artificial intelligence is right for the job and why machine learning is being deployed in the first place.
“You don’t always need machine learning to solve a problem,” Gupta said.
Gupta said developers should also consider conducting a cost-benefit analysis when deploying ML. For example, if machine learning is used to increase satisfaction with a platform from 95 percent to 96 percent, that may not be worth the extra cost to the environment, he said.
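Gupta's cost-benefit point can be made concrete with a rough back-of-the-envelope calculation. The figures below (the dollar value of a satisfaction point, the emissions of a training run, a carbon price) are purely illustrative assumptions, not numbers from the study:

```python
# Illustrative cost-benefit sketch for an ML deployment decision.
# All numbers are hypothetical assumptions for demonstration only.

def ml_worth_deploying(satisfaction_gain_pct: float,
                       value_per_pct: float,
                       training_emissions_kg: float,
                       carbon_price_per_kg: float) -> bool:
    """Return True if the estimated benefit outweighs the carbon cost."""
    benefit = satisfaction_gain_pct * value_per_pct
    carbon_cost = training_emissions_kg * carbon_price_per_kg
    return benefit > carbon_cost

# A 1-point satisfaction gain (95% -> 96%) valued at $500, against a
# training run emitting 10,000 kg of CO2 priced at $0.10 per kg:
print(ml_worth_deploying(1.0, 500.0, 10_000.0, 0.10))  # False: not worth it
```

With these assumed numbers the 1-point gain costs twice what it is worth, matching Gupta's caution that a marginal improvement may not justify the environmental price.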
Once developers decide to use AI, choosing to deploy models in carbon-friendly regions has the greatest impact on operational emissions, reducing the software's carbon intensity by approximately 75%, Buchanan said.
Buchanan said: "This is the most influential lever that any developer can use today."
Gupta gave an example: developers can choose to run workloads in Quebec, Canada, rather than in the U.S. Midwest, where electricity comes primarily from fossil fuels. More than 90% of Quebec's electricity comes from hydroelectric power.
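The effect of region choice boils down to a simple formula: operational emissions ≈ energy consumed × grid carbon intensity. The grid-intensity values below are rough illustrative approximations, not official statistics:

```python
# Rough sketch: operational emissions = energy used * grid carbon intensity.
# Grid intensities are illustrative approximations (gCO2-eq per kWh).
GRID_INTENSITY = {
    "quebec_hydro": 30,   # assumed: mostly hydroelectric generation
    "us_midwest": 700,    # assumed: largely fossil-fuel generation
}

def training_emissions_kg(energy_kwh: float, region: str) -> float:
    """Estimate CO2-equivalent emissions in kg for a training job."""
    return energy_kwh * GRID_INTENSITY[region] / 1000.0

energy = 5_000.0  # hypothetical energy use of one training run, in kWh
quebec = training_emissions_kg(energy, "quebec_hydro")
midwest = training_emissions_kg(energy, "us_midwest")
print(f"Quebec: {quebec:.0f} kg, Midwest: {midwest:.0f} kg")
```

Under these assumptions the same training run emits roughly 96% less CO2 in Quebec than in a coal-heavy grid region, which is why region choice is such a powerful lever.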
When deciding where machine learning jobs should run, companies must also consider factors beyond the type of energy source. In April 2021, Google Cloud launched a Green Zone Selector to help companies evaluate cost, latency, and carbon footprint when choosing where to operate. But not all cloud providers have such tools readily available, Buchanan said.
To address this gap, the Green Software Foundation is developing a new tool called the Carbon Aware SDK that will recommend the best regions in which to launch resources, he said. An alpha version should be available in the next few months.
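A tool of this kind can be sketched as a weighted ranking over candidate regions, trading off the cost, latency and carbon factors the article mentions. The region names, metric values and weights below are all hypothetical, not taken from any real provider or SDK:

```python
# Hypothetical sketch of carbon-aware region selection: rank candidate
# regions by a weighted score over cost, latency and carbon intensity.
# All region data and weights are made-up illustrative values.

REGIONS = {
    # region: (relative cost, latency in ms, grid gCO2/kWh) -- assumptions
    "region-a": (1.00, 20, 650),
    "region-b": (1.10, 45, 40),
    "region-c": (0.95, 90, 300),
}

def best_region(w_cost=0.3, w_latency=0.3, w_carbon=0.4) -> str:
    """Pick the region with the lowest weighted, normalized score."""
    # Normalize each metric by the worst candidate so scores are comparable.
    max_cost = max(m[0] for m in REGIONS.values())
    max_lat = max(m[1] for m in REGIONS.values())
    max_co2 = max(m[2] for m in REGIONS.values())

    def score(metrics):
        cost, latency, carbon = metrics
        return (w_cost * cost / max_cost
                + w_latency * latency / max_lat
                + w_carbon * carbon / max_co2)

    return min(REGIONS, key=lambda r: score(REGIONS[r]))

print(best_region())  # prints "region-b": carbon weighted highest, so low-carbon wins
```

Shifting the weights changes the answer: with cost weighted at 100%, the cheapest region wins instead, which is the evaluation trade-off such selectors expose.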
If the only available compute is in regions with carbon-intensive power, Gupta said, developers can use federated learning-style deployments, in which training is performed in a distributed fashion across the devices in the system. But federated learning may not suit all workloads, such as those that must satisfy legal privacy requirements. Another option, Gupta said, is tinyML, which shrinks machine learning models through quantization, knowledge distillation and other methods. The goal, he said, is to minimize models so they can be deployed in a more resource-efficient way, such as on edge devices. But because these models provide limited intelligence, they may not be suitable for complex use cases.
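The quantization idea behind tinyML can be illustrated with a minimal toy sketch: store floating-point weights as 8-bit integers plus a single scale factor, cutting memory roughly fourfold at the cost of a small rounding error. This is an illustration of the general technique, not the API of any specific tinyML framework:

```python
# Toy illustration of post-training quantization: map float weights to
# int8-range values plus one scale factor (symmetric quantization).

def quantize(weights):
    """Return (quantized values in [-127, 127], scale) for float weights."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Reconstruct approximate float weights from quantized values."""
    return [v * scale for v in q]

weights = [0.82, -1.27, 0.05, 0.33]
q, scale = quantize(weights)
restored = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(q)                 # small integers representable in one byte each
print(max_err <= scale)  # rounding error is bounded by the scale factor
```

Each weight now fits in one byte instead of four, at the cost of precision no finer than the scale factor; knowledge distillation attacks the same size problem from a different angle, by training a small model to mimic a large one.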
“The trend across the industry is to think bigger is better, but our research shows you can counter that and make it clear you need to choose the right tool for the job,” Buchanan said.
Consumption metrics could be the solution
Microsoft, for example, last year made energy consumption metrics available in Azure Machine Learning, letting developers pinpoint their most energy-intensive work. These metrics focus on the power-hungry GPU, which is faster than the CPU but can consume more than 10 times the energy. GPUs, which commonly run AI models, are often the biggest culprit in power consumption, Buchanan said.
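At their core, metrics like these integrate power draw over time for each job. A minimal sketch, with made-up job names and power readings rather than any real Azure telemetry, might aggregate sampled GPU power into kilowatt-hours and rank jobs by consumption:

```python
# Sketch: turn per-job GPU power samples into energy totals and rank jobs.
# Job names and power readings below are hypothetical.

def energy_kwh(power_samples_w, interval_s):
    """Approximate energy as the sum of (power * sample interval), in kWh."""
    return sum(power_samples_w) * interval_s / 3_600_000.0  # W*s -> kWh

jobs = {
    "train-large-lm": [310.0, 305.0, 320.0],   # watts, sampled every 60 s
    "batch-inference": [120.0, 118.0, 121.0],
}
totals = {name: energy_kwh(samples, 60) for name, samples in jobs.items()}
ranked = sorted(totals, key=totals.get, reverse=True)
print(ranked[0])  # prints "train-large-lm", the most energy-hungry job
```

Ranking jobs this way is what lets a developer see which workloads are worth optimizing, moving or quantizing first.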
However, more interoperable tools are still needed, Buchanan said, referring to the fragmented green AI tooling currently available. "The Green Software Foundation is working on one," he said, "but I think cloud providers need to make coordinated investments to improve energy efficiency."
The ultimate goal, Gupta said, is to trigger behavior change so that green AI practices become the norm. "We're not just doing this for accounting purposes," he said.