Researcher: AI model inference consumes more power, and industry electricity consumption in 2027 will be comparable to that of the Netherlands
IT House reported on October 13 that Joule, a sister journal of Cell, published a paper this week titled "The growing energy footprint of artificial intelligence."
The paper was written by Alex De Vries, founder of the research organization Digiconomist. He argues that AI inference may consume large amounts of electricity in the future, estimating that by 2027 the electricity consumption of artificial intelligence could be comparable to the Netherlands' usage for an entire year.
Alex De Vries noted that the outside world has generally assumed that training an AI model is "the most power-consuming stage of AI." However, citing reports from SemiAnalysis and Google, he pointed out that the inference stage consumes more power: from 2019 to 2021, 60% of Google's AI-related energy consumption came from model inference.
▲ Image source: the paper
Alex De Vries also ran a calculation based on the 18.3 TWh of electricity consumption Google disclosed for 2021, estimating that AI accounted for 10%-15% of Google's overall power consumption at the time. If Google were to fully deploy "AI search," the company's AI-related power consumption could reach as much as 27.4 TWh, close to the electricity Ireland uses in an entire year (29.3 TWh).
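The scale of that jump can be checked with back-of-envelope arithmetic using only the figures quoted above (18.3 TWh total, a 10%-15% AI share, and the 27.4 TWh "AI search" scenario); this is an illustrative sketch, not a calculation from the paper itself:

```python
# Figures quoted in the article
google_total_twh = 18.3            # Google's disclosed 2021 electricity use
ai_share_low, ai_share_high = 0.10, 0.15

# Implied AI-related consumption in 2021
ai_now_low = google_total_twh * ai_share_low     # lower estimate, in TWh
ai_now_high = google_total_twh * ai_share_high   # higher estimate, in TWh

# Projected worst-case consumption if "AI search" were fully deployed
projected_ai_twh = 27.4
ireland_twh = 29.3                 # Ireland's annual consumption, for scale

# Even against the higher current estimate, the projection is ~10x larger
growth_factor = projected_ai_twh / ai_now_high

print(f"AI today: {ai_now_low:.2f}-{ai_now_high:.2f} TWh")
print(f"Projected: {projected_ai_twh} TWh (~{growth_factor:.1f}x growth)")
```

The implied 1.8-2.7 TWh today versus 27.4 TWh projected shows why the scenario is described as a roughly order-of-magnitude increase.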
With NVIDIA partner Taiwan Semiconductor Manufacturing Company (TSMC) planning to bring a new CoWoS packaging plant into mass production in 2027, the entire market is expected to change. By Alex De Vries's estimate, the total power consumption of all AI servers shipped by NVIDIA could by then reach 85-134 TWh per year, equivalent to the Netherlands' annual electricity usage.
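The lower end of that range can be reproduced with a simple calculation. The server count and per-server power draw below are assumptions for illustration (figures widely reported alongside De Vries's estimate, not stated in this article): roughly 1.5 million DGX-class AI servers at about 6.5 kW each, running around the clock.

```python
# Assumed inputs -- NOT from the article; illustrative values consistent
# with reporting on De Vries's estimate
servers = 1_500_000        # assumed annual NVIDIA AI-server shipments by 2027
kw_per_server = 6.5        # assumed power draw of one DGX-class server, kW
hours_per_year = 8760      # 24 * 365

# kW * hours = kWh; divide by 1e9 to convert kWh to TWh
twh = servers * kw_per_server * hours_per_year / 1e9

print(f"Estimated annual consumption: {twh:.1f} TWh")
```

The result, about 85 TWh, matches the low end of the 85-134 TWh range; the higher figure follows from more aggressive assumptions about shipment volume and utilization.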
▲ Image source: the paper
Alex De Vries believes the artificial intelligence industry should improve both hardware efficiency and the efficiency of model algorithms to reduce its energy consumption. He also suggests that developers should not only optimize AI systems but also question whether AI is actually necessary for a given task, as a way of lowering the industry's energy costs.