
Google claims that its fourth-generation TPU is better than Nvidia GPU, but the industry believes that the latter's leadership position is difficult to shake

PHPz
2023-05-09 13:10:12

Google recently claimed that it can surpass Nvidia in AI supercomputing. However, industry observers believe the news is unlikely to cause much trouble for Nvidia, the market leader.

Google recently released a research report stating that AI supercomputers powered by its TPUs deliver better performance and energy efficiency than equivalent machines running Nvidia A100 GPUs.

Google strung together 4,000 of its fourth-generation TPUs to build a supercomputer that it says runs 1.7 times faster than an equivalent machine built from Nvidia A100 GPUs while being 1.9 times more energy efficient.


Google's TPU v4-powered supercomputer running in Oklahoma

Nvidia has benefited from the generative AI boom, which has sent demand for its A100 GPUs surging. The A100 is used mainly to train large language models such as OpenAI's GPT-4.

The industry view is that with Nvidia's new H100 GPU about to launch, the company is unlikely to be troubled by Google's achievement.

Google has been using TPU v4 internally since 2020 and made the chips available to Google Cloud Platform customers last year. The company's largest language model, PaLM, was trained across two of these 4,000-TPU supercomputers.

Google researcher Norm Jouppi and distinguished engineer David Patterson explained in a blog post about the system: "Optical circuit switching makes it easy to bypass faulty components. This flexibility even allows us to change the topology of the supercomputer interconnect to accelerate the performance of machine learning models."

Mike Orme, who covers the semiconductor market at GlobalData, said that optical circuit switching is the key to the performance of Google's supercomputer. He explained: "While each individual TPU does not process as fast as Nvidia's best AI chips, the optical circuit switching technology Google uses to connect the chips and pass data between them makes up for the difference."
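To make the idea concrete, here is a minimal, purely illustrative Python sketch of circuit switching: the switch is modeled as a reprogrammable port-to-port mapping, so the logical topology can be changed, for example to route around a failed node, without any physical rewiring. The class and node names are invented for this illustration and do not describe Google's actual TPU v4 interconnect.

```python
# Illustrative toy model of circuit switching; not Google's design.
# The switch holds a reconfigurable mapping between ports, so the logical
# topology can be rebuilt in software, e.g. to bypass a failed node.

class CircuitSwitch:
    def __init__(self):
        self.routes = {}          # maps a source node to a destination node

    def connect(self, src, dst):
        self.routes[src] = dst    # establish a path from src to dst

    def reconfigure(self, nodes):
        """Wire the given nodes into a simple ring topology."""
        self.routes.clear()
        for i, node in enumerate(nodes):
            self.connect(node, nodes[(i + 1) % len(nodes)])


# Four accelerator nodes connected in a ring through the switch.
nodes = ["tpu0", "tpu1", "tpu2", "tpu3"]
switch = CircuitSwitch()
switch.reconfigure(nodes)
print(switch.routes)  # {'tpu0': 'tpu1', 'tpu1': 'tpu2', 'tpu2': 'tpu3', 'tpu3': 'tpu0'}

# If tpu2 fails, the switch is simply reprogrammed to route around it;
# the remaining nodes still form a complete ring without any rewiring.
healthy = [n for n in nodes if n != "tpu2"]
switch.reconfigure(healthy)
print(switch.routes)  # {'tpu0': 'tpu1', 'tpu1': 'tpu3', 'tpu3': 'tpu0'}
```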

Nvidia's technology has become the gold standard for training AI models. Some large technology companies have bought thousands of Nvidia A100 GPUs in an attempt to outpace their rivals in the AI race. The supercomputer OpenAI used to train GPT-4 is equipped with 10,000 Nvidia GPUs, each retailing for up to $10,000.
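For a rough sense of scale, a back-of-the-envelope calculation using only the figures cited above (10,000 GPUs at up to $10,000 each at retail) puts the GPU hardware bill alone at around $100 million; the prices actually negotiated for bulk purchases are not public.

```python
# Back-of-the-envelope estimate using only the figures cited above;
# actual bulk-purchase prices negotiated with Nvidia are not public.
gpu_count = 10_000          # GPUs reportedly used to train GPT-4
retail_price_usd = 10_000   # quoted retail price of one A100

total_cost_usd = gpu_count * retail_price_usd
print(f"Estimated GPU hardware cost: ${total_cost_usd:,}")  # $100,000,000
```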


Nvidia A100 GPU

The latest news is that the A100 will soon be succeeded by Nvidia's newest model, the H100. In an inference benchmark report recently released by MLPerf, the open AI engineering consortium that tracks processor performance, the H100 topped the industry rankings for both performance and energy efficiency.


Nvidia H100 GPU

Nvidia claims that the H100 runs nine times faster than the A100 that Google used for its comparison, a speed advantage that would wipe out the gains Google gets from its optical circuit switching technology.

TPUs handle about 90% of Google's AI training, but despite the chip's strength, Orme does not expect Google to offer it to third parties: the company has shown no ambition to compete with Nvidia in the commercial AI chip market, and its TPUs are earmarked for Google's own data centers and AI supercomputers.

Why do few users outside Google use the technology? Orme believes it is because Google Cloud holds only a small share of the public cloud market. According to survey data released by Synergy Research Group, Google Cloud's share is 11%, trailing AWS at 34% and Microsoft Azure at 21%.

At the same time, Google has reached an agreement with Nvidia to offer H100 GPU computing power to Google Cloud customers, a sign that Nvidia will remain the market leader for some time to come, and that even Google cannot do without it.

