
Is Nvidia’s era of dominance over? ChatGPT sets off a chip war between Google and Microsoft, and Amazon also joins the game

王林 · 2023-05-22

After ChatGPT took off, the AI war between the two giants Google and Microsoft has spread to a new battlefield: server chips.

Today, AI and cloud computing have become battlegrounds, and chips have also become the key to reducing costs and winning business customers.

Originally, major companies such as Amazon, Microsoft, and Google were all famous for their software. But now, they have spent billions of dollars on chip development and production.

[Image: AI chips developed by the major technology giants]

ChatGPT explodes in popularity, and the major manufacturers start a chip race

According to reports from The Information and other foreign media, these three companies have now launched, or plan to release, eight server and AI chips for internal product development, for rental through their cloud services, or both.

“If you can make silicon that’s optimized for AI, you’ve got a huge win ahead of you,” said Glenn O’Donnell, a director at research firm Forrester.

Will these huge efforts be rewarded?

The answer is, not necessarily.


Intel, AMD, and Nvidia can benefit from economies of scale, but for the big tech companies the economics are far less favorable.

They also face thorny challenges, such as hiring chip designers and convincing developers to build applications using their custom chips.

However, major manufacturers have made impressive progress in this field.

According to published performance data, Amazon’s Graviton server chips, as well as the AI-specific chips released by Amazon and Google, are already comparable in performance to those from traditional chipmakers.

There are two main types of chips developed by Amazon, Microsoft and Google for their data centers: standard computing chips and specialized chips used to train and run machine learning models. It is the latter that powers large language models like ChatGPT.

Earlier, Apple successfully developed chips for the iPhone, iPad, and Mac that improved the handling of some AI tasks. These big manufacturers may have drawn inspiration from Apple.

Among the three, Amazon is the only cloud service provider offering both types of chips in its servers. Its 2015 acquisition of the Israeli chip designer Annapurna Labs laid the foundation for these efforts.

Google launched a chip for AI workloads in 2015 and is developing a standard server chip to improve server performance in Google Cloud.

In contrast, Microsoft's chip research and development started later, in 2019. Recently, Microsoft has accelerated the launch of an AI chip designed specifically for LLMs.

The popularity of ChatGPT has ignited excitement about AI among users around the world, which has further accelerated the three companies' strategic shift.

ChatGPT runs on Microsoft's Azure cloud and uses tens of thousands of Nvidia A100s. Both ChatGPT and the other OpenAI software integrated into Bing and various programs require so much computing power that Microsoft has had to reallocate server hardware to the internal teams developing AI.

At Amazon, Chief Financial Officer Brian Olsavsky told investors on an earnings call last week that the company plans to shift spending from its retail business to AWS, in part by investing in the infrastructure needed to support ChatGPT-style workloads.

At Google, the engineering team responsible for building the Tensor Processing Unit has been moved into Google Cloud. The cloud organization reportedly can now set the roadmap for TPUs and the software that runs on them, in the hope of getting cloud customers to rent more TPU-powered servers.

Google: TPU v4, specially tuned for AI

As early as 2020, Google had deployed what was then the most powerful AI chip, the TPU v4, in its own data centers.

However, it was not until April 4 this year that Google announced the technical details of this AI supercomputer for the first time.


Compared with TPU v3, TPU v4's per-chip performance is 2.1 times higher, and after integrating 4,096 chips, the supercomputer's performance improves 10-fold.

At the same time, Google claims its own chip is faster and more energy-efficient than Nvidia's A100: for systems of comparable size, TPU v4 delivers 1.7 times the performance of the A100 while improving energy efficiency by 1.9 times.

For similarly sized systems, TPU v4 is 1.15 times faster than the A100 on BERT and roughly 4.3 times faster than the IPU. On ResNet, TPU v4 is 1.67 times and roughly 4.5 times faster, respectively.
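As a sanity check, the ratios quoted above can be combined into a single normalized comparison. This is a quick sketch using only the article's claimed figures, not independent measurements:

```python
# Relative training throughput implied by the figures above, normalized so
# that the NVIDIA A100 = 1.0. These are the article's claimed ratios, not
# independently measured numbers.
claims = {
    "BERT":   {"TPU v4 vs A100": 1.15, "TPU v4 vs IPU": 4.3},
    "ResNet": {"TPU v4 vs A100": 1.67, "TPU v4 vs IPU": 4.5},
}

for model, ratios in claims.items():
    tpu = ratios["TPU v4 vs A100"]       # A100 normalized to 1.0
    ipu = tpu / ratios["TPU v4 vs IPU"]  # implied IPU throughput
    print(f"{model}: A100=1.00, TPU v4={tpu:.2f}, IPU={ipu:.2f}")
```

On these numbers, TPU v4's lead over the IPU is much larger than its lead over the A100, which is consistent with Google choosing Nvidia as the benchmark to beat.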


Separately, Google has hinted that it is working on a new TPU to compete with the Nvidia H100. Google researcher Jouppi said in an interview with Reuters that Google has "a production line for future chips."

Microsoft: Secret Weapon Athena

In any case, Microsoft is still eager to compete in this chip battle.

Earlier, news broke that a 300-person team secretly assembled by Microsoft had begun developing a custom chip called "Athena" in 2019.


According to the original plan, "Athena" would be built on TSMC's 5nm process and was expected to cut the cost of each chip by a third.

If it can be deployed at scale next year, Microsoft's internal teams and OpenAI could use "Athena" for both model training and inference.

That would greatly ease the shortage of dedicated AI hardware.

Bloomberg reported last week that Microsoft's chip division was cooperating with AMD to develop the Athena chip, a report that sent AMD's stock price up 6.5% on Thursday.

But an insider said AMD is not involved in Athena; rather, it is developing its own GPU to compete with Nvidia's, and it has been discussing that chip's design with Microsoft because Microsoft expects to buy the GPU.

Amazon: Already in the Lead

In the chip race with Microsoft and Google, Amazon appears to have taken the lead.

Over the past decade, Amazon has maintained a competitive advantage over Microsoft and Google in cloud computing services by providing more advanced technology and lower prices.

Over the next ten years, Amazon is also expected to maintain its competitive edge through its internally developed server chip, Graviton.

As the latest generation of the processor line, AWS Graviton3 improves compute performance by up to 25% over the previous generation and floating-point performance by up to 2x. It also supports DDR5 memory, which offers 50% more bandwidth than DDR4.

For machine learning workloads, AWS Graviton3 delivers up to 3x better performance than the previous generation and supports bfloat16.
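The bfloat16 support matters because the format halves memory and bandwidth needs relative to float32 while keeping float32's 8-bit exponent, and thus its dynamic range. A back-of-envelope footprint comparison (an illustrative sketch, not an AWS benchmark):

```python
# Memory footprint of model weights at different precisions. bfloat16 keeps
# float32's 8 exponent bits (same dynamic range) but truncates the mantissa,
# so each parameter needs 2 bytes instead of 4.
params = 1_000_000_000       # a hypothetical 1-billion-parameter model

bytes_fp32 = params * 4      # float32: 4 bytes per parameter
bytes_bf16 = params * 2      # bfloat16: 2 bytes per parameter

print(f"float32:  {bytes_fp32 / 2**30:.2f} GiB")
print(f"bfloat16: {bytes_bf16 / 2**30:.2f} GiB")
```

Halving the bytes per parameter also halves the memory traffic per training step, which is often the real bottleneck on ML workloads.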


Cloud services based on Graviton3 chips are so popular in some regions that demand has outstripped supply.

Another of Amazon's advantages is that it is currently the only cloud vendor offering both standard computing chips (Graviton) and AI-specific chips (Inferentia and Trainium) in its servers.

As early as 2019, Amazon launched its own AI inference chip, Inferentia.

It allows customers to run large-scale machine learning inference applications such as image recognition, speech recognition, natural language processing, personalization and fraud detection in the cloud at low cost.

The latest Inferentia2 improves compute performance 3x, expands total accelerator memory 4x, increases throughput 4x, and cuts latency to one-tenth.


After the launch of the first-generation Inferentia, Amazon released a custom chip designed mainly for AI training: Trainium.

It is optimized for deep learning training workloads, including image classification, semantic search, translation, speech recognition, natural language processing and recommendation engines, etc.


In some cases, custom chips can not only reduce costs by an order of magnitude but also cut energy consumption to one-tenth, and these customized solutions let Amazon offer customers better service at lower latency.

It’s not that easy to shake NVIDIA’s monopoly

But so far, most AI workloads still run on GPUs, and Nvidia produces most of those chips.

According to earlier reports, Nvidia holds 80% of the discrete GPU market and 90% of the high-end GPU market.

In 2020, 80.6% of the cloud computing and data centers running AI worldwide were powered by Nvidia GPUs. In 2021, Nvidia said that about 70% of the world's top 500 supercomputers were driven by its chips.

And now, even the Microsoft data center running ChatGPT uses tens of thousands of NVIDIA A100 GPUs.

For a long time, whether it is the headline-grabbing ChatGPT or models such as Bard and Stable Diffusion, the computing power behind them has been supplied by the Nvidia A100, a chip worth about US$10,000 apiece.


Not only that, the A100 has become the workhorse of AI professionals. The State of AI Report 2022 also lists some of the companies using A100 supercomputers.


It is obvious that Nvidia has cornered the market for AI computing power and dominates the field with its chips.

According to practitioners, the application-specific integrated circuit (ASIC) chips that Amazon, Google, and Microsoft have been developing can perform machine learning tasks faster than general-purpose chips, and at lower power consumption.

O'Donnell offered this comparison between GPUs and ASICs: "For normal driving, you can use a Prius, but if you have to use four-wheel drive on a mountain, a Jeep Wrangler would be more suitable."


But despite all their efforts, Amazon, Google, and Microsoft still face the challenge of convincing developers to actually use these AI chips.

For now, Nvidia's GPUs are dominant, and developers are already familiar with CUDA, its proprietary programming language for building GPU-driven applications.

If they switch to custom chips from Amazon, Google or Microsoft, they will need to learn a new software language. Will they be willing?
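A toy sketch of that lock-in problem, and of how frameworks mitigate it (the backend names and functions here are hypothetical stand-ins, purely for illustration): code written directly against one vendor's toolchain must be rewritten for another, so frameworks hide the chip behind a device or backend abstraction.

```python
# Hypothetical illustration of vendor lock-in. Each "backend" stands in for
# a different vendor-specific toolchain (e.g. CUDA for Nvidia GPUs, Neuron
# for AWS Inferentia/Trainium, XLA for Google TPUs).

def matmul_reference(a, b):
    """Pure-Python stand-in for a vendor-optimized matrix multiply."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

# In reality each entry would be a separately written, vendor-specific
# code path; here they all share the reference implementation.
BACKENDS = {
    "cuda": matmul_reference,    # Nvidia GPU path
    "neuron": matmul_reference,  # AWS Inferentia/Trainium path
    "xla": matmul_reference,     # Google TPU path
}

def matmul(a, b, device="cuda"):
    # Application code targets the abstraction, not the vendor API, so
    # switching chips becomes a one-line change instead of a rewrite.
    return BACKENDS[device](a, b)

print(matmul([[1, 2]], [[3], [4]], device="neuron"))
```

The cloud vendors' bet is that if the abstraction layer is good enough, developers never need to learn the chip-specific language underneath, and the switching cost away from CUDA disappears.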


Statement: This article is reproduced from 51cto.com.