
NVIDIA is not the only beneficiary, AI training also benefits memory chip manufacturers

WBOY
2023-05-31 17:16:58


News on May 30: although the memory chip market is sluggish, demand from artificial intelligence is huge, and it will benefit companies such as Samsung and SK Hynix.

After its earnings report on May 24, Nvidia's market value surged by $207 billion in two days. The semiconductor industry had been in a downturn, but the forecast in this report gave investors great confidence and hope.

If the field of artificial intelligence takes off, traditional technology giants like Microsoft and start-ups such as OpenAI will turn to companies such as Samsung and SK Hynix for help.

To process large amounts of data, analyze video, audio and text, and simulate human creativity, machine learning requires memory chips. In fact, AI companies may be buying more DRAM chips than ever before.

The reason memory chips are in such great demand is simple: unlike a standard CPU, NVIDIA's AI chip can read and process huge amounts of data at once and then output the results immediately. But to exploit that power, the computer has to feed it data quickly and without delay. That is the job of memory chips.

The processor cannot read data directly from the hard disk; that is far too slow and inefficient. Chip manufacturers prefer to spend their precious die area on computing power rather than on large on-chip caches, so the cache can hold only a small fraction of the data. The next best option is DRAM.
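The latency gap behind this hierarchy can be sketched with rough, commonly cited order-of-magnitude figures. The numbers below are illustrative assumptions, not measurements from the article:

```python
# Ballpark access latencies for a single read, in nanoseconds.
# Rough, commonly cited orders of magnitude -- illustrative only.
LATENCY_NS = {
    "on-chip cache": 1,        # L1 cache hit
    "DRAM": 100,               # main memory access
    "SSD": 100_000,            # NVMe random read (~0.1 ms)
    "hard disk": 10_000_000,   # spinning-disk seek (~10 ms)
}

def slowdown_vs_dram(tier: str) -> float:
    """How many times slower a read from `tier` is than a DRAM read."""
    return LATENCY_NS[tier] / LATENCY_NS["DRAM"]

for tier in ("on-chip cache", "DRAM", "SSD", "hard disk"):
    print(f"{tier:13s}: {slowdown_vs_dram(tier):>12,.2f}x DRAM latency")
```

On these assumed figures, a hard-disk read is about 100,000 times slower than a DRAM read, which is why feeding an AI accelerator from disk is a non-starter and large pools of DRAM sit between storage and the processor.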

When training a complex chatbot, the system may need to quickly access billions of messages for batch processing. If there is not enough DRAM, the computer slows down dramatically, and even a top-of-the-line processor that cost $10,000 cannot deliver its full value. Each high-end AI processor may require up to 1TB of installed DRAM, roughly 30 times more than a high-end laptop. Research firm TrendForce says this means that at some point this year, sales of DRAM chips used in servers will exceed those used in smartphones.
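The capacity gap cited above can be checked with simple arithmetic. The 32 GB laptop configuration below is an assumed typical high-end value, not a figure from the article:

```python
GB_PER_TB = 1024  # gigabytes per terabyte, binary convention

ai_server_dram_gb = 1 * GB_PER_TB  # up to 1 TB per high-end AI processor (from the article)
laptop_dram_gb = 32                # assumed high-end laptop configuration

ratio = ai_server_dram_gb / laptop_dram_gb
print(f"The AI server holds {ratio:.0f}x the DRAM of the laptop")
```

With a 32 GB laptop the ratio works out to 32x, consistent with the article's figure of roughly 30 times.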

Artificial intelligence systems also need to save large volumes of output data quickly for rapid reading and writing, so they use NAND chips, the same chips found in smartphones and most laptops. Samsung ranks first in this field worldwide, followed by Kioxia, the memory maker spun off from Japan's Toshiba, and South Korea's SK Hynix.

Last quarter, DRAM and NAND memory chips contributed $8.9 billion in revenue to Samsung, far exceeding Nvidia’s data center business of $4.3 billion (which includes products for artificial intelligence). But it’s important to note that this was the worst quarter for Samsung’s memory division in seven years, and AI-related memory sales accounted for only a small portion of total revenue.

In the future, roughly a dozen DRAM chips will ship for every high-end artificial intelligence chip sold, which means rising revenue for companies such as Samsung and SK Hynix. Together, these companies control 95% of the DRAM market. As Nvidia grows, so will they.

There is no doubt that the artificial intelligence revolution has arrived, and cool chatbots, ubiquitous search engines and high-performance processor manufacturers are among the biggest winners. Companies that mass-produce memory chips will not be left out. (Chenchen)


Statement:
This article is reproduced from 51cto.com. If there is any infringement, please contact admin@php.cn for deletion.