How to understand the new Moore's Law proposed by OpenAI? China's hidden computing power giant has something to say

This article is reprinted with the authorization of AI New Media Qubit (public account ID: QbitAI). Please contact the source for reprinting.

ChatGPT is popular all over the world, and everything related to it is at the forefront.

OpenAI CEO Sam Altman’s seemingly casual words have become the focus of heated discussion:

A new version of Moore's Law could start soon: the amount of intelligence in the universe doubles every 18 months.


Some speculate that it refers to the number of neural network parameters, but that does not match the data of the past few years.

Others interpret it as the capability of all intelligent agents, human and AI alike, but how to measure such an indicator is itself a problem.

Many scholars and netizens also disagree with the judgment. IBM scientist Grady Booch called it "nonsense," and his reply became the top-voted comment.


To these discussions, Altman simply responded, “Not yet.”


But no matter what, behind the rapid development of AI, computing power is a clear, measurable indicator and an essential condition.

As early as 2018, OpenAI had published another observation similar to Moore's Law:

From AlexNet in 2012 to AlphaGo Zero at the end of 2017, the computing power required to train the largest AI models doubled every 3.4 months.
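As a rough sanity check, the implied growth can be computed directly. This is a sketch under stated assumptions: roughly 62 months elapsed between AlexNet (2012) and AlphaGo Zero (late 2017), a figure we supply for illustration rather than one from the article.

```python
# Rough check of the "doubling every 3.4 months" claim.
# Assumption (ours): ~62 months elapsed between AlexNet (2012)
# and AlphaGo Zero (late 2017).

months_elapsed = 62
doubling_period_months = 3.4

growth_factor = 2 ** (months_elapsed / doubling_period_months)
print(f"Implied increase in training compute: ~{growth_factor:,.0f}x")
```

Under these assumptions the implied increase lands near the roughly 300,000x figure that OpenAI's "AI and Compute" analysis reported for that period.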


Not long after, the release of their own GPT-3 shortened this figure to two months.

In the ChatGPT era, it is not just AI training compute that matters. With 100 million monthly active users worldwide and integration into Microsoft Bing search and other products, AI inference compute has become an important pillar as well.

The most direct provider of computing power for ChatGPT is Microsoft Azure cloud, which has been OpenAI’s exclusive cloud provider since its first investment of US$1 billion in OpenAI in 2019.

When the latest additional investment of US$10 billion was made, both parties emphasized this point again in their statements.


Exclusive supply means that OpenAI's development infrastructure is built entirely on Azure, using the tool chain Azure provides.

It also means that the coming wave of AI entrepreneurs who want to build on ChatGPT can only choose Azure.

Correspondingly, Google AI is backed by its own Google Cloud, and Amazon, the number one player in cloud computing, urgently approached Hugging Face to cooperate on deploying the open-source large model BLOOM.

……

Cloud computing is not the source of computing power. Just as ChatGPT became popular, a number of companies with outstanding performance appeared on the computing power supply side.

You may be able to name some players right away, but others may come as a surprise, because they were previously better known for other businesses.

Among them is a player from China.

The new Moore’s Law cannot be separated from computing power

When the last wave of AI applications, represented by image recognition and recommendation algorithms, was deployed to drive the digital economy, computing power turned from a technical term into an economic one, repeatedly emphasized by industry and academia.

When this wave of generative AI ignited global enthusiasm, computing power was again elevated into something relevant to the work and life of ordinary people, and perceived by the public.

How much does ChatGPT cost?

SemiAnalysis, a semiconductor industry research firm, has made an estimate: measured in GPUs, approximately 3,617 NVIDIA HGX A100 servers are needed, or 28,936 A100s; measured in money, each user question costs about 0.36 cents, and a single day of operation costs about $690,000.

Its conclusion: if ChatGPT were to handle all of Google's search traffic, a total of 4.1 million NVIDIA A100 GPUs would be needed.
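Taking these figures at face value, a back-of-the-envelope sketch ties them together. The implied daily query volume is our own derivation from the stated numbers, not a figure from the report:

```python
# Back-of-the-envelope reconstruction of the SemiAnalysis estimate.
servers = 3617                 # NVIDIA HGX A100 servers
gpus_per_server = 8            # A100s per HGX server
daily_cost_usd = 690_000       # estimated cost per day
cost_per_query_usd = 0.0036    # 0.36 cents per user question

total_gpus = servers * gpus_per_server
implied_queries_per_day = daily_cost_usd / cost_per_query_usd

print(f"Total A100 GPUs: {total_gpus:,}")          # 28,936
print(f"Implied queries per day: ~{implied_queries_per_day:,.0f}")
```

The GPU count (3,617 servers x 8 GPUs = 28,936) matches the article's figure exactly; the cost figures imply on the order of 190 million queries per day.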

So, naturally, NVIDIA is the most famous beneficiary of ChatGPT's popularity. NVIDIA currently holds about 75%-80% of the AI chip market; every ChatGPT trending topic is the sound of gold coins landing in Jensen Huang's account. Riding this wave, NVIDIA's stock price had risen 45% in 2023 as of its latest earnings release.

CEO Jensen Huang also made his latest statement on the earnings call:

"Generative AI has given global companies a sense of strategic urgency in developing and deploying artificial intelligence."

And this is just the beginning. According to Citibank, NVIDIA's GPU sales could reach US$3 billion to US$11 billion over the next 12 months.

Besides NVIDIA, another winner of the ChatGPT dividend upstream of the GPU is TSMC, at the chip manufacturing level.

Recent reports say that NVIDIA, AMD, and Apple have placed emergency orders with TSMC, hoping to lock in future production capacity.

For many years, TSMC's main revenue pillar was the smartphone business. In 2022, however, under the dual impact of bleak smartphone sales and the explosion of artificial intelligence, the high-performance computing business overtook it for three consecutive quarters, perhaps a harbinger of a turning point.

More importantly, TSMC has effectively acknowledged this change. In recent public statements it gave optimistic expectations for the coming year, and its wording is intriguing: although the global semiconductor industry faces a recession, TSMC's full-year performance is still expected to grow slightly.

Of course, put more crudely: when NVIDIA rises, TSMC's results will not be bad. The dynamics of the industry chain and of supply and demand see to that.

So, given this supply-and-demand logic, will the computing power dividend brought by ChatGPT also flow to cloud computing and cloud service vendors?

Yes, but not entirely.

Amid ChatGPT's popularity, many analyses pointed to cloud computing, but some analysts quickly focused on the intermediate link between the GPU and cloud services: the server.

If a chip wants to truly play its role, it cannot bypass the server: how the chips are interconnected at high speed, how the entire cabinet controls energy consumption and heat dissipation, and how the system design adapts to different workloads, all affect the final performance and efficiency.

Gartner predicts that by 2025, the global server market will reach US$135 billion, among which AI servers and AI cloud services will explode.

It is in this link, in the direction of AI servers, that the Chinese computing power player mentioned at the beginning of this article is hidden.

Lenovo Group.

You may have known Lenovo Group mainly for its PCs, but a PC is, in essence, personal computing power.

From PCs to high-performance computing to broader computing infrastructure: this is the road Lenovo has traveled.

The most intuitive evidence is in the financial reports. According to Lenovo Group's third-quarter report (October-December 2022), revenue of its ISG (Infrastructure Solutions Group) business rose 48%, and operating profit soared 156%.

This is in sharp contrast to DigiTimes statistics showing global server shipments falling 7.1% year-on-year and 4.3% quarter-on-quarter.

Lenovo Group's ISG provides infrastructure and solutions with computing power at the core; servers are among its most important businesses.

Broken down further, server business revenue grew 35% year-on-year this quarter, making Lenovo the world's third-largest server provider: a result in which the computing power dividend from ChatGPT played an indispensable part.

But ChatGPT's sudden popularity merely pushed Lenovo Group's hidden role as a computing power winner into the spotlight.

How did Lenovo become the "invisible champion" of computing power?

Yang Yuanqing, chairman and CEO of Lenovo Group, said that the universal computing power infrastructure needed behind ChatGPT happens to be a strength Lenovo has been building for many years.

Lenovo foresaw the demand for computing equipment early and proposed a new IT architecture: "end-edge-cloud-network-intelligence."

Last year, Lenovo went further and proposed "universal computing power." That is, with the explosive growth of data, we have entered a new era of digitalization and intelligence, and demand for computing is mushrooming. Devices working in isolation and localized data centers can no longer satisfy the need for computing anytime, anywhere; a new "end-edge-cloud-network-intelligence" IT architecture is required to provide users with ubiquitous, universal computing power.

Clearly, what Lenovo wants to be is not the "water source" that produces computing power, but the "water carrier": integrating chips into servers that release computing power efficiently, a role rarely noticed from outside.

In this role and positioning, Lenovo Group is very strong, and this strength lies in two aspects.

The first is server product performance; the second is scale.

In this field, the global TOP500 list of high-performance computing is the best reference for server computing power. Open the latest edition of the list (November 2022), and you can see the strength just described.

With 160 systems on the list, a 32% share, Lenovo ranks first in the world, leading second-place Hewlett Packard Enterprise by nearly 12 percentage points.


Moreover, as a computing power supplier, Lenovo Group serves a broad and diverse customer base.

In response to a question from Nihon Keizai Shimbun, Kirk Skaugen, executive vice president of Lenovo Group, revealed in humble-brag fashion:

Eight of the top 10 public cloud providers in the world are Lenovo Group's customers.

In other words, Lenovo servers have won recognition from customers around the world, which is no small feat for a Chinese server manufacturer.

Among these customers, outside observers speculate, is Microsoft Azure, the world's second-ranked cloud: the acceleration engine and financial patron behind ChatGPT.

When Microsoft invested heavily in OpenAI, some commented that shrewd Microsoft CEO Satya Nadella would not back it with cash alone: the core of the deal was equivalent cloud services. Microsoft obtained OpenAI equity, secured a strategic stake in a top AI research institution, and brought customers and growth to its own Azure cloud. It killed two birds with one stone; the deal could hardly lose.

What no one expected was that the audacious OpenAI really made a name for itself with large models, igniting the global tech industry with the accidental breakthrough of ChatGPT and plunging every player into a frenzy over large models and generative AI.

Not only did Microsoft benefit first, it also pushed its old rival Google into a corner, shaking Google's search cash cow. Nadella, one imagines, has been sleeping sweetly lately.

For Lenovo Group, which has a close partnership with Microsoft and won a Microsoft Device Partner Award in 2022, the benefits have shown up directly in its financial results.

This relationship also explains why Microsoft CEO Nadella appeared at Lenovo Group's 2021 Innovation and Technology Conference. The announcement that year was heavy on industry jargon and only becomes truly intelligible today, in light of the ChatGPT dividend. It read: in the future, the two parties will cooperate more deeply in three fields, namely PC, cloud and edge computing, and services.

But even if Lenovo Group is a computing power provider for the ChatGPT boom, is this prospect sustainable? After all, the Microsoft-ChatGPT case has its particularities; perhaps Lenovo, behind the scenes, simply got lucky.

Judging from the large-model direction that ChatGPT has established, however, the trend will continue, and the computing power supply side will change direction with it.

According to a recent analysis by "Financial Eleven" based on the views of many infrastructure professionals, AI computing power will significantly reshape the financial model of cloud computing, driven by a Moore's-Law-style exponential growth logic:

First, AI computing power consumption and its growth rate will far exceed those of general-purpose computing. Data released by the China Academy of Information and Communications Technology (CAICT) in 2022 show that China's general-purpose computing power reached 95 EFlops in 2021, growing 24% and accounting for 47% of the total; intelligent computing power reached 104 EFlops, growing 85% and accounting for over 50%. Intelligent computing power will thus become a new growth point for cloud computing.

Second, intelligent computing power is priced higher than general-purpose computing power. The price of general-purpose computing keeps falling, while the relative scarcity of AI computing power is currently pushing its price up. In theory, the gross margin of AI computing power operated at scale is more than 10 percentage points higher than that of general-purpose computing.

Third, applying AI large models in vertical industries enables new application models to emerge. In finance, autonomous driving, medical R&D, and smart manufacturing scenarios, companies typically purchase a vendor's large-model platform and fine-tune smaller models suited to their own business. This kind of PaaS-style AI service can deliver gross margins above 60%, and it is currently the best path to deepening the use of AI computing power and AI models.
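The CAICT figures in the first point diverge quickly if extrapolated. The sketch below is an illustration, not a forecast: it naively assumes the 2021 growth rates stay constant.

```python
# Naive projection of CAICT's 2021 figures at constant growth rates.
# Illustration only: real growth rates will not stay constant.
general, intelligent = 95.0, 104.0     # EFlops in 2021
general_rate, intelligent_rate = 0.24, 0.85

for year in range(2021, 2026):
    print(f"{year}: general {general:7.0f} EFlops | "
          f"intelligent {intelligent:7.0f} EFlops")
    general *= 1 + general_rate
    intelligent *= 1 + intelligent_rate
```

Under these assumptions, intelligent computing power would be several times the general-purpose figure by 2025, which is the "new growth point" logic in concrete terms.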

This directional change in AI computing power also has concrete implications for the Chinese market:

It is undeniable that Chinese cloud vendors' AI computing power and large models lag well behind Microsoft's, owing to gaps in computing scale, data scale, and model accuracy. Take computing scale as an example: the intelligent computing cluster supporting ChatGPT requires at least tens of thousands of NVIDIA A100 GPUs, and a complete training run costs more than $12 million.

The GPU cards alone for the intelligent computing cluster behind ChatGPT cost more than 1 billion yuan, and no more than three domestic companies can support similar infrastructure. Chinese cloud vendors' data centers are usually equipped with only thousands of GPUs, because a top-end NVIDIA GPU costs as much as 80,000 yuan. A GPU server usually needs 4-8 GPUs and typically costs upwards of 400,000 yuan, while the average domestic server costs 40,000-50,000 yuan; a GPU server is thus more than 10 times the cost of an ordinary server.
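The arithmetic behind the "more than 10 times" claim, using the rough figures above (all values in RMB, as stated; the 4-8 GPU range and prices are the article's numbers):

```python
# Cost comparison from the figures above (RMB).
gpu_price = 80_000                  # one top-end NVIDIA GPU
gpus_per_server = (4, 8)            # typical range per GPU server

gpu_cost_low = gpus_per_server[0] * gpu_price    # GPUs alone, low end
gpu_cost_high = gpus_per_server[1] * gpu_price   # GPUs alone, high end
gpu_server_cost = 400_000                        # stated lower bound, full server
ordinary_server_cost = 40_000                    # low end of 40,000-50,000

ratio = gpu_server_cost / ordinary_server_cost
print(f"GPU cost alone per server: {gpu_cost_low:,}-{gpu_cost_high:,} RMB")
print(f"GPU server vs ordinary server: at least ~{ratio:.0f}x")
```

Even at the low end, the GPUs alone cost several times a complete ordinary server, so the 10x ratio follows directly.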

From this wave of large-model-driven change, the challenges and opportunities for Chinese computing power suppliers can basically be deduced.

First, a supplier must have complete, mature, and stable intelligent computing cluster infrastructure.

Second, beyond overall performance, its products need an energy-efficiency advantage: greener, and cheaper to run.

With these two conclusions in place, we can understand why Lenovo Group, the hidden winner behind the ChatGPT dividend, has a sustainable computing power outlook.

If computing power is indeed entering a new Moore's Law, the issue of energy consumption cannot be ignored. An official of the High-Tech Department of China's National Development and Reform Commission said in 2021 that data centers already consume about 2% of China's total electricity.

Put more intuitively, "their annual electricity consumption exceeds the output of the Three Gorges Dam." Moore's Law grows exponentially; the Three Gorges Dam's output cannot.

Lenovo Group has developed its leading "Neptune" warm-water cooling technology, which circulates 50°C warm water to carry heat away, eliminating the need for chillers and heat exchangers; the waste heat can even be used to warm buildings. It cuts annual electricity costs and emissions by more than 42%.
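As a hedged illustration of what "more than 42%" can mean for a facility's power bill: the baseline consumption and tariff below are hypothetical example values, not Lenovo figures.

```python
# Hypothetical illustration of a 42% cut in a data center's electricity bill.
# Baseline consumption and tariff are assumed for the example only.
annual_kwh = 50_000_000      # assumed 50 GWh/year facility
price_per_kwh = 0.6          # assumed electricity price, RMB/kWh
reduction = 0.42             # "more than 42%" from the text

annual_bill = annual_kwh * price_per_kwh
savings = annual_bill * reduction
print(f"Baseline bill: {annual_bill:,.0f} RMB/year")
print(f"Savings at 42%: {savings:,.0f} RMB/year")
```

At this assumed scale, the saving is over 12 million RMB per year; for exponentially growing compute, that fraction compounds.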

On the latest global Green500 list, the Henri system delivered by Lenovo Group, with a certified efficiency of about 65.09 billion floating-point operations (65.09 gigaflops) per watt, ranked as the world's most energy-efficient high-performance computing system.

In fact, Lenovo is no longer the computers-only company of stereotype. While its computing infrastructure business booms, Lenovo Group as a whole has gradually completed a transformation: businesses other than personal computers now contribute more than 40% of total revenue.

ChatGPT, large models, AI at large: the computing power dividend that surges first in each cycle of technological innovation is refreshing Lenovo Group, or more accurately, reshaping the computing power supply side and bringing its hidden champion to the forefront.
