When ChatGPT launches a paid version, someone will have to pay for the computing power
On Wednesday local time, OpenAI launched ChatGPT Plus, a paid subscription version of ChatGPT, priced at US$20 per month (roughly 135 yuan at the February 2 exchange rate).
OpenAI stated that it will continue to offer a free version, and that the paid tier will in fact "help as many people as possible use free services." However, the New York Times noted that "during peak hours, the number of visitors to the free version will be limited." In other words, the traffic and capacity restrictions that frequently appear in the free version of ChatGPT are likely to remain in place.
Over the Chinese New Year holiday, the generative AI chatbot ChatGPT surged in popularity. Just two months after opening to the public, ChatGPT reached a milestone few AI products achieve: more than 100 million monthly active users. For the past two months, netizens around the world have enthusiastically "trained" this intelligent chatbot, but the first party that couldn't keep up was ChatGPT's owner. To sustain longer-term development, OpenAI announced the paid subscription ChatGPT Plus.
Charging is an inevitable choice for the long-term development of AI services. Earlier, Greg Brockman, OpenAI's president and chairman, had publicly asked users for suggestions on "how to make money through ChatGPT."
The root cause is that the "growing intelligence" of AI systems like ChatGPT comes at enormous cost. Among those costs, computing power is the largest, and it is also the one where no corners can be cut.
Data shows that ChatGPT's total computing power consumption is approximately 3640 PF-days. For a more intuitive comparison, consider some recent data center construction news, such as a data center project with a total investment of 3.02 billion yuan and a computing capacity of 500P. Supporting ChatGPT's operation would require at least 7-8 such data centers, putting the infrastructure investment in the tens of billions of yuan. Of course, the infrastructure can be rented rather than built, but the cost pressure from the demand for computing power remains huge.

According to a Soochow Securities research report, ChatGPT's improvements come mainly from growth in model size and the resulting growth in computing power. The parameter count from GPT through GPT-2 to GPT-3 (the currently released version is GPT-3.5) grew from 117 million to 175 billion, and the pre-training data volume grew from 5GB to 45TB. The cost of a single GPT-3 training run is as high as $4.6 million.
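As a rough sanity check on the figures above, here is a minimal back-of-envelope sketch. It assumes the article's 7-8 data center comparison is derived by spreading the 3640 PF-days of total compute across centers each delivering 500P for one day; that assumption is ours, not stated in the article.

```python
# Back-of-envelope check of the compute figures cited in the article.
# Inputs come from the article; the one-day-per-center assumption is ours.

TOTAL_COMPUTE_PF_DAYS = 3640      # ChatGPT's cited total compute consumption
CENTER_CAPACITY_PF = 500          # capacity of the cited data center project
CENTER_COST_BILLION_YUAN = 3.02   # total investment for that project

# How many 500P centers would deliver 3640 PF-days in a single day
centers_needed = TOTAL_COMPUTE_PF_DAYS / CENTER_CAPACITY_PF
print(f"Equivalent 500P data centers: {centers_needed:.1f}")

# Implied infrastructure bill at that scale
infra_cost = centers_needed * CENTER_COST_BILLION_YUAN
print(f"Implied infrastructure cost: ~{infra_cost:.0f} billion yuan")
```

The division gives about 7.3 centers, matching the article's "7-8" figure, and multiplying by the per-center investment lands at roughly 22 billion yuan, consistent with "tens of billions."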
Computing power! Computing power! Computing power!
But no matter who participates, they must answer one question: how to cover the cost of computing power?
Perhaps the answer lies in China's "Eastern Data, Western Computing" project, now being rolled out in full swing.
The "Eastern Data, Western Computing" project builds a nationwide integrated computing power network: it optimizes the layout of data center construction and channels computing demand from the east to the west in an orderly way, leveraging the west's resource advantages to provide low-carbon, low-cost, high-quality computing power for the development of Digital China.
For the AI industry, "Eastern Data, Western Computing" can also become "Eastern Data, Western Training": the huge demand for training compute can be shifted wholesale to western data centers, where computing costs are lower and economies of scale are greater.
Correspondingly, the data centers that host AI training will undergo targeted upgrades to better fit training workloads, for example in power supply, cooling design, and cabinet form factors, making them better suited to dense deployments of AI training chip servers.
From this, we can also see a new path for the future development of data centers. Applications such as "Eastern Data, Western Training", "Eastern Data, Western Rendering" and "Eastern Data, Western Storage" will become mainstream directions. Data center construction will bid farewell to the cookie-cutter, general-purpose era and enter a scenario-guided, application-oriented "specialized" era.