Will AI require new data center infrastructure?
The data center infrastructure built out in recent years to support the explosive growth of cloud computing, video streaming and 5G networks will not be sufficient to support the next level of digital transformation, which will begin in earnest with the widespread adoption of artificial intelligence.
In fact, digital infrastructure for artificial intelligence will require a different cloud computing framework, one that will redefine current data center networks, including where some data center clusters are located and the specific functions those facilities serve.
In November, Amazon Web Services, the global leader in cloud computing, formed a partnership with Stability AI, and Google reportedly has a ChatGPT-style system called LaMDA. The search engine giant has reportedly brought back founders Larry Page and Sergey Brin to guide its release.
Last month, Meta announced that it would suspend the expansion of data centers around the world and reconfigure these server farms to meet the data processing needs of artificial intelligence.
The demand for data processing on artificial intelligence platforms is huge. OpenAI, the creator of ChatGPT, launched the platform last November and would not be able to keep it running without hitching a ride on Microsoft's upcoming upgrade of its Azure cloud platform.
ChatGPT itself might explain this better, but it turns out that the microprocessing "brain" of an AI platform (in this case, the data center infrastructure that supports this digital transformation) will, like the human brain, be organized into two hemispheres, or lobes. And yes, one lobe will need to be much stronger than the other.
One hemisphere of AI digital infrastructure will serve so-called "training": the computing power required to process up to 300 billion data points to create the word salad that ChatGPT generates. In ChatGPT's case, that's roughly every pixel on the internet since Al Gore invented it.
The training lobe ingests data points and reorganizes them in a model, much like the brain's synapses. It is an iterative process in which the digital entity keeps refining its "understanding," essentially teaching itself to absorb a world of information and convey the essence of that knowledge in precise human grammar.
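That iterative refinement can be pictured at toy scale. The sketch below is purely illustrative (the data, model, and parameters are made up for this example and have nothing to do with the systems the article describes): a tiny model repeatedly compares its predictions with the data and nudges its parameters until the error shrinks, which is the same loop a training lobe runs across billions of data points.

```python
# Toy sketch (illustrative assumptions only): iterative refinement of a model's
# parameters, the kind of loop a "training lobe" runs at vastly larger scale.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))            # stand-in for a batch of data points
true_w = rng.normal(size=8)               # the pattern hidden in the data
y = X @ true_w + 0.1 * rng.normal(size=1000)

w = np.zeros(8)                           # the model starts knowing nothing
lr = 0.05
for step in range(200):                   # each pass nudges the parameters closer
    grad = X.T @ (X @ w - y) / len(y)     # how wrong is the current "understanding"?
    w -= lr * grad                        # refine it slightly, then repeat

print(np.round(np.abs(w - true_w).max(), 3))  # error shrinks toward the noise floor
```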
The training lobe requires enormous computing power and state-of-the-art GPU semiconductors, but little of the connectivity currently required in the data center clusters that support cloud computing services and 5G networks.
Infrastructure focused on "training" each AI platform will create huge demands for power, requiring data centers to be sited near gigawatts of renewable energy, fitted with new liquid cooling systems, and equipped with newly designed backup power and generator systems, among other new design features.
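To give a sense of why power becomes the binding constraint, here is a rough back-of-envelope sketch. Every number in it (GPU count, per-GPU wattage, overhead multiplier) is an illustrative assumption, not a figure from the article.

```python
# Back-of-envelope sketch with assumed numbers, showing why a training campus is
# sized against very large blocks of power rather than typical enterprise loads.
gpus          = 25_000    # assumed GPU count for a large training cluster
watts_per_gpu = 700       # assumed draw per accelerator at full load, in watts
overhead      = 1.3       # assumed PUE-style multiplier for cooling and losses

it_load_mw    = gpus * watts_per_gpu / 1e6
total_load_mw = it_load_mw * overhead
print(f"IT load: {it_load_mw:.1f} MW, total facility load: {total_load_mw:.1f} MW")
# With these assumptions the facility draws on the order of ~23 MW continuously,
# which is why siting near gigawatts of (ideally renewable) generation matters.
```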
The other hemisphere of the AI platform's brain, a higher-functioning digital infrastructure known as the "inference" mode, supports the interactive "generative" platforms: within seconds of your entering a question or instruction, the query is processed against the modeled database and a response comes back in convincing human syntax.
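As a contrast with training, the sketch below (again purely illustrative, with a made-up miniature "model") shows why inference is a different workload: the heavy lifting is already baked into the trained model, and serving a query is mostly fast lookups, which is why connectivity and latency, rather than raw training compute, dominate this lobe.

```python
# Toy sketch (assumed, minimal): the "inference" path reuses an already-trained
# model to answer a query, trading heavy computation for low-latency lookups.
import random

# A tiny pretend "trained model": next-word options baked in ahead of time.
model = {
    "data":    ["centers", "processing"],
    "centers": ["need", "will"],
    "need":    ["power", "fiber"],
}

def generate(prompt_word, length=4):
    """Random-walk generation: no learning happens here, only fast lookups."""
    out = [prompt_word]
    for _ in range(length):
        nxt = random.choice(model.get(out[-1], ["."]))
        out.append(nxt)
    return " ".join(out)

print(generate("data"))   # e.g. "data centers need power ."
```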
Today's hyperconnected data center networks, such as Northern Virginia's "Data Center Alley" (North America's largest data center cluster, with the nation's most extensive fiber optic network), can accommodate the next-level connectivity needs of the AI brain's "inference" lobe, but these facilities will also need to be upgraded to provide the massive processing capacity required, and they will need to be located closer to substations.
The largest cloud computing providers are offering data processing capacity to AI startups hungry for it, because these startups have the potential to become long-term customers. One venture capitalist investing in AI likened the situation to a "proxy war" among superpowers competing for AI supremacy.
There is a proxy war going on among the big cloud computing companies; they are really the only ones capable of building the very large AI platforms with huge numbers of parameters.
Emerging artificial intelligence chatbots are "terribly good," but they are not sentient beings and cannot match the millions of years of evolution that produced the billions of precisely sequenced synapses firing within the same millisecond in the frontal lobe.