5 Times Larger Than ChatGPT: Intel Launches Aurora genAI Large Model with 1 Trillion Parameters
ChatGPT has been hugely popular over the past few months, prompting all of the major technology giants to develop their own large AI models. Intel has now joined the fray with the Aurora genAI model, which has up to 1 trillion parameters.
The GPT-3.5 model behind ChatGPT is said to have 175 billion parameters, which would make the Aurora genAI model at least 5 times larger. Many companies promote their models as "trillion-parameter class", but those claims are often heavily inflated; Intel's trillion-parameter figure is considerably more credible.
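As a quick back-of-the-envelope check, the ratio works out to roughly 5.7x. A minimal sketch (the 175-billion figure is the widely cited estimate for GPT-3.5, not an officially confirmed number):

```python
# Rough comparison of model sizes.
aurora_genai_params = 1_000_000_000_000   # 1 trillion (Intel's stated figure)
gpt35_params = 175_000_000_000            # 175 billion (widely cited estimate)

ratio = aurora_genai_params / gpt35_params
print(f"Aurora genAI is roughly {ratio:.1f}x the size of GPT-3.5")  # ~5.7x
```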
That is because the Aurora genAI model is backed by the Aurora supercomputer, which Intel built for Argonne National Laboratory in the United States. Its peak performance reaches around 2 exaFLOPS, roughly twice that of Frontier, the current TOP500 champion and the world's first exascale supercomputer.
The Aurora supercomputer has a total of 10,624 nodes, each with 2 CPUs and 6 GPUs. The CPUs are based on the Sapphire Rapids-SP architecture, 21,248 in total, and the GPUs are Intel's most powerful Ponte Vecchio parts, 63,744 in total.
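The CPU and GPU totals follow directly from the node count, as this small sketch shows:

```python
# Aurora node configuration as described above:
# 10,624 nodes, each with 2 CPUs and 6 GPUs.
nodes = 10_624
cpus_per_node = 2
gpus_per_node = 6

total_cpus = nodes * cpus_per_node   # 21,248 Sapphire Rapids-SP CPUs
total_gpus = nodes * gpus_per_node   # 63,744 Ponte Vecchio GPUs
print(total_cpus, total_gpus)        # 21248 63744
```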
There is also 10.9 PB of DDR5 memory, 1.36 PB of CPU HBM, and 8.16 PB of GPU HBM, with peak bandwidths of 5.95 PB/s, 30.5 PB/s, and 208.9 PB/s respectively.
The storage system has a capacity of 230 PB and a bandwidth of 30 TB/s. On paper, the Aurora supercomputer is a dream machine.
The Aurora genAI model runs on the Aurora supercomputer, and its 1 trillion parameters make it very powerful. Unlike ChatGPT, however, it is aimed primarily at scientific computing, covering fields such as biology, cancer research, atmospheric science, astronomy, and polymer chemistry.