Humans overtake AI: One week after DeepMind used AI to break the 50-year record for matrix multiplication calculation speed, mathematicians broke it again
On October 5, DeepMind unveiled AlphaTensor and announced that it had improved on a record in matrix multiplication that had stood for roughly 50 years. AlphaTensor is the first AI system to discover novel, efficient, and provably correct algorithms for fundamental tasks such as matrix multiplication. The paper, "Discovering faster matrix multiplication algorithms with reinforcement learning," also appeared on the cover of Nature.
However, AlphaTensor’s record only stood for a week before it was broken by human mathematicians.
In their latest work, researchers Manuel Kauers and Jakob Moosbauer from Johannes Kepler University Linz, Austria, report that they have broken AlphaTensor's matrix multiplication record. They found a scheme that performs a 5×5 matrix multiplication in 95 multiplications, one fewer than AlphaTensor's record of 96 and three fewer than the previous record of 98. A preprint of the paper was posted on arXiv on October 13.
Paper address: https://arxiv.org/abs/2210.04045
The "FBHHRBNRSSSHK" in the title of the paper is actually the first letter of the last names of all the authors of the DeepMind paper. This naming method is also very interesting:
The exploration of mathematical problems never ends. As the authors put it, the DeepMind result is "not the end of the story." Their own breakthrough, however, stands on the shoulders of that giant, namely the AI: they obtained their scheme by applying a sequence of transformations to DeepMind's solution, thereby eliminating one multiplication.
Many tasks in computer science rely on matrix multiplication, including machine learning, computer graphics, simulation, and data compression. Computers multiply much more slowly than they add, so even a small improvement in the efficiency of matrix multiplication can have a huge impact. For decades, mathematicians have been looking for more efficient matrix multiplication algorithms.
In 1969, German mathematician Volker Strassen developed an algorithm that, for the first time, reduced 4×4 matrix multiplication from 64 multiplications to 49, shocking the mathematics world.
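To make the counting concrete, here is a minimal, illustrative Python sketch (not code from either paper) of Strassen's recursive scheme applied to 4×4 matrices. A simple counter shows that it uses 7² = 49 scalar multiplications instead of the naive 4³ = 64:

```python
# Illustrative sketch only: Strassen's algorithm on 2^k x 2^k matrices (here 4x4),
# counting scalar multiplications. Naive needs 4^3 = 64; Strassen needs 7^2 = 49.
import numpy as np

mul_count = 0  # global counter of scalar multiplications

def strassen(A, B):
    """Multiply two square matrices whose size is a power of two."""
    global mul_count
    n = A.shape[0]
    if n == 1:
        mul_count += 1
        return A * B
    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
    # Strassen's 7 block products (instead of the naive 8)
    M1 = strassen(A11 + A22, B11 + B22)
    M2 = strassen(A21 + A22, B11)
    M3 = strassen(A11, B12 - B22)
    M4 = strassen(A22, B21 - B11)
    M5 = strassen(A11 + A12, B22)
    M6 = strassen(A21 - A11, B11 + B12)
    M7 = strassen(A12 - A22, B21 + B22)
    C = np.empty_like(A)
    C[:h, :h] = M1 + M4 - M5 + M7
    C[:h, h:] = M3 + M5
    C[h:, :h] = M2 + M4
    C[h:, h:] = M1 - M2 + M3 + M6
    return C

A = np.random.randint(0, 10, (4, 4))
B = np.random.randint(0, 10, (4, 4))
C = strassen(A, B)
assert np.array_equal(C, A @ B)   # result matches the standard product
print("scalar multiplications used:", mul_count)  # 49 for 4x4
```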
The AI system AlphaTensor released by DeepMind discovered new algorithms that are faster than Strassen's. Demis Hassabis said the new algorithms have the potential to increase efficiency by 10% to 20% across the trillions of matrix calculations performed every day.
AlphaTensor is a leap from games to mathematics. It is based on AlphaZero, the general-purpose board game AI system that DeepMind released in 2018. To train AlphaTensor, the DeepMind research team turned matrix multiplication into a 3D board game in which each move yields a building block of a new algorithm. Choosing from tens of thousands of possible moves at each step, AlphaTensor is rewarded for completing a correct algorithm in as few moves as possible. DeepMind calls this the "tensor game."
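The "board" in this game is the matrix multiplication tensor, and an algorithm corresponds to writing that tensor as a sum of rank-one terms, one per multiplication. As a hedged illustration (not DeepMind's code), the Python snippet below builds the 2×2 multiplication tensor and verifies that Strassen's seven (u, v, w) triples reconstruct it exactly, which is why seven multiplications suffice:

```python
# Illustrative sketch: the 2x2 matrix multiplication tensor and Strassen's
# rank-7 decomposition of it. Each rank-one term corresponds to one multiplication,
# and the number of terms is the quantity AlphaTensor's "tensor game" minimizes.
import numpy as np

# T[p, q, s] = 1 iff a_p * b_q contributes to c_s, with entries flattened
# row-major: a_p = A[i,k], b_q = B[k,j], c_s = C[i,j].
T = np.zeros((4, 4, 4), dtype=int)
for i in range(2):
    for j in range(2):
        for k in range(2):
            T[2 * i + k, 2 * k + j, 2 * i + j] = 1

# Strassen's seven multiplications as (u, v, w) triples:
# M_r = (u_r . a) * (v_r . b), and c = sum_r w_r * M_r.
U = np.array([[1,0,0,1],[0,0,1,1],[1,0,0,0],[0,0,0,1],[1,1,0,0],[-1,0,1,0],[0,1,0,-1]])
V = np.array([[1,0,0,1],[1,0,0,0],[0,1,0,-1],[-1,0,1,0],[0,0,0,1],[1,1,0,0],[0,0,1,1]])
W = np.array([[1,0,0,1],[0,0,1,-1],[0,1,0,1],[1,0,1,0],[-1,1,0,0],[0,0,0,1],[1,0,0,0]])

# The sum of the 7 rank-one tensors u_r (x) v_r (x) w_r equals T,
# i.e. 2x2 matrices can be multiplied with only 7 multiplications.
reconstruction = np.einsum('rp,rq,rs->pqs', U, V, W)
print(np.array_equal(reconstruction, T))  # True
```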
For input matrices up to size 5×5, AlphaTensor independently rediscovered Strassen's algorithm and other known algorithms, and it also developed new algorithms that are more efficient than the known ones.
For example, multiplying a 4×5 matrix by a 5×5 matrix previously required 80 multiplications, while AlphaTensor's new algorithm needs only 76. For 5×5 matrices (n=5), AlphaTensor reduced the count from 98 multiplications to 96. For 4×4 matrices, Strassen's method needs 49 multiplications, which AlphaTensor improved to 47 (in modular arithmetic). In total, this efficiency comes from more than 70 matrix multiplication algorithms generated by AlphaTensor for different matrix sizes.
Note: Complexity of the algorithms discovered by AlphaTensor compared with known matrix multiplication algorithms
In addition, AlphaTensor can develop algorithms tailored to specific hardware for machine learning workloads: the algorithms it found are reported to run 10–20% faster than commonly used algorithms on Google TPUs and NVIDIA V100 GPUs.
It is difficult for humans to hand-tune multiplication algorithms for particular hardware, so AlphaTensor's improvement on Strassen's algorithm, which sets a new upper bound for 4×4 matrix multiplication, is a strong testament to how AI can help advance other disciplines. It also shows that AlphaZero, a system originally developed for traditional board games, can tackle mathematical problems outside that domain.
In their latest research, Manuel Kauers and Jakob Moosbauer report two new findings. First, for 4×4 matrices, they give another 47-multiplication scheme that differs from the previously known ones; second, for 5×5 matrices, they give, for the first time, a scheme that needs only 95 multiplications.
In the preprint, the authors only briefly present the two multiplication schemes; a formal paper introducing the search technique behind them in more detail will be published soon.
The new scheme for 4×4 matrices contains a total of 47 multiplications, as follows:
The 95-multiplication scheme for 5×5 matrices (n=5) is as follows:
Considering that GPUs perform trillions of matrix calculations every day, seemingly small incremental improvements such as going from 98 to 96 multiplications, and from 96 to 95, can meaningfully improve computational efficiency and allow AI applications to run faster on existing hardware.
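As a rough, hedged illustration of why a single saved multiplication matters: if a 5×5 base scheme with m multiplications is applied recursively to large matrices, the asymptotic cost scales as O(n^(log_5 m)), ignoring additions and the fact that these particular schemes work in arithmetic modulo 2. The short Python snippet below compares the exponents for 98, 96, and 95 multiplications:

```python
# Rough illustration only: asymptotic exponent log_5(m) for recursively applying
# a 5x5 base scheme with m multiplications (additions and the underlying ring,
# e.g. arithmetic modulo 2, are ignored here).
import math

for m in (125, 98, 96, 95):  # 125 = naive 5^3 multiplications
    print(f"m = {m:3d}  ->  O(n^{math.log(m, 5):.4f})")
```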
About the authors:
Manuel Kauers is a professor of algebra at Johannes Kepler University Linz and head of the university's Institute for Algebra. His research interests include computer algebra, symbolic summation and integration, and special function identities.
Jakob Moosbauer is a PhD student at the Institute of Algebra, Johannes Kepler University Linz.