Sony AI defeats top human racers, beating humans by 1.5 seconds
"What's going on?" Emily Jones couldn't believe she was falling behind.
Emily Jones is a top Gran Turismo esports racer who has won multiple championships. She slapped her racing wheel and stared at the screen in front of her: "I tried my best, but I still couldn't catch up with it. How does it do that?"
In Gran Turismo, Jones was driving at 120 miles per hour, then 140, then 150, trying to keep up with the fastest "player" in the world.
This "player" is actually an artificial intelligence named GT Sophy. Released by Sony's Artificial Intelligence Research Laboratory in 2020, it uses artificial intelligence technology to learn how to control cars in the GT game. Sony pitted the AI against top GT racers at a series of closed-door events in 2021.
In July 2021, Jones participated in an event organized by Sony as a member of the e-sports team Trans Tasman Racing, but at the time she did not know what to expect.
"No one gave me any information. They just told me that I didn't need to do any practice and don't worry about lap times," she recalled. "My attitude is also very simple. Just keep it a secret. This is definitely not a bad thing."
In the end, GT Sophy beat Jones's best time by 1.5 seconds. Human racers typically break GT records by milliseconds; 1.5 seconds is an enormous margin.
But Sony quickly learned that speed alone wasn't enough to make GT Sophy a winner. In solo time trials it outpaced human drivers, setting astonishing lap records on three different tracks.
But when Sony pitted it against multiple human drivers at once, it lost: multiplayer racing demands tactical judgment as well as raw speed. GT Sophy sometimes picked up penalties for driving too aggressively and recklessly; at other times it was too timid, yielding when it didn't need to.
Sony retrained the AI and held a second round in October 2021. This time, GT Sophy easily beat the human players. What had changed?
For one thing, Sony built a larger neural network, giving the program more raw capability. But the essential difference was that GT Sophy had learned "track etiquette."
Peter Wurman, head of Sony AI America, said this etiquette is tacitly observed by human drivers. In essence, it is the ability to balance aggression and concession, dynamically choosing the most appropriate behavior in an ever-changing situation.
This is also what sets GT Sophy apart from earlier racing-game AI. On-track interaction and etiquette between drivers, he said, are a special case of the dynamic, context-aware behavior robots will need when interacting with people.
Knowing when to take risks and when to play it safe will be useful for artificial intelligence, whether on the manufacturing floor, in home robots, or in self-driving cars.
"I don't think we've learned the general principles of how to deal with the human norms that have to be respected," he said. "But this is a good start, and hopefully it gives us a deeper understanding of the problem."
GT Sophy is just the latest in a line of AI systems that have beaten the world's best human players, from chess to StarCraft and Dota 2. But Gran Turismo posed a new kind of challenge for Sony.
Unlike turn-based games, Gran Turismo requires top players to control a vehicle in real time at ultra-high speed, right at the limits of the game's physics, while every other driver on the track is doing the same.
The virtual race cars zoom along at 100 miles per hour or more, just inches from the edge of the track. At those speeds, small errors lead to collisions.
Gran Turismo is famous for capturing and replicating real-world physics in detail: it simulates the cars' aerodynamics and the friction of the tires on the track. The game is sometimes even used to train and recruit real-world racers.
"It does a good job in terms of realism," said Davide Scaramuzza, head of the Robotics and Perception Group at the University of Zurich in Switzerland. He is not involved in the GT Sophy project, but his team has used Gran Turismo to train AI drivers of its own, though they have not yet been tested against humans.
GT Sophy interfaces with the game differently from human players. Instead of reading pixels on a screen, it receives data about its own position on the track and the positions of the surrounding cars, along with information about the virtual physical forces acting on its vehicle.
In response, GT Sophy steers, accelerates, or brakes. This exchange between GT Sophy and the game happens 10 times per second, which Wurman and his colleagues say is comparable to a human player's reaction time.
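To make that description concrete, here is a minimal sketch, in Python, of what a 10 Hz observe-and-act loop of this kind could look like. The field names, the `simple_policy` placeholder, and the control interface are illustrative assumptions; the article does not describe Sony's actual API.

```python
# A minimal sketch of the 10 Hz observe-and-act loop described above.
# All names here (Observation, Action, simple_policy) are illustrative
# stand-ins, not Sony's actual interface.
import time
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Observation:
    track_position: Tuple[float, float, float]     # car's position on the track
    velocity: Tuple[float, float, float]            # current velocity vector
    nearby_cars: List[Tuple[float, float, float]]   # positions of surrounding cars
    tire_forces: Tuple[float, float, float, float]  # simulated forces on each tire

@dataclass
class Action:
    steering: float   # -1.0 (full left) to 1.0 (full right)
    throttle: float   # 0.0 to 1.0
    brake: float      # 0.0 to 1.0

def simple_policy(obs: Observation) -> Action:
    """Placeholder for the trained neural-network policy."""
    # A real agent would feed obs through a network; here we just coast.
    return Action(steering=0.0, throttle=0.5, brake=0.0)

def control_loop(get_observation, send_action, hz: float = 10.0) -> None:
    """Query the game state and send controls roughly `hz` times per second."""
    period = 1.0 / hz
    while True:
        start = time.monotonic()
        obs = get_observation()      # state supplied by the game
        act = simple_policy(obs)     # decide how to steer, throttle, or brake
        send_action(act)             # apply the controls in the game
        # Sleep out the remainder of the 100 ms tick.
        time.sleep(max(0.0, period - (time.monotonic() - start)))
```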
Sony trained GT Sophy from scratch using reinforcement learning, a trial-and-error method. At first, the AI struggled just to keep the car on the road.
But after training on 10 PS4s, each running 20 instances of the program, GT Sophy reached the level of GT's built-in artificial intelligence, roughly that of an amateur player, in about eight hours. Within 24 hours it was near the top of a leaderboard of best times set by 17,700 human players.
GT Sophy then spent nine days whittling down its lap times. In the end, it was faster than any human player.
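As a rough illustration of the trial-and-error idea described above, the following toy sketch shows the shape of an episodic reinforcement-learning loop: an agent acts, collects reward, and nudges its behavior toward whatever scored better. Everything here (the toy environment, the single "aggression" parameter, the hill-climbing update) is a simplified stand-in, not Sony's training setup, which used large neural networks and many game instances in parallel.

```python
# A bare-bones sketch of a trial-and-error (reinforcement learning) loop.
# The environment, agent, and reward are illustrative placeholders.
import random

class ToyTrackEnv:
    """Stand-in for a game instance: reward forward progress, end on running wide."""
    def reset(self) -> float:
        self.progress = 0.0
        return self.progress

    def step(self, throttle: float):
        self.progress += throttle                       # crude "distance gained"
        off_track = random.random() < 0.05 * throttle   # pushing harder risks running wide
        reward = throttle - (10.0 if off_track else 0.0)
        return self.progress, reward, off_track

class ToyAgent:
    """Stand-in policy with a single tunable 'aggression' parameter."""
    def __init__(self) -> None:
        self.aggression = 0.1

    def act(self, state: float) -> float:
        return self.aggression

    def update(self, episode_return: float) -> None:
        # Crude hill climbing in place of the gradient updates a real RL
        # algorithm would perform on a neural-network policy.
        step = 0.01 if episode_return > 0 else -0.01
        self.aggression = min(1.0, max(0.01, self.aggression + step))

env, agent = ToyTrackEnv(), ToyAgent()
for episode in range(1000):
    state, done, total = env.reset(), False, 0.0
    while not done:
        state, reward, done = env.step(agent.act(state))
        total += reward
    agent.update(total)  # nudge behavior toward higher-scoring episodes
```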
Sony's AI had learned to drive at the very limit of what the game allows, pulling off maneuvers beyond the reach of human players. What impressed Jones most was how GT Sophy took corners, braking earlier so it could accelerate out of them on a tighter line.
"GT Sophy treats the racing line in weird ways and does things I'd never even thought about," she said. For example, GT Sophy would often put a tire onto the grass at the edge of the track, then slide into the corner. "Most people don't do that because it's too easy to make a mistake. It's like a controlled crash. Give me a hundred tries and I might pull it off once."
GT Sophy quickly mastered the physics of the game; the harder problem was the refereeing. Professional GT races are overseen by human referees, who can hand out penalties for dangerous driving.
Accumulated penalties were a key reason GT Sophy lost the first round in July 2021 despite being faster than any human driver. By the second round a few months later, it had learned how to avoid penalties, and the results were a world apart.
Wurman has spent several years on GT Sophy. On the wall behind his desk hangs a picture of two cars jostling for position. "This is GT Sophy overtaking Yamanaka," he said.
He was referring to Tomoaki Yamanaka, a top Japanese GT driver and one of four Japanese pro sim racers who raced against GT Sophy in 2021.
He can't remember which race the picture is from. If it was from the October 2021 event, Yamanaka probably enjoyed the moment, facing a strong but fair opponent. If it was from the July 2021 event, he was probably cursing the computer.
Yamanaka's teammate Takuma Miyazono described the July 2021 event to us through translation software. "There were a few times we got knocked off the track because GT Sophy was cornering too aggressively," he said. "That frustrated us, because human players would slow down through the turn to avoid forcing anyone off the track."
Wurman said it is very difficult to train an AI to race fair without blunting its competitive edge. Human referees make subjective, context-dependent calls, which are hard to translate into something an AI can learn, such as hard rules about which behaviors are and are not allowed.
Instead, Sony's researchers gave the AI a set of different penalty signals they could dial up and down, hoping to find an effective combination: it was punished if it veered off the track, hit a wall, caused a collision with another car, or was penalized by the referee.
They experimented with the strength of each penalty, tweaking it and watching how GT Sophy's driving changed.
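The penalty signals described above amount to a shaped reward function with tunable weights. The sketch below shows one plausible form of such a function; the field names and weight values are invented for illustration and are not taken from Sony's system.

```python
# A sketch of a shaped reward of the kind the article describes: progress is
# rewarded, while leaving the track, hitting walls, causing contact, or being
# flagged by the referee is penalized. The weights are the knobs researchers
# could tune while watching how driving behavior changes; the specific values
# and field names are illustrative.
from dataclasses import dataclass

@dataclass
class StepInfo:
    progress_m: float          # meters gained along the track this step
    off_track: bool            # any wheel beyond the track limits
    hit_wall: bool             # contact with a barrier
    caused_contact: bool       # collision judged to be this car's fault
    referee_penalty_s: float   # seconds of penalty handed out this step

# Tunable penalty weights (illustrative values only).
W_OFF_TRACK = 1.0
W_WALL = 5.0
W_CONTACT = 10.0
W_REFEREE = 2.0

def shaped_reward(info: StepInfo) -> float:
    """Combine progress with weighted penalties into a single scalar reward."""
    reward = info.progress_m
    if info.off_track:
        reward -= W_OFF_TRACK
    if info.hit_wall:
        reward -= W_WALL
    if info.caused_contact:
        reward -= W_CONTACT
    reward -= W_REFEREE * info.referee_penalty_s
    return reward
```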
Sony also toughened the competition GT Sophy faced in training. Until then, it had mostly trained against older versions of itself.
In the run-up to the October 2021 rematch, Sony brought in top GT drivers every week or two to help test the AI, then made adjustments based on the results.
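Training against older versions of oneself is a common self-play pattern: snapshots of past policies are kept in a pool and sampled as opponents. The sketch below illustrates that general pattern; it is not Sony's implementation, and the human test sessions described above sat on top of whatever automated setup the team actually used.

```python
# A generic sketch of self-play against a pool of past policy snapshots.
import copy
import random

class OpponentPool:
    def __init__(self, max_size: int = 10):
        self.snapshots = []
        self.max_size = max_size

    def add_snapshot(self, policy) -> None:
        """Freeze a copy of the current policy as a future opponent."""
        self.snapshots.append(copy.deepcopy(policy))
        if len(self.snapshots) > self.max_size:
            self.snapshots.pop(0)   # drop the oldest snapshot

    def sample_opponents(self, n: int):
        """Pick opponents for the next training race."""
        return random.choices(self.snapshots, k=n) if self.snapshots else []

# Usage sketch: periodically snapshot the learner and race it against
# a mixture of its past selves.
pool = OpponentPool()
learner = object()   # stand-in for the policy being trained
for iteration in range(5):
    pool.add_snapshot(learner)
    rivals = pool.sample_opponents(n=3)
    # ... run training races of `learner` against `rivals` here ...
```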
"This gives us the feedback we need to Finding the right balance between aggression and concession," said.
It worked. Three months later, when Miyazono raced GT Sophy again, its recklessness was gone, but it wasn't simply holding back either. "When two cars enter a corner side by side, GT Sophy leaves just enough room for the human driver to get through," he said. "It makes you feel like you're racing against another real person."
"Racing against an opponent that responds like that brings a different kind of excitement and fun," he added. "It really impressed me."
Scaramuzza was also deeply impressed by Sony's work. "We measure the progress of robotics against human capabilities," he said. But, his colleague Elia Kaufmann points out, it was still human researchers who shaped the behavior GT Sophy learned.
"Good track etiquette is taught by humans." Artificial intelligence," he said. "It would be really interesting if this could be done in an automated way. "Such a machine would not only have good track manners, but more importantly it would be able to understand what track manners are and be able to change its behavior to adapt to new settings.
Scaramuzza's team is now taking its GT racing work into real-world drone competitions, training an AI to fly using raw video input rather than simulated data. In June 2022, they invited two world-championship-level drone pilots to compete against the computer.
"Their faces after seeing our AI in action said it all," he said. "They were shocked."
He believes that real progress in robotics has to extend into the real world. "There's always going to be a mismatch between simulation and the real world," he said. "That's something that gets forgotten when people talk about the incredible advances in artificial intelligence. On the strategy side, yes. But in terms of deployment in the real world, we're still very far away."
For now, Sony is keeping the technology in games; it plans to include GT Sophy in future versions of Gran Turismo. "We want this to be part of the product," said Peter Stone, Sony AI's executive director. "Sony is an entertainment company, and we hope this makes the game more fun."

Jones believes the whole sim-racing community can learn a lot once people get the chance to watch GT Sophy drive. "On many tracks we'll find that driving techniques we've used for years are flawed, and there are actually faster ways."
Miyazono is already trying to copy the AI's cornering lines, now that it has been shown they are possible. "If the baseline changes, then everyone's skills improve," Jones said.