Gary Marcus: Self-driving cars face a common problem that still hasn't been solved
In 2016, a New York Times article about self-driving cars began: "The era of self-driving cars has arrived. Some automakers have invested billions of dollars in research and development... and some cities in the United States have started testing." Seven years later, what progress has autonomous driving technology actually made?
Gary Marcus, professor emeritus of psychology and neuroscience at New York University, offered some insights into this question. He believes the field still suffers from a problem he has emphasized many times over the past few years: edge cases, the non-routine situations that often confuse machine learning algorithms.
The more complex the situations a self-driving car faces, the more unexpected anomalies it will encounter. The real world is complex and chaotic, and we cannot enumerate every non-routine event that might occur. No one has yet figured out how to build a self-driving car that can cope with this fact.
Marcus said he first emphasized the major challenge that edge cases pose for autonomous driving in a 2016 interview: "At the time I was tired of the hype and eventually gave up on the idea. Re-reading the transcript now, I think it still applies today."
The technological progress we see now is largely driven by large-scale brute-force methods, such as the chess supercomputer Deep Blue and systems that play Atari games. These achievements generated tremendous excitement. But when it comes to robots for the home or robots driving down the street, the enthusiasm is far lower.
Generally speaking, self-driving systems perform well under normal circumstances; for example, they can drive safely in clear weather. Put them in a complex environment such as snow or rain, however, and their performance degrades. American journalist and contributing editor Steven Levy once wrote an article about Google's autonomous driving effort, which noted that in 2015 Google achieved a major victory: the system could automatically identify leaves.
Identifying leaves is trivial for humans, but it was a major advance for self-driving cars. A human can use common-sense reasoning to figure out what an object might be and how it got there; a self-driving system merely memorizes patterns and lacks reasoning. That is the limitation self-driving cars face.
People have long hoped for more mature autonomous driving technology. Just a few days ago, the California Public Utilities Commission approved the self-driving car companies Cruise and Waymo to operate 24/7 in San Francisco, giving both companies more room for testing. After the news was announced, many people declared that the era of self-driving cars had arrived, later than expected.
## In reality, we don't have true self-driving cars yet

As noted American journalist Cade Metz explained a few months ago on Marcus's podcast "Humans vs. Machines," every self-driving vehicle on public roads has either a human safety driver or some form of remote supervision to help the vehicle out of trouble.
Now, new edge cases are emerging in autonomous driving, such as a Tesla crashing into a parked jet airplane.
Marcus said that no matter how much data these systems are trained on, new situations will always emerge.
In one recent incident, ten self-driving cars lost contact with mission control. Without the supervision of a control center, the cars ran into trouble, getting lost and stopping in the middle of the street.
The self-driving field is constantly changing, which is why many researchers, including Marcus, are puzzled by the California Public Utilities Commission's decision.
It would be irrational to allow testing anywhere and at any time without rigorous, carefully vetted solutions for handling edge cases. This applies not only to self-driving cars but also to other fields built on machine learning.
Edge cases are everywhere, and anyone who thinks they are all easy to solve is deceiving themselves.
We need stronger regulation; without it, we may see major accidents involving driverless cars, automated doctors, general-purpose virtual assistants, home robots, and more in the next few years.
At the end of the article, Marcus noted that he finished writing it on an aircraft equipped with an autopilot. The autopilot was engaged throughout the 9-hour flight, with human pilots participating the whole time, a human-in-the-loop arrangement. Ultimately, Marcus does not expect fully autonomous planes, and he does not think any quasi-autonomous car should have been approved yet.