Long before he became the new boss of Twitter, Musk was obsessed with making Tesla cars self-driving. The technology was quite expensive to develop, so when the supply chain began to collapse two years ago, Musk was determined to reduce costs. He targeted car radar sensors.
The radar sensor is designed to detect hazards at a distance and keep the vehicle from hitting other cars. Each Tesla also carries eight cameras that see the road and spot hazards in every direction. Musk thought that should be enough.
But according to multiple former employees, many Tesla engineers were shocked by this. They contacted a trusted former executive to try to persuade Musk to abandon this approach. Without radar, Tesla's vehicles would be prone to low-level perception errors when the camera is obscured by raindrops or even bright light, which could lead to a crash.
However, Musk was unconvinced and overruled the engineers. In May 2021, Tesla announced it would remove radar from new cars; soon after, the company began disabling radar in cars already on the road. After suddenly losing a critical sensor, Tesla vehicles crashed and made other embarrassing mistakes significantly more often, according to interviews with more than a dozen former employees, test drivers, safety officials and other experts.
Musk describes Tesla’s Full Self-Driving (FSD) technology as “the major difference between a Tesla being worth a lot of money or being worth essentially nothing,” but his self-driving dream has clearly run into obstacles.
Tesla has recalled and paused the rollout of the technology to eligible vehicles in recent weeks over concerns that its vehicles could violate speed limits and run through stop signs, according to U.S. officials. Customer complaints have piled up, including a lawsuit filed in court last month that said Musk exaggerated the technology's capabilities. Tesla's filings also show that regulators and government officials are scrutinizing Tesla's systems and its past statements as evidence of safety issues mounts.
In interviews, former employees who worked on Tesla’s driver-assistance software attributed the company’s troubles to cost cuts, the breakneck pace of development, Musk’s decision to drop radar (a departure from industry practice) and other issues unique to Tesla. Musk’s erratic leadership style also played a role, they said, forcing them to develop the technology at high speed and push it to the public before it was ready. Some say that even today they worry the software is not safe enough for use on public roads.
John Bernal, a former test operator in Tesla's Autopilot department, said: "Internally the system is progressing very slowly, but the public wants the company to release it as soon as possible." Bernal was fired in February 2022 after Tesla accused him of improperly using the technology when he released a video of FSD.
Musk acquired the troubled social media platform Twitter last fall with great fanfare and mobilized dozens of Tesla engineers to help work on Twitter’s code, according to people familiar with the matter. Earlier this month, Tesla shares fell 6% after the company failed to announce major new products at its investor day.
Musk defended Tesla’s actions, saying it was a long-term bet that promised to unlock huge value. Tesla also said that vehicles with FSD software activated are at least five times less likely to be involved in a crash than vehicles driven normally. Musk and Tesla did not respond to repeated requests for comment.
But FSD’s story provides a vivid example of how one of the billionaire’s biggest bets became tangled in rash decision-making, a stubborn insistence on doing things differently and an unwavering faith in an as-yet-unproven vision.
Patchwork solutions created the illusion of progress
In April 2019, at a presentation called "Autonomy Investor Day," Musk made perhaps his boldest prediction as Tesla CEO. He told investors: "By the middle of next year, we will have more than 1 million Tesla vehicles on the road equipped with full self-driving hardware. Our software will be updated automatically over the air, and FSD will be so reliable that drivers will even be able to sleep in the car."
Investors were thrilled, and Tesla's stock soared in 2020, making it the world's most valuable automaker and helping Musk become the world's richest man. Autopilot, launched in 2014, let cars drive themselves on highways, steering, changing lanes and adjusting speed automatically. FSD aims to bring those capabilities to city and residential streets, a far more difficult task.
Achieving this requires combining hardware and software: eight cameras capture real-time footage of activity around the car, allowing it to assess hazards such as pedestrians or cyclists and react accordingly. To deliver on his promise, Musk assembled a team of star engineers willing to work long hours and stay up late solving problems. Musk tested the latest software on his own car and, along with other executives, filed "fix-it" requests for the engineers.
Some ex-employees said the patchwork of solutions gave the illusion of continued technological progress but masked the lack of a coherent development strategy. While rivals such as Alphabet's self-driving car Waymo adopted strict testing protocols that limited the scope of its self-driving software, Tesla ultimately rolled out FSD to its 360,000 owners and left it to them to decide whether to activate it.
Tesla’s philosophy is simple: the more data the AI guiding the car is exposed to, the faster it learns. But this rough-and-ready model also means looser safety guardrails. Former Tesla employees say the company chose to let the software effectively learn on its own, developing brain-like agility through "neural networks" rather than hand-written rules. While this can speed up training, it is ultimately a trial-and-error approach.
Competitors such as Waymo and Apple have taken different approaches to autonomy, setting explicit rules for the software and fixing any violations of those constraints, according to Silicon Valley insiders familiar with the companies' practices. Companies developing self-driving technology also typically use sophisticated lidar and radar systems, which help the software map its surroundings in detail.
Waymo spokesperson Julia Ilina said there are clear differences between the two companies' practices: Waymo's goal is full autonomy, and it emphasizes machine learning. Apple declined to comment.
Tesla’s approach has proven problematic many times. About two years ago, someone posted a video of the software struggling to navigate San Francisco's winding Lombard Street, and the video garnered tens of thousands of views. Bernal revealed that Tesla engineers built invisible barriers into the software, similar to bumpers in a bowling alley, to help the car stay on the road. A subsequent video shows the software running smoothly.
This confused Bernal. As an in-house tester, driving that stretch of road was part of his job, and the smooth run was clearly far from his typical experience on other public streets.
Radar originally played an important role in the design of Tesla vehicles and software, complementing cameras by providing a realistic view of the surrounding environment, especially in situations where vision may be obstructed. Tesla also uses ultrasonic sensors, which are short-range devices that can detect obstacles within a few centimeters around the car.
Even with radar, Tesla vehicles were less sophisticated than rivals' cars that use lidar. As one expert put it: "One of the key advantages of lidar is that it will always spot a train or truck ahead, even if it has no idea what it is. It knows there is something there, and the vehicle can stop in time without knowing anything more."
To be effective, the cameras need to understand what they are seeing. Tesla relies on workers to label the images its vehicles record, including stop signs and trains, so the software learns how to react.
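The labeling workflow described above can be sketched roughly like this. This is a toy illustration only: the record structure and field names are invented for the example and are not Tesla's actual tooling.

```python
from dataclasses import dataclass

# Hypothetical per-frame annotation of the kind human labelers produce
# for a vision-only system; field names are invented for illustration.
@dataclass
class LabeledFrame:
    frame_id: str
    boxes: list  # (class_name, x, y, w, h) bounding boxes in pixels

def class_counts(frames):
    """Tally labeled examples per object class -- a rough proxy for
    how well-covered each class is in the training data."""
    counts = {}
    for frame in frames:
        for cls, *_ in frame.boxes:
            counts[cls] = counts.get(cls, 0) + 1
    return counts

frames = [
    LabeledFrame("f001", [("stop_sign", 120, 40, 32, 32)]),
    LabeledFrame("f002", [("train", 10, 50, 300, 120),
                          ("stop_sign", 400, 60, 30, 30)]),
]
print(class_counts(frames))  # {'stop_sign': 2, 'train': 1}
```

In a real pipeline, tallies like this are one way teams spot under-represented classes (rare objects the network has seen too few examples of) before training.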
Former Tesla employees said that at the end of 2020, Autopilot staff turned on their computers to discover that workplace-monitoring software had been installed. It tracked keystrokes, mouse clicks and their image-labeling work. If the mouse stayed idle too long, a timer started, and employees could be reprimanded or even fired.
Last month, a group pushing for unionization at Tesla’s Buffalo plant raised concerns about workplace surveillance, and Tesla issued a response. The company said: "The reason for time monitoring of image tagging is to improve the ease of use of our tagging software. Its purpose is to calculate how long it takes to tag an image."
Musk championed a "vision-only" approach to navigation because it is simpler, cheaper and more intuitive. In February 2022, he wrote on Twitter: "The road system is designed for cameras (eyes) and neural networks (brains)."
But many believe there are risks with this approach. A former Tesla Autopilot engineer said: "I just know that it is unsafe to use that software on the street. You can't predict what the car will do."
Removing radar led to an increase in crashes
These former employees said the problems were noticed almost immediately after Tesla announced the removal of the radar in May 2021. During this time, the FSD testing program expanded from thousands to tens of thousands of drivers. Suddenly, Tesla vehicles were allegedly stopping for imagined dangers, misreading road signs and even failing to detect obstacles such as emergency vehicles, according to complaints filed with regulators.
Some attribute the rise in "phantom braking" incidents in Tesla vehicles to the lack of radar. Data from the U.S. National Highway Traffic Safety Administration (NHTSA) shows that traffic accidents involving Tesla vehicles surged last year. Complaints about "phantom braking" rose to 107 over three months, compared with 34 in the preceding 22 months. At one point NHTSA received about 250 complaints about the issue in a two-week period, and the agency launched an investigation after logging 354 related complaints over nine months.
Several months ago, NHTSA launched an investigation into Autopilot over about a dozen reports of Teslas crashing into stationary emergency vehicles. The latest example came to light this month, when the agency confirmed it was investigating a fatal crash in February involving a Tesla and a fire truck. Experts say radar can double-check what cameras see, since cameras are easily affected by bright light.
Missy Cummings, a former NHTSA senior safety adviser, said: "This is not the only reason Tesla vehicles are in trouble, but it is an important one. Radar helps detect the object ahead. For computer vision, which is prone to large errors, it serves as a second sensor that can be fused in to check whether something is wrong."
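The cross-check Cummings describes can be illustrated with a toy sensor-fusion sketch. The tolerance value and conservative-fallback logic below are invented assumptions for the example, not any automaker's actual method:

```python
def fused_distance(camera_m, radar_m, tolerance_m=5.0):
    """Toy cross-check between two distance estimates (meters).
    If the camera's estimate disagrees with radar by more than a
    tolerance, fall back to the more conservative (closer) reading
    and flag the disagreement for the planner to treat cautiously."""
    disagreement = abs(camera_m - radar_m) > tolerance_m
    distance = min(camera_m, radar_m) if disagreement else camera_m
    return distance, disagreement

# Camera washed out by glare reports 80 m; radar reports 22 m.
print(fused_distance(80.0, 22.0))  # (22.0, True)

# Estimates that roughly agree: trust the camera's reading.
print(fused_distance(50.0, 48.0))  # (50.0, False)
```

The point of the sketch is the redundancy: a second, physically independent sensor lets the system notice when vision is badly wrong, which is exactly what is lost when radar is removed.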
Acting as the de facto chief tester, Musk also demanded frequent bug fixes, asking engineers to step in and adjust the code. One former executive recalled an engineer on the project telling him: "No one can come up with a good idea when they're being chased by a tiger." Former employees say this bred a culture of conformity: Tesla fired employees who pushed back against Musk. The company also pushed out so many software updates that in late 2021 NHTSA publicly warned Tesla not to release fixes without a formal recall notice.
Tesla and Twitter employees said Musk's acquisition of Twitter was a distraction. Interviews with former employees and documents show that after the deal closed last year, Musk pulled dozens of Tesla engineers in to help run Twitter. Software updates that were supposed to ship every two weeks suddenly arrived months apart as Tesla worked through bugs and chased more ambitious goals.
Some lamented Musk's Twitter takeover, saying he needed to refocus on Tesla to finish what he started. Tesla investor Ross Gerber said: "FSD bodes well for Tesla's bright future. We love Musk; he is an innovator of our time. We just want to see him fully recommitted to Tesla."
An uncertain future and multiple investigations
Exhausted Tesla engineers are resigning to look for opportunities elsewhere. Tesla AI director Andrej Karpathy took a month-long sabbatical last year and then left to join OpenAI, the company behind the chatbot ChatGPT. Meanwhile, Tesla Autopilot director Ashok Elluswamy has gone to work at Twitter.
As part of the ongoing investigation, the U.S. Department of Justice has requested documents related to FSD from Tesla. The U.S. Securities and Exchange Commission (SEC) is also looking into Musk's role in promoting Tesla's autonomous driving as part of a larger investigation.
The lawsuit filed in February accused Tesla of making "false and misleading" statements that "significantly exaggerated" the safety and performance of Autopilot and FSD. That is on top of NHTSA's two Autopilot investigations: one into crashes with stationary emergency vehicles and another into "phantom braking."
At this month’s investor day event, Musk appeared on stage with a dozen Tesla executives to tout the company’s deep bench of expertise. But the company announced no major progress on FSD, despite devoting a section of the event to the technology.
Many of Musk’s loyal customers have given up hope that his original promises will come true. Charles Cook, a commercial pilot and engineer from Jacksonville, Florida, owns a Model Y that he frequently drives with FSD activated.
While Cook is amazed by the technology's capabilities, he is unhappy with its slow progress and the delays in delivering on Musk's promises. He said: "Some people spent their hard-earned money on FSD thinking they would get a fully self-driving taxi, an idea Musk's own engineers may now scoff at. Some probably paid $15,000 thinking they'd have it the next year, and now they're disappointed."