Tesla, which has been selling its controversial Full Self-Driving software upgrade for thousands of dollars for years, has issued a recall for all of the nearly 363,000 vehicles that use the feature. The decision comes after a US government agency warned that the software could endanger drivers in rare situations and increase the risk of crashes in everyday scenarios. While auto industry recalls typically focus on specific parts or road situations, Tesla's recall is broader in scope.
The National Highway Traffic Safety Administration (NHTSA) has stated that the Full Self-Driving software may violate local traffic laws and behave in unexpected ways in a variety of driving scenarios.
The agency's filing lists several such situations: running a yellow light that is about to turn red; failing to come to a complete stop at a stop sign; speeding, either because the car fails to detect road signs or because the driver has set a faster default speed; and making unexpected lane changes out of turn-only lanes to travel straight through an intersection. Drivers will still be able to use the feature while a software patch to fix these issues is developed.
These scenarios, the subject of the recall, appear to be linked to a design flaw that some safety experts believe has long been fundamental to Tesla's driver assistance technology: drivers are invited to rely on the software to drive the car, yet must be ready to take over at a moment's notice when it needs help.
According to Philip Koopman, an associate professor at Carnegie Mellon University who specializes in self-driving car safety, humans don't function that way. The car is designed to alert the driver with buzzing and beeping when it senses that human intervention is required, but that leaves little time to react. "This technology has a fundamental flaw," he explains. "You have a short reaction time to avoid these situations, and people aren't good at that if they're trained to think that the car does the right thing."
Koopman notes that today's recall indicates that the US government is taking small steps toward setting stricter boundaries, not just for Tesla's ambitious technology but for advanced driver assistance features from all automakers. These features are intended to make driving more enjoyable, less monotonous, and safer, but they also require car manufacturers to make difficult judgments about the limits of human attention and about how to promote and describe their technology's capabilities.
Tesla, led by CEO Elon Musk, has taken a distinct approach: avoiding government oversight, rebuking lawmakers, and in certain instances developing technology faster than regulators can keep up. According to a statement provided by NHTSA spokesperson Lucia Sanchez, the agency identified the concerns that prompted the recall through investigations linked to a probe initiated in 2022 into why cars using Tesla's Autopilot feature have a history of crashing into stationary emergency vehicles.
Tesla informed the agency this week that customers had submitted at least 18 warranty claims matching the situations highlighted by NHTSA between spring 2019 and fall 2022. According to the filing, the company said it was not aware of any injuries or fatalities linked to the defects the agency identified.
According to the NHTSA filing, Tesla did not concur with the agency's analyses but agreed to proceed with the recall anyway. The software defects will soon be corrected through an over-the-air update, so drivers will not have to take their vehicles in for servicing. Tesla did not reply to a request for comment, and it is unclear what changes the automaker will make to its Full Self-Driving feature. However, Musk, who also leads SpaceX and Twitter, tweeted that using the term "recall" to refer to the update "is anachronistic and just flat wrong!"