Tesla Recalls Nearly All Vehicles on US Roads over Lack of Autopilot Safeguards
WASHINGTON — Tesla is recalling just over 2 million vehicles in the United States fitted with its Autopilot advanced driver-assistance system to install new safeguards, after a federal safety regulator said the system posed safety concerns.
Tesla said in a recall filing that Autopilot’s software system controls “may not be sufficient to prevent driver misuse” and could increase the risk of a crash.
Acting NHTSA Administrator Ann Carlson at a U.S. House hearing on Wednesday praised Tesla for agreeing to the Autopilot recall. “One of the things we determined is that drivers are not always paying attention when that system is on,” she said.
Carlson added that after repeated reports of fatal crashes involving the use of Autopilot, the agency opened a safety probe in August 2021. “My immediate response was, ‘We have to do something about this,'” she said.
Shares of the world’s most valuable automaker were down 3.4% at $228.97 on Wednesday afternoon.
Tesla’s Autopilot is intended to enable cars to steer, accelerate and brake automatically within their lane, while enhanced Autopilot can assist in changing lanes on highways. Neither feature makes the vehicles autonomous.
One component of Autopilot is Autosteer, which maintains a set speed or following distance and works to keep a vehicle in its driving lane.
Tesla said it did not agree with NHTSA’s analysis but would deploy an over-the-air software update that will “incorporate additional controls and alerts to those already existing on affected vehicles to further encourage the driver to adhere to their continuous driving responsibility whenever Autosteer is engaged.”
The company did not respond to a question on whether the recall would be performed outside the United States or offer more precise details of the new safeguards. It is not immediately clear if China will demand a recall over the same issue.
A spokesperson for the Italian Transport Ministry had no knowledge of similar actions being taken in Italy. Regulators in Germany said they are looking into the issue.
NHTSA opened its August 2021 probe of Autopilot after identifying more than a dozen crashes in which Tesla vehicles hit stationary emergency vehicles and upgraded it in June 2022. NHTSA said it found “Tesla’s unique design of its Autopilot system can provide inadequate driver engagement and usage controls that can lead to foreseeable misuse of the system.” NHTSA reviewed 956 crashes where Autopilot was initially alleged to have been in use and focused on 322 Autopilot-involved crashes in its probe.
Bryant Walker Smith, a University of South Carolina law professor who studies transportation issues, said the software-only fix will be fairly limited. The recall “really seems to put so much responsibility on human drivers instead of a system that facilitates such misuse,” Smith said.
Separately, since 2016, NHTSA has opened more than three dozen Tesla special crash investigations in cases where driver systems such as Autopilot were suspected of being in use, with 23 crash deaths reported to date.
NHTSA said there may be an increased risk of a crash when the system is engaged but the driver does not maintain responsibility for vehicle operation, is unprepared to intervene, or fails to recognize that the system has been canceled or is not engaged.
NHTSA’s investigation into Autopilot will remain open as it monitors the efficacy of Tesla’s remedies.
The company will roll out the update to 2.03 million Model S, X, 3 and Y vehicles in the United States dating back to the 2012 model year, the agency said.
The update, which will vary depending on vehicle hardware, will include increasing the prominence of visual alerts on the user interface, simplifying the engagement and disengagement of Autosteer, and adding checks upon engaging Autosteer.
Tesla disclosed in October that the U.S. Justice Department had issued subpoenas related to its Full Self-Driving (FSD) and Autopilot systems. Reuters reported in October 2022 that Tesla was under criminal investigation over claims the company’s electric vehicles could drive themselves.
Tesla in February recalled 362,000 U.S. vehicles to update its FSD Beta software after NHTSA said the vehicles did not adequately adhere to traffic safety laws and could cause crashes.
NHTSA closed an earlier investigation into Autopilot in 2017 without taking any action. The National Transportation Safety Board (NTSB) has criticized Tesla for a lack of system safeguards for Autopilot, and NHTSA for a failure to ensure the safety of Autopilot.
Democratic U.S. Representative Jan Schakowsky said, “it’s past time to rein in Tesla’s hazardous advanced driving systems” and praised NHTSA for taking “action to protect all road users from misuse of these systems.”