Tesla Crashes Highlight ‘Black Box’ Challenge for Investigations
Two collisions involving Tesla Inc. vehicles left three people dead on Sunday, and police in Indiana and California are still trying to answer an unfamiliar question: who, or what, was in control at the time of each crash?
A Tesla Model S ran a red light shortly after midnight in Gardena, California, and slammed into a Honda Civic, killing the two people inside that car. Hours later, a Tesla Model 3 struck a fire truck parked across the left-hand lane of an Indiana highway, killing a woman and injuring her husband.
Police have yet to determine whether the drivers were using Autopilot, Tesla’s suite of automated driver-assist features that has drawn the interest of federal safety officials, or whether they were driving manually. They say they expect to learn the answer in the coming days.
The distinction will be important in determining the cause of the crashes and in answering questions regulators have posed about the industry’s rapid shift to vehicles that take over some — or all — of the functions of a driver.
Conventional vehicles have for years been outfitted with so-called event data recorders, “black boxes” that log critical information needed to piece together a crash sequence, such as braking and airbag deployment, and whose data can be downloaded with widely available tools.
Data detailing the performance of automated driving technologies in the seconds before a crash, however, are accessible only to vehicle manufacturers. That means investigators seeking to understand whether an automated system was active during a crash must turn to the company that built the car.
“We should not ever have to rely on the manufacturer to translate this sort of data, because they are potentially liable for product defects and they have an inherent conflict of interest,” said Sam Abuelsamid, principal analyst for Navigant Research in Detroit.
He said a standard method should be in place to determine which control modes were active before and during a crash.
“As we deploy more vehicles with these partial automation systems, it should be mandatory to record information about the state of automation and the driver,” he said.
After a 2016 Tesla crash, the National Transportation Safety Board called for the Transportation Department to define what data should be collected.
“As more manufacturers deploy automated systems on their vehicles, to improve system safety, it will be necessary to develop detailed information about how the active safety systems performed during, and how drivers responded to, a crash sequence,” the NTSB warned in a report on the crash. “Manufacturers, regulators, and crash investigators all need specific data in the event of a system malfunction or crash.”
Tesla did not respond to emails seeking comment. The company stresses that drivers are ultimately responsible for controlling their vehicles while using Autopilot, and that they must remain attentive with their hands on the wheel at all times. The company has also pushed back forcefully against criticism that the system is unsafe, often pointing to quarterly data it releases that the company says show drivers using Autopilot are safer than those driving without it.
The National Highway Traffic Safety Administration said it was reviewing the crash in California. An agency spokesman declined to comment on whether Autopilot was suspected in that crash.
Raul Arbelaez, vice president of the Insurance Institute for Highway Safety’s Vehicle Research Center, said the inability to readily access that data impedes researchers’ ability to understand how automated driver aids are performing in the field, especially in the less-severe crashes that account for the vast majority of traffic collisions.
“How do people interact with these technologies? What conditions do they tend to work in? Do they work really poorly at night with snow and rain, or are they excellent under those conditions?” he said. “I’m sure it is very useful for the auto manufacturers to help them improve their products down the line, but in terms of understanding how the current fleet is performing, we really don’t have access to that in these crashes that are happening very quickly without working with the manufacturers.”
That was the case in 2016, when the NTSB and NHTSA probed the first fatal crash involving Tesla’s Autopilot system. In that collision, the driver of a 2016 Tesla Model S was killed when his car drove under a semitrailer crossing a highway, shearing off the Tesla’s roof.
The vehicle involved in that Florida crash didn’t have a traditional event data recorder that could be read with widely available tools, the NTSB said. Instead, the car itself collected substantial data, which Tesla provided to the NTSB and which revealed that the driver was using Autopilot at the time of the crash.
And even if the Tesla had been equipped with an event data recorder, the 15 data parameters required by the 2006 rule governing the devices “are inadequate to comprehend even the simplest questions of who/what controlled an automated vehicle at the time of a crash,” the NTSB wrote in the report detailing its investigation.
Since at least 2018, Tesla vehicles have carried event data recorders, and the company has offered the public tools to download crash data.
The NTSB called on the U.S. Transportation Department to define what data parameters are needed to understand the automated vehicle control systems involved in a crash, a recommendation that has yet to be addressed, according to an agency spokesman.
The NTSB is investigating other Tesla crashes that occurred while Autopilot was in use, including a fatal March 2018 collision in Mountain View, California. NHTSA, meanwhile, has opened probes into 13 crashes it believes may have occurred while drivers were using Autopilot, including a Dec. 7 crash in Connecticut in which a Tesla rear-ended a parked police cruiser.
The number of inquiries the agency has opened shows it is taking an interest in how the new technology is being used in the field, said Frank Borris, a former director of NHTSA’s Office of Defects Investigation. Borris said the role of agency crash investigators is to gather real-world crash examples to better understand areas of potential traffic safety risk, including those posed by new technologies.
Borris said he has long been concerned that Tesla’s decision to dub its system “Autopilot” could lead drivers to rely on it too heavily and in scenarios it is not designed to handle. He also said little empirical data exists documenting the real-world performance of Autopilot and other automated driver-assist technologies.
Meanwhile, investigators’ need for data is likely to grow more urgent as automated systems become more widespread and more advanced.
In April 2019, during an Autonomy Day event for investors at the company’s Palo Alto, California, headquarters, Tesla Chief Executive Elon Musk said the company would deploy its “first operating robotaxis,” with no one in them, “next year.”
During an earnings call in October, Musk said the company should be able to upload the software enabling a Tesla to become a robotaxi “by the end of next year.”
He clarified that “acceptance by regulatory authorities will vary by jurisdiction. But that transition, that sort of flipping the switch from a car that is from not robotaxi to robotaxi, I think, will probably be the biggest step change increase in asset value in history by far.”
–With assistance from Dana Hull.