Feds Will Investigate Deadly Tesla Crash in California
GARDENA, Calif. — The National Highway Traffic Safety Administration is investigating the crash of a speeding Tesla that killed two people in a Los Angeles suburb, the agency announced Tuesday.
Agency spokesman Sean Rushton wouldn’t say whether the Tesla Model S was on Autopilot when it crashed on Dec. 29 in Gardena. That system is designed to automatically change lanes and keep a safe distance from other vehicles.
The black Tesla had left a freeway and was moving at a high rate of speed when it ran a red light and slammed into a Honda Civic at an intersection, police said.
A man and woman in the Civic died at the scene.
A man and woman in the Tesla were hospitalized with non-life-threatening injuries. No arrests were immediately made.
An NHTSA statement said the agency has assigned its special crash investigation team to inspect the car and the crash scene. That team has inspected a total of 13 crashes involving Tesla vehicles that the agency believed were operating on the Autopilot system. Results were published in two of those cases, one of which involved Autopilot; results are pending in the other 11, the agency said.
Messages were left Tuesday night seeking comment from Tesla.
Another Tesla crash killed a woman Sunday in Indiana. State police said the driver, Derrick N. Monet, 25, of Prescott Valley, Arizona, was seriously injured after he rear-ended a fire truck parked along Interstate 70 in Putnam County. His wife, Jenna N. Monet, 23, was pronounced dead at a hospital.
Derrick Monet told investigators he regularly uses his Tesla’s Autopilot mode, but didn’t recall whether he had it activated at the time of the accident, state police Sgt. Matt Ames said.
Earlier this month, a Tesla struck a police cruiser and a disabled vehicle in Connecticut but nobody was seriously hurt. The driver told state police that he was using the Autopilot system and had looked around to check on his dog in the back seat.
Both Tesla and the NHTSA have advised that advanced driver assist systems such as Autopilot aren’t fully autonomous and require human drivers to pay attention at all times. But several crashes, some of them fatal, have been blamed on driver inattention linked to overconfidence in such systems, which the National Transportation Safety Board referred to in one crash report as “automation complacency.”
The National Transportation Safety Board has criticized Tesla’s Autopilot. In September, the agency said that in a 2018 crash in Culver City in which a Tesla hit a fire truck, the design of the Autopilot system “permitted the driver to disengage from the driving task.” Nobody was hurt in that crash.
The NTSB determined in September 2017 that design limitations of the Tesla Model S Autopilot played a major role in a fatal May 2016 crash in Florida in which the car was operating under Autopilot. But it blamed the crash on an inattentive Tesla driver’s overreliance on the technology and a truck driver who made a left turn in front of the car.