The National Transportation Safety Board’s finding Wednesday that the design of Tesla Inc.’s Autopilot “permitted the driver to disengage” from driving the vehicle inflamed critics of the technology, who renewed their call for its ban.
“NTSB has done its job by thoroughly investigating this technology and this crash,” the Center for Auto Safety said in a statement. “The National Highway Traffic Safety Administration must also now do its job and recall these vehicles.”
The safety group has repeatedly called on NHTSA to ban the technology and force Tesla to recall the vehicles, and it is pushing on several fronts to get Autopilot off the road.
“Last month, the Center for Auto Safety again highlighted for the Federal Trade Commission Tesla’s deceptive use of the term Autopilot, which encourages exactly the sort of overreliance seen in this crash,” the group noted.
“Autopilot has already resulted in avoidable deaths, injuries, and crashes, yet NTSB’s previous recommendations to NHTSA were met with silence, and the FTC has yet to act.”
The sedan had been following another vehicle for a period of time; that lead vehicle swerved, presumably to avoid the fire truck parked on the side of I-405, about 20 seconds before the Model S hit the truck. Instead of swerving as well, the Tesla sped up. There were no injuries in the crash, the NTSB said Tuesday.
While the CAS focused on Autopilot’s impact on the incident, the report offers a more even-handed analysis of events.
The NTSB determined the probable cause for the crash was the Tesla driver’s lack of response to the fire truck parked in his lane, due to his inattention and overreliance on the car’s advanced driver assistance system; the Tesla’s “Autopilot” design which permitted the driver to disengage from the driving task; and the driver’s use of the system in ways inconsistent with guidance and warnings from Tesla.
That’s not to suggest the NTSB hasn’t been critical of the technology before. It previously criticized Tesla’s Autopilot after a 2016 fatal crash in Florida. The NTSB noted in its investigation that Autopilot allowed the driver to keep his hands off the wheel for more than 13 minutes of the 14-minute trip.
The collision is one of several the NTSB is investigating involving the California-based EV maker’s Autopilot system, which is a semi-autonomous driving technology. In this instance, the 2014 Model S hit the truck in Culver City, California, at about 30 mph.
“I was having a coffee and a bagel. And all I remember, that truck, and then I just saw the boom in my face and that was it,” the Tesla driver said in the NTSB report.
He added that he didn’t remember exactly what he was doing at the time of the crash, suggesting he could have been changing the radio station or drinking coffee. He was not texting or talking on his cellphone, according to the agency.
That makes no difference to the Center for Auto Safety.
“Put simply, a vehicle that enables a driver to not pay attention, or fall asleep, while accelerating into a parked fire truck is defective and dangerous. Any company that encourages such behavior should be held responsible, and any agency that fails to act bears equal responsibility for the next fatal incident,” the CAS said.