The National Transportation Safety Board determined that a Tesla Model S that rear-ended a fire truck was operating in Autopilot mode, and that the driver's hands were off the steering wheel when the car struck the parked vehicle.
The collision is one of several the NTSB is investigating involving the California-based EV maker's semi-autonomous Autopilot system. In this instance, the 2014 Model S hit the truck in Culver City, California, at about 30 mph.
The board plans to release a report on the cause of the crash on Wednesday, according to Reuters.
According to the NTSB, the sedan operated on Autopilot for the final 13 minutes, 48 seconds of its trip. The driver's hands were off the steering wheel for all but 51 seconds of that period, and he ignored repeated warnings from the vehicle to place his hands back on the wheel.
The sedan had been following another vehicle for a period of time; that vehicle swerved – presumably to avoid the fire truck parked on the side of I-405 – about 20 seconds before the Model S hit the truck. Instead of swerving, the Model S sped up. There were no injuries from the crash, the NTSB said Tuesday.
“I was having a coffee and a bagel. And all I remember, that truck, and then I just saw the boom in my face and that was it,” the Tesla driver said in the NTSB report.
He added that he didn’t remember exactly what he was doing at the time of the crash, suggesting he could have been changing the radio station or drinking his coffee. He was not texting or talking on his cellphone, according to the agency.
Tesla has faced scrutiny since the introduction of the Autopilot system, fueled by CEO Elon Musk’s insistence that the company’s vehicles will be capable of full self-driving by the end of this year. It’s a contention he has made repeatedly since Autopilot was introduced as a semi-autonomous driving aid several years ago, though he has repeatedly pushed back the system’s “start by” date.
The name has been a cause for concern among many safety advocacy groups, which have pushed for the company to rename the system or suspend its use altogether.
Two of the most vocal critics of Tesla Inc.’s Autopilot feature renewed their calls in June for federal and state investigations of the EV maker, claiming the company is engaging in “dangerously misleading and deceptive practices.”
Consumer Watchdog and the Center for Auto Safety once again accused the California-based electric vehicle maker and CEO Elon Musk of putting lives in peril by claiming that the Autopilot technology available on all three of Tesla’s models is a fully functional autonomous technology.
Musk has repeatedly countered that the company makes it clear Autopilot is not currently fully autonomous; however, he has said he expects it will be by the end of this year. In the company’s most recent conference call, he noted that most Tesla models on the road are capable of being fully autonomous.
However, the two groups have called for the Federal Trade Commission and the California Department of Motor Vehicles to begin investigations immediately. They contend that Tesla violated Section 5 of the FTC Act, as well as California law, because the EV maker’s descriptions and assertions about Autopilot are “materially deceptive and are likely to mislead consumers into reasonably believing that their vehicles have self-driving or autonomous capabilities.”
Tesla’s Autopilot was engaged during at least three fatal U.S. crashes, including the March 2019 crash of a 2018 Model 3 in Delray Beach, Florida, and the March 2018 crash of a Model X in Mountain View, California. Both of those crashes remain under investigation by the NTSB and the National Highway Traffic Safety Administration. The agencies are also investigating other crashes and battery fires involving Tesla vehicles, Reuters reported.