(This story has been updated with additional comments by Tesla to a query by TheDetroitBureau.com)
Federal safety regulators are investigating the death of a 40-year-old man killed in the crash of his Tesla Model S while the battery-electric vehicle was operating in semi-autonomous Autopilot mode.
Word of the May 7 crash in Williston, Florida, was just released as the National Highway Traffic Safety Administration said it would begin a preliminary investigation into the crash, which occurred when the system failed to prevent a collision with a tractor-trailer that turned in front of the luxury car.
Tesla issued a statement emphasizing that this was the first known fatal crash involving the Autopilot system, which lets motorists operate hands-free on limited-access highways. There have been a number of more minor crashes involving other autonomous vehicle prototypes, including nearly 20 reported by Google, a leader in the field.
“This is the first known fatality in just over 130 million miles where Autopilot was activated,” a statement on the Tesla website said. “Among all vehicles in the U.S., there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.”
(New study says three of four American motorists want autonomous vehicles. Click Here for more.)
The automaker’s response has generated some criticism on the Internet among those who felt it was insensitive to the situation and more focused on defending Tesla’s technology.
Since releasing the Autopilot system last year for both the Model S battery sedan and Model X SUV, Tesla has issued at least one update intended to limit what drivers can do with the technology. The system is intended to permit motorists to take their hands off the wheel while remaining alert in case they are required to intervene. In videos posted online after Autopilot’s release, at least one driver was seen climbing into the back seat.
The Tesla post provided preliminary details of the crash, noting that, “Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S.”
NHTSA briefly launched an investigation into a reported problem with the rear axle of another Tesla last month but quickly absolved the maker when it received more details about the problem. In announcing the new probe, the agency said it “should not be construed as a finding that (NHTSA) believes there is either a presence or an absence of a defect in the subject vehicles.”
It is not yet clear how long the investigation will take.
In its website statement, Tesla pointed out that the technology is still in a “public beta phase.”
“When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot is an assist feature that requires you to keep your hands on the steering wheel at all times, and that you need to maintain control and responsibility for your vehicle while using it.”
Nonetheless, the system has generated a mix of praise and caution since its release last year. Skeptics have compared it to the often glitchy software initially released for smartphones and computers, which delivers unexpected surprises until it undergoes several updates.
While a number of manufacturers have taken to public roads with both semi- and fully autonomous prototypes, they generally require trained operators to be behind the wheel and at the ready to take over at a moment’s notice.
Even then, Google has experienced a number of minor crashes with its test vehicles. The latest publicly announced was also the first blamed on the prototype. It failed to yield to a bus while attempting to merge into traffic. There were no injuries.
Many experts believe autonomous vehicles will become commonplace in the next decade, with Nissan, for one, planning to have as many as 10 different models in production by 2020. And a growing number of automakers are adding advanced semi-autonomous systems to their 2017 models.
Mercedes-Benz, for example, is billing its new ’17 E-Class sedan as “the most intelligent vehicle in the world,” with the ability to pass slower vehicles on limited-access highways without the driver’s hands on the wheel. The automaker said it has intentionally limited hands-free operation to no more than about 30 seconds at a time, in part due to concerns about regulatory guidelines.
And it has not activated other features “because we feel we may be pushing (too far),” said Bart Herring, a senior product manager for Mercedes-Benz USA. He described the situation as a “gray zone.”
(Automakers enter “gray zone” as they roll out semi-autonomous technologies. For more, Click Here.)
NHTSA Administrator Mark Rosekind is an open proponent of the long-term safety potential of autonomous vehicles, as is Transportation Secretary Anthony Foxx. The government is in the midst of creating rules governing the roll-out and operation of hands-free vehicles that it hopes to issue before the end of this year.
How the fatal crash of the Tesla Model S might affect that process is uncertain.
The driver, 40-year-old Joshua D. Brown of Canton, Ohio, had previously credited the Autopilot system with preventing a crash on an interstate. The former Navy SEAL was a big fan of the vehicle, which he had nicknamed “Tessy.”
The driver who was operating the truck involved in the crash, 62-year-old Frank Baressi, told the Associated Press in an interview that Brown was “playing Harry Potter on the TV screen” when the crash occurred. Tesla told the wire service that the Model S does not allow a movie to be shown on the oversized front screen that operates most of the vehicle’s functions.
“Today’s Autopilot features are designed to provide a hands-on experience to give drivers more confidence behind the wheel, increase their safety on the road, and make highway driving more enjoyable. Autopilot is by far the most advanced such system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility,” Tesla said in response to a request for comment by TheDetroitBureau.com.
(New study sees 21 million autonomous vehicles on the road by 2035. Click Here for the latest.)
2 responses to “Tesla Driver Dies While Car Operating on AutoPilot”
My questions are:
How did the truck driver know Mr. Brown was playing Harry Potter on the screen?
What are the actual details of the accident? Was the truck driver changing lanes, or did he make a left-hand turn from the right lane?
Why is the Auto-Pilot system being deployed when it’s still in beta form? It is not like the next version of MS Office or a video game; it’s a friggin’ auto-pilot system for a vehicle!!!
Why are they saying the person needs to be alert and keep their hands on the wheel when it is in auto-pilot? Beta system or not, don’t they know people are going to take auto-pilot literally? The reason they will use auto-pilot is to not have their hands on the wheel and not pay 100% attention, as they are supposed to when driving.
If the auto-pilot cannot sense a vehicle cutting in front of it, coming too close to its side, or coming too close to its front, it’s not even ready for beta testing in closed settings, let alone on public roads – do they not know this?
Maybe the movie was still playing on the PDP after the crash – it’s probably a long movie anyway ☺
Maybe he can get a payday out of Tesla who knows?