A Tesla Model S sedan operating in Autopilot mode crashed into a parked and unoccupied police car Tuesday in Laguna Beach, California, the latest in a series of incidents linked to the carmaker’s semi-autonomous technology.
While at least two deaths have now occurred in crashes involving Autopilot, the driver in this latest incident suffered only minor injuries. Still, the crash is raising further concerns about both Autopilot and the way drivers are using – and possibly misusing – the system.
A day after the latest crash, California-based Consumer Watchdog called on the state’s regulator to investigate what it described as “dangerously misleading, deceptive marketing practices” which, the non-profit group contends, can lead Tesla owners to believe Autopilot is a fully self-driving technology.
Just a week ago, the group teamed up with the Washington, D.C.-based Center for Auto Safety to ask for a similar review by the Federal Trade Commission. “Tesla has repeatedly exaggerated the driverless capabilities of its Autopilot technology, putting profits ahead of its own customers’ safety,” declared CAS Executive Director Jason Levine.
In this week’s California accident, a Model S sedan slammed into a Laguna Beach police vehicle, causing extensive damage to the front of the Tesla, as well as to the rear of the police cruiser, as shown in photographs posted by Sergeant Jim Cota. The driver subsequently told police investigating the incident that the battery-electric vehicle was operating in Autopilot mode at the time.
The crash is the latest in a series of incidents in recent months that have involved another Model S slamming into a parked fire truck in Utah, and a Model X SUV ramming a highway barrier in suburban Los Angeles. The Utah crash resulted in a broken foot for the driver, but the motorist in the L.A. incident was killed.
That marked the second known case in which a driver was killed while a Tesla vehicle was operating under Autopilot. In May 2016, former Navy SEAL Joshua Brown died after his Model S slammed into a truck that had pulled across his lane. The National Transportation Safety Board ultimately ruled both driver and vehicle shared blame, Brown for failing to remain vigilant and ready to take control in an emergency, and the Tesla Autopilot system for failing to distinguish the white truck from a bright Florida sky.
The NTSB is currently investigating the more recent fatal crash – as well as another incident in Florida in which two teens were killed when a Model S hit a wall, its battery pack subsequently bursting into flames. Autopilot is not believed to have been involved in that crash.
Following the fatal Model X crash earlier this year, Tesla CEO Elon Musk issued a tweet blaming the driver, noting that warnings from the Autopilot system were repeatedly ignored in the moments leading up to the accident.
Referring to this week’s accident in Laguna Beach, Tesla said it has yet to confirm whether the vehicle was, in fact, operating in Autopilot mode. The company also stressed that it “has always been clear that Autopilot doesn’t make the car impervious to all accidents.”
In its owner’s manual, Tesla stresses that drivers must remain vigilant and ready to take full control of their vehicle, whether or not an alarm sounds. Autopilot, it notes, “cannot detect all objects and may not brake/decelerate for stationary vehicles or objects especially when traveling over 50 mph (80 km/h).”
Since the May 2016 crash that killed 30-year-old Brown, Tesla has been more active in explaining the limits of Autopilot’s capabilities. But critics contend that after the technology was introduced in 2015, the automaker overstated the system’s abilities. CEO Musk himself was photographed in his Model S sedan alongside his ex-wife, waving both of his hands outside the vehicle.
That was followed by a flurry of social media posts, including one on YouTube in which a driver recorded himself setting his vehicle to Autopilot and then climbing into the back seat.
It appears at least some Tesla owners continue to believe they can turn control over to Autopilot. In the Utah incident, the driver enabled the system 82 seconds before the crash and, according to a police report, took her hands off the wheel “within two seconds.” She did not retake control before her sedan rammed into a parked fire truck.
Despite concerns about its capabilities and Tesla’s marketing efforts, CEO Musk has continued to enthuse about Autopilot, both online and through other venues. During a conference call with analysts and journalists scheduled to discuss Tesla’s weak first-quarter earnings, Musk even promised to have a long-anticipated, full hands-free version of Autopilot ready by sometime in 2019.
While there are plenty of enthusiasts, a recent study by AAA found that a growing number of American motorists say they are “afraid” of self-driving vehicles, whether riding in them or encountering them as pedestrians and bicyclists. The recent spate of crashes appears to be playing a role in those concerns, including a fatal incident near Phoenix involving a prototype Uber SUV that struck and killed a woman crossing the street. Evidence indicates that the modified Volvo detected the pedestrian six seconds before the crash but failed to respond.