Federal regulators are asking Tesla to provide data and documents related to its Autopilot program, as a prominent magazine with a mixed history with the automaker calls for the company to disable Autopilot on all models until further notice.
The moves come on the heels of Elon Musk, CEO of the EV maker, announcing the company does not plan to shut down the Autopilot feature on its other vehicles after the death of an Ohio man who was using the feature.
Consumer Reports, which at one point gave the Model S its highest rating ever, essentially demanded California-based Tesla disable the Autopilot feature on its cars until it can reprogram it to require that drivers keep one hand on the wheel every five minutes. The publication also criticized Tesla for naming the system Autopilot, saying the name is misleading.
“By marketing their feature as ‘Autopilot,’ Tesla gives consumers a false sense of security,” said Laura MacCleery, vice president of consumer policy and mobilization for Consumer Reports. “In the long run, advanced active safety technologies in vehicles could make our roads safer.
“But today, we’re deeply concerned that consumers are being sold a pile of promises about unproven technology. ‘Autopilot’ can’t actually drive the car, yet it allows consumers to have their hands off the steering wheel for minutes at a time. Tesla should disable automatic steering in its cars until it updates the program to verify that the driver’s hands are on the wheel.”
Tesla remains steadfast in its commitment to leaving Autopilot in place for the time being.
“Tesla is constantly introducing enhancements proven over millions of miles of internal testing to ensure that drivers supported by Autopilot remain safer than those operating without assistance,” a Tesla spokesperson told TheDetroitBureau.com in an email.
“We will continue to develop, validate, and release those enhancements as the technology grows. While we appreciate well-meaning advice from any individual or group, we make our decisions on the basis of real-world data, not speculation by media.”
However, not all media outlets have been so quick to take Tesla to task for its stance; some defend the company and criticize others, like Consumer Reports, for blaming the technology. They point out that Autopilot is a beta program and, as such, is not perfect.
It should also be noted that Tesla provides plenty of warnings about the fallibility of Autopilot, encouraging drivers to remain attentive whenever they use the program. Since its introduction, Tesla has issued a standard warning regarding Autopilot.
“Tesla Autopilot functions like the systems that airplane pilots use when conditions are clear. The driver is still responsible for, and ultimately in control of, the car. This is enforced with onboard monitoring and alerts. To further ensure drivers remain aware of what the car does and does not see, Tesla Autopilot also provides intuitive access to the information the car is using to inform its actions.”
Ultimately, media “on Tesla’s side” all point out that Joshua Brown, the man killed in the accident, knew better than most about the capabilities and liabilities of the system. He’d recently had a close call using the system and was often described as a “Teslavangelist.”
Consumer Reports is particularly critical of Tesla branding the system as “Autopilot.” However, many are quick to note that jets have autopilot but still require pilots to be at the ready, as do boats with autohelm and other similar systems. In short, it’s not fair to suggest the name of the program led to Brown’s death.
“Guess what? A Tesla with Autopilot isn’t a Self-Driving Car,” writes Alex Roy in a piece for TheDrive.com. “It operates at what’s called Level 2 Autonomy, and Brown—an ex-Navy SEAL, tech executive and self-proclaimed Tesla evangelist—must have known this better than anyone.
“An ex-Navy SEAL would be more likely to understand the need for a human in the loop, especially in a Beta release. To suggest Brown was a victim of aggressive marketing is to insult a man better equipped to understand such technology than 95% of Tesla owners and 99% of journalists writing about the crash.
“Ex-Navy SEALS aren’t known for shirking personal responsibility. He would have understood that the driver remains legally responsible at all times, and yet all evidence suggests he became overconfident in the system, and paid the final price for it.”
NHTSA’s investigations unit opened a “Preliminary Evaluation” to examine the performance of the Model S Automatic Emergency Braking system and any other forward crash mitigation or forward crash avoidance systems enabled and in use at the time of Brown’s crash.
However, the agency has asked for even more information as part of its inquiry, including a list of all of Tesla’s Autopilot-capable vehicles with the VIN, model, model year, total mileage driven with Autosteer on, the total number of recorded “Hands on Wheel” Autosteer warnings, and more.
The agency also asked Tesla to provide a list of all consumer complaints, field reports, crash reports, property damage claims and lawsuits; to describe all past, present or planned analyses and studies that pertain or possibly relate to the alleged defect; and to describe any and all modifications to Tesla’s vehicles that may relate to the alleged defect.