Federal safety regulators have launched a broad investigation of Tesla’s semi-autonomous Autopilot system as the number of crashes — some fatal — linked to the technology continues to grow.

The National Highway Traffic Safety Administration has previously launched a number of narrower Autopilot probes, starting with one that followed a 2016 Florida crash that killed a former Navy SEAL. The latest investigation will examine 11 crashes that have occurred in the U.S. since January 2018, including several in which vehicles allegedly operating on Autopilot struck stationary police cars and fire trucks.
“Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones,” NHTSA said in a document prepared by the Office of Defects Investigation. “The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes.”
The Autopilot system, which relies on a mix of sensors, microprocessors and software, was first introduced in mid-2014. The technology is designed to help keep vehicles such as the Tesla Models S, X, 3 and Y in their lane, maintain a safe following distance in traffic and avoid collisions.
Making promises you can’t keep
But critics contend the EV maker has routinely overstated the technology’s capabilities. Shortly after the initial launch, Tesla CEO Elon Musk posted a widely seen tweet showing himself and his former wife driving off in a Model S with their hands out the window.

The latest version of the system has been promoted by Tesla as “Full Self-Driving,” even though the company’s website notes that it is “intended for use with a fully attentive driver who has their hands on the wheel and is ready to take over at any moment.”
But, like Musk, many owners have pushed well beyond what Tesla officially recommends. Numerous tweets, Instagram posts, and YouTube and TikTok videos show drivers falling asleep behind the wheel, or even climbing into the back seat, while their vehicles operate in Autopilot mode.
In the 2016 Florida crash, NHTSA and the National Transportation Safety Board both concluded that former SEAL Joshua Brown was watching a video device, rather than paying attention to the road, when his car struck a truck broadside. Regulators also determined that Autopilot failed to recognize the truck against the backdrop of a bright sky.
Emergency responders at risk
In at least some of the 11 crashes NHTSA is examining, Tesla vehicles believed to have been operating on Autopilot struck stationary vehicles operated by emergency responders.

Radar and even camera sensors can have problems recognizing stationary obstacles, several experts told TheDetroitBureau.com. Such situations are even more complex when cones or emergency vehicles are involved, Duke University engineering professor Mary Cummings told the Wall Street Journal.
The problem is that each emergency situation presents a different visual target that systems like Autopilot may not have been trained to recognize. “This is why emergency situations are so problematic,” Cummings said. “The visual presentation is never the same.”
Tesla did not respond to a request for comment on the new NHTSA probe. But Musk has frequently defended the Autopilot system, often claiming it is safer than relying on human drivers alone. He has also promised a fully hands-free version of the technology, though its rollout has repeatedly been delayed and is now not expected until at least 2022.
Entire industry could face broader scrutiny
The auto industry as a whole is rushing to bring autonomous technology to public roads. General Motors’ San Francisco-based Cruise subsidiary is one of several firms cleared to begin testing fully driverless ride-hailing vehicles. Google spinoff Waymo is among several companies operating slightly less sophisticated technology that still requires a “backup operator” to remain behind the wheel, ready to take control in an emergency.

A number of manufacturers, meanwhile, have started offering semi-autonomous systems, such as GM’s Super Cruise, capable of operating hands-free — with a driver at the ready — in carefully defined situations.
There have been numerous minor incidents involving these technologies, though only one fatality: in 2018, an Uber test vehicle struck and killed a pedestrian near Phoenix.
Critics contend Teslas have had an inordinate number of problems, worsened by the mixed message the company sends about Autopilot’s capabilities.
“Every other week”
“It seems like every other week we’re hearing about a new vehicle that crashed when it was on Autopilot,” Senate Commerce Committee Chair Maria Cantwell said during a June 2021 hearing.
NHTSA’s decision to launch the new probe was praised by various safety and consumer groups. The government agency’s “investigative actions are commendable,” said Cathy Chase, president of Advocates for Highway and Auto Safety. “They also highlight the need for minimum performance standards for vehicles with automated driving technology to protect first responders and all road users.”

But some critics feel NHTSA has been slower than necessary to put a spotlight on Tesla.
NHTSA also under fire
“Tesla has treated its customers like guinea pigs and deployed a faulty technology that can kill people with the false promise it is an Autopilot,” said Jamie Court, president of Consumer Watchdog. “We have long known that Tesla’s camera and radar mistakes signs, vehicles, and even the moon for other objects that are not there. It was dangerous to allow Tesla to deploy an unapproved technology and test it in real time on real people. Now it’s time for NHTSA to investigate how much Tesla knew about its defects and how dangerous it really is.”
The safety agency did not provide a timetable for completing the Tesla probe but it did say that it “will assess the technologies and methods used to monitor, assist, and enforce the driver’s engagement with the dynamic driving task during Autopilot operation.”
How NHTSA will respond to any defects it uncovers remains to be seen.