Tesla Inc.’s list of problems just got a little longer: another Autopilot-related crash. A Model 3 reportedly using the semi-autonomous technology hit a police car that was parked on the side of the road during a traffic stop.
The National Highway Traffic Safety Administration recently opened an investigation into 11 incidents in which Tesla vehicles, allegedly with Autopilot engaged, crashed into emergency response vehicles parked on the side of the road. These events happened while the responders were performing their duties.
The latest incident occurred in Florida where a Model 3 using the technology collided with a Florida Highway Patrol car, according to the Associated Press. The incident happened on an interstate in Orlando while the officer was out of his vehicle attempting to help with a disabled vehicle.
The police officer was not hurt; however, the drivers of the other vehicles sustained minor injuries, according to the report.
The latest investigation will examine 11 crashes that occurred in the U.S. since January 2018, including ones where vehicles allegedly operating on Autopilot struck stationary police cars and fire trucks.
“Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones,” NHTSA said in a document prepared by the Office of Defects Investigation.
“The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes.”
Autopilot relies on a mix of sensors, microprocessors and software. It was first introduced in mid-2014 and has been updated since then. The technology is designed to help keep vehicles like the Tesla Models S, X, 3 and Y in their lane, keep the vehicle safely spaced in traffic and avoid collisions.
Plenty of critics
The first crash involving Autopilot that garnered national attention happened in Florida in 2016. Former Navy SEAL Joshua Brown’s Model S collided with the side of a semi-truck trailer, killing him in the process.
NHTSA and the National Transportation Safety Board both concluded that Brown was watching a video device, rather than paying attention to the road, when his car struck the truck broadside. Regulators also determined that Autopilot failed to recognize the truck against the backdrop of a bright sky.
Since then, safety organizations and some politicians have criticized the technology, claiming the name alone suggests it’s capable of driving the vehicle without any assistance. CEO Elon Musk maintains the technology helps make the roads safer and the company goes to great lengths to tell drivers they must remain alert when behind the wheel, even when Autopilot is engaged.
Emergency responders at risk
In at least some of the 11 crashes NHTSA is examining, Tesla vehicles believed to have been operating on Autopilot struck stationary vehicles operated by emergency responders.
Radar and even camera sensors can have problems recognizing stationary obstacles, several experts told TheDetroitBureau.com. Such situations are even more complex when cones or emergency vehicles are involved, Duke University engineering professor Mary Cummings told the Wall Street Journal.
The problem is each emergency situation presents a different visual target that systems like Autopilot may not have been trained to recognize. “This is why emergency situations are so problematic,” Cummings said. “The visual presentation is never the same.”