A California man was charged with two counts of vehicular manslaughter after the Tesla he was driving with Autopilot engaged allegedly ran a red light and crashed into another car, killing two people, in 2019.
The charges, filed in Los Angeles County in October, are likely the first felony charges brought against a driver using semiautonomous technology. The driver, Kevin George Aziz Riad, 27, pleaded not guilty and is free on bail while the case is pending, according to USA Today.
Riad’s Model S was reportedly traveling at a high rate of speed when it exited a freeway, ran a red light and crashed into a Honda Civic on Dec. 29, 2019. The National Highway Traffic Safety Administration investigated the incident and determined Autopilot was in use at the time of the collision.
Tesla’s Autopilot technology has been at the center of controversy for some time now. Critics have long said the semi-autonomous driving tech’s name implies the car can drive itself, which is untrue. CEO Elon Musk has repeatedly denied this, claiming the company makes it clear in materials about the option that drivers must remain alert and ready to resume control of their vehicle at all times while it’s in use.
However, in 2020 a German court ruled the name could no longer be used there because it was misleading. More recently, the San Francisco County Transportation Authority (SFCTA) raised concerns about the safety of Tesla’s Full Self-Driving software. In addition to worries about how well it performs, the SFCTA also takes issue with the name, which it believes could mislead consumers.
“We are concerned about the safety record of this service and the name of the service as it could be confusing for consumers, and hope DMV, FTC and NHTSA continue to monitor and analyze this issue to protect consumers and the traveling public,” said Tilly Chang, executive director of the SFCTA, in late September.
Additionally, two federal safety agencies — NHTSA and the National Transportation Safety Board — have conducted multiple investigations into the technology, including an ongoing one by NHTSA.
Emergency vehicle crashes
That investigation is focused on nearly a dozen crashes in which Teslas, reportedly with Autopilot engaged, struck stationary emergency vehicles.
“Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones,” NHTSA said. “The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes.”
Radar and even camera sensors can have problems recognizing stationary obstacles, several experts told TheDetroitBureau.com. Such situations are even more complex when cones or emergency vehicles are involved, Duke University engineering professor Mary Cummings told the Wall Street Journal.
The problem is that each emergency situation presents a different visual target that systems like Autopilot may not have been trained to recognize. “This is why emergency situations are so problematic,” Cummings said. “The visual presentation is never the same.”