The inquiry into a dozen crashes in which Tesla vehicles using the company's semi-autonomous Autopilot technology struck emergency vehicles took an interesting twist when federal investigators asked other automakers with semi-autonomous tech to submit specific information.
The National Highway Traffic Safety Administration opened an investigation into Tesla’s Autopilot technology last month after Tesla vehicles reportedly using the technology crashed into 11 emergency response vehicles that were parked on the side of the road while first responders rendered aid. Since then, a 12th incident has occurred.
As part of its probe, investigators sent letters to General Motors, Ford, Toyota, Volkswagen and eight others, according to Reuters, asking questions designed to help with a comparative analysis.
The agency is looking for specific information or data from companies with “production vehicles equipped with the ability to control both steering and braking/accelerating simultaneously under some circumstances.”
Scope of the query
NHTSA asked the 12 automakers to report any crashes during which an advanced driver assistance system was engaged at “anytime during the period beginning 30 seconds immediately prior to the commencement of the crash.”
Additionally, according to Reuters, the agency wants details about how each system detects if a driver is engaged or paying attention while the semi-autonomous tech is in use. It also wants to know how the systems detect the use of emergency response vehicles.
The collisions, which began in January 2018, have caused injuries and have raised concerns among safety experts. The agency’s investigation covers more than 765,000 Tesla vehicles built in the U.S. between 2014 and 2021, Reuters reported.
“Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones,” NHTSA said in a document prepared by the Office of Defects Investigation. “The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes.”
The Autopilot system — which relies on a mix of sensor devices, microprocessors and software — was first introduced in mid-2014. The technology is designed to help keep vehicles like the Tesla Model S, X, 3 and Y in their lane, keep the vehicle safely spaced in traffic and avoid collisions.
The probe comes at a tough time for Tesla, which just updated its Full Self-Driving beta from version 9.2 to 10. The initial response from some using the system was positive, with users noting it handled certain situations, such as making a left turn from the turning lane at an intersection, more like a real person.
“Check it out! FSD Beta 10 makes an unprotected left turn against cross traffic from both directions even though tall hedges block the view on both sides. How??” asked one person on Twitter, adding, “By creeping forward just like human drivers do.”
Musk quickly tweeted back that the next version of the program, Beta 10.1, “will creep forward with more confidence & quickly reverse back a little (just as a person would) if it sees danger.”
Not everyone’s experience was as confidence inspiring, although one YouTuber, according to Engadget, said the system finally managed to wend its way down Lombard Street, the famously twisty street in San Francisco, without needing the driver to take over.
However, another video showed a driver a little spooked when his vehicle headed toward a parked car; he said he was “very uncomfortable so far. Ok, we didn’t hit anything but that was insane.”
Overall, reviews suggest the newest beta is a step forward from the previous version, with “confident” seemingly the descriptor of choice for many, but not a radical leap toward a final solution.