Safety, rather than the rush to be first to market, must be fundamental to the development of autonomous vehicle technology, according to U.S. Transportation Secretary Anthony Foxx.
The government will move ahead with plans to release new guidelines covering the development of self-driving vehicles later this summer, Foxx said during a speech in San Francisco. But regulators are clearly keeping in mind the series of collisions that have involved Google autonomous vehicle prototypes and the May 9 fatal crash of a Tesla Model S being driven in semi-autonomous Autopilot mode.
“We want people who start a trip to finish it,” Foxx said. And while “autonomous doesn’t mean perfect,” he cautioned that “we need industry to take the safety aspects of this very seriously.”
The National Highway Traffic Safety Administration recently began holding hearings on the proposed autonomous vehicle guidelines, and while there has been strong support for the development of the technology, skeptics have also spoken up about the potential risks.
That was highlighted by a demonstration staged to coincide with Foxx’s appearance in San Francisco, when the non-profit group Consumer Watchdog hired a white truck to circle the conference hall bearing the sign, “Tesla, don’t hit me.”
The group has called the Autopilot failure a “poster child” for why regulators need to take a stricter, go-slow approach to introducing autonomous technology. Consumer Watchdog, like Consumers Union, the publisher of Consumer Reports magazine, has demanded Tesla disable the Autopilot system.
The California-based electric carmaker has acknowledged that the Autopilot system on 40-year-old Joshua Brown’s Model S didn’t react properly when a semi-truck pulled in front of it on a Florida highway in May. The car’s camera couldn’t distinguish the white truck from a bright sky, and the radar system mistook the truck for an overhead sign.
Tesla CEO Elon Musk has since said that the company is working with German supplier Bosch to update the system. He has, however, rejected the calls to disable Autopilot.
But the Florida crash – and two others potentially linked to Autopilot, as well as a series of more minor collisions involving Google cars – has heightened concerns about the pace of the roll-out of advanced vehicle technologies, particularly those designed to take on some or all driving duties.
One challenge is that the latest semi-autonomous systems still require a driver to be ready to take over in the event of a malfunction. There is some evidence that Brown may have been watching a video rather than keeping his eyes on the road, and there are numerous reports of other Tesla owners ignoring the company’s guidelines.
Carmakers and suppliers have to assume that some motorists will “push the limits of what the manufacturer intends,” said Foxx, and they must take steps to minimize such risks.
Foxx said both regulators and manufacturers need to be “in sync,” in order to “assure consumers that the vehicles that they are getting into are stress-tested.”
How soon self-driving vehicles will be on the road is far from certain. Semi-autonomous technologies, such as Tesla’s Autopilot and Nissan’s ProPilot, are just beginning to roll into production. Nissan has said it hopes to have 10 different autonomous models ready to go by 2020, though many observers believe it will take longer to bring fully self-driving vehicles into the real world.
For his part, Foxx said “some variations” of autonomous technology will be widely available during the next decade, though it could take “a couple decades, maybe more,” before such systems become the norm.