The video chronicling the fatal accident between an Uber vehicle in autonomous test mode and a woman crossing a street with her bike has opened up a Pandora’s Box of questions for the ride-sharing giant and other companies looking to develop self-driving vehicles.
The first is, of course, how did the system fail to recognize her and act to avoid the collision? The vehicle had plenty of time to react but did nothing, eventually running over Elaine Herzberg as she walked her bike across four lanes of traffic at night in Tempe, Arizona.
“The video is disturbing and heartbreaking to watch, and our thoughts continue to be with Elaine’s loved ones,” Uber said in a statement. “Our cars remain grounded, and we’re assisting local, state and federal authorities in any way we can.”
Uber’s Volvo XC90 was traveling at 38 mph in a 35-mph zone when the woman appears in the video footage released by the Tempe Police Department. Initially, Tempe Police Chief Sylvia Moir exonerated the vehicle and its backup driver, saying, “it’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how [the pedestrian] came from the shadows right into the roadway.”
However, the department later released a statement saying its role isn’t to determine fault. The National Transportation Safety Board and the National Highway Traffic Safety Administration are working with the Tempe Police Department on the investigation.
The footage shows she was outside any crosswalk, but the Uber system, which is supposed to be better than humans, missed a roughly 5-foot-tall by 6-foot-wide object moving in front of it. The company has never revealed specifically how its autonomous system works, but it uses a combination of cameras, radar and lidar to “see” its surroundings. While the cameras would be largely ineffective at night, the radar and lidar should have been able to detect something that large.
“It seems it should have detected her,” Daniel Sperling, director of the Institute for Transportation Studies at University of California Davis told Reuters in an email after viewing the video. “It seems unlikely that a human driver would have done better. We do want AVs to do better than us and the potential exists.”
To help guard against such failures, Uber puts humans in the driver’s seat to act as backups in the event the autonomous system fails, and one was on board during this test run. However, the video footage shows her looking down just before the accident, and though it’s not certain she could have prevented the collision, the fact that she was not focused on the road makes it impossible to know.
The accident leaves automakers and tech companies that have been testing vehicles in the real world grappling with what to do next. Sam Abuelsamid, senior analyst at Navigant Research, told TheDetroitBureau.com that it’s likely other companies will rethink their plans to test vehicles in real-world situations.
He said the biggest value in those tests is gaining experience with other vehicles on the roads. However, he believes that a move back to simulated conditions may not be all bad.
“They’ve been testing in the real world for long enough to have plenty of data to recreate any conditions they need,” he said.
According to a Reuters/Ipsos opinion poll in January, two-thirds of Americans are uncomfortable about the idea of riding in self-driving cars.
“The greater risk for the industry is that if people feel it is unsafe, or the testing is unsafe, you’ll see a real backlash against this technology,” Matthew Johnson-Roberson, co-director of the University of Michigan Ford Center for Autonomous Vehicles, told Reuters.
Though Americans do seem to be accepting the idea of autonomous vehicles in growing numbers, overall they remain wary of them, and this incident is likely to stall the winning of more converts, at least for the time being. However, there is a way back.
“It would be good for manufacturers to lead the way on developing those performance standards. They should say ‘here are minimum performance requirements for these systems and we are going to voluntarily meet those standards.’ Any company that helps do that will help restore consumer confidence,” Abuelsamid said.
Waymo, the Google spinoff, has logged more than 5 million miles of testing without a fatality, and other companies have plenty of road miles without one as well. Abuelsamid believes the states will need to step into the void to facilitate change. He notes that regular drivers must undergo testing, so “why should these systems be any different?”
Abuelsamid suggests that all autonomous vehicle systems go through a variety of tests to ensure they’re operating correctly, including things as simple as recognizing traffic signs, lights and the like. He believes the simplest approach would be for a group like the Society of Automotive Engineers to develop those tests, or at least lead the effort.
“This is what they do,” he said.