The concerns U.S. regulators have about the rollout of semi-autonomous and autonomous vehicles appear to be growing as safety officials dig into the inner workings of these new technologies.
The National Highway Traffic Safety Administration recently urged General Motors to exercise caution in the implementation of its Super Cruise semi-autonomous system, specifically regarding the system's plan to slowly bring the car to a full stop if it senses the driver is sleeping, incapacitated or simply not paying attention.
NHTSA confirms the automaker’s plan to bring the car to a stop and activate the vehicle’s hazard lights if the driver is inattentive or unresponsive is compliant with federal automotive standards. However, the agency is also concerned the flashing lights could be misinterpreted by other drivers, creating a more serious safety issue.
The Super Cruise feature controls steering, braking and acceleration in certain freeway situations, allowing drivers to take their hands off the steering wheel and feet off the pedals. The system monitors facial cues to determine whether the driver is paying attention or has fallen asleep. If the driver appears inattentive, it issues visual and then audible warnings.
A letter from NHTSA to GM earlier this month related the automaker’s description of Super Cruise in this way: “If the driver is unable or unwilling to take control of the wheel (if, for example, the driver is incapacitated or unresponsive), Super Cruise may determine that the safest thing to do is to bring the vehicle slowly to a stop in or near the roadway, and the vehicle’s brakes will hold the vehicle until overridden by the driver.”
The use of the flashing hazard lights in this scenario is what caused GM to rethink its system. In fact, the company asked for an interpretation about the issue from NHTSA in March after it elected to delay its introduction of the system on the new Cadillac CT6.
NHTSA Chief Counsel Paul Hemmersbaugh said GM’s plan to use hazard lights to indicate problems after the car is stopped “is the prototypical situation in which the hazard lights are intended to be used and it is one of the situations that other motorists have come to expect when they see the hazard signal.”
He added: “Any other automatic activation of hazard warning lights would need to be evaluated on a case-by-case basis. NHTSA may also consider amending the relevant provisions of (federal motor vehicle standards) at some point in the future in order to clarify situations when hazard lights may activate automatically.”
While GM originally planned for the technology to be available late this year, that launch has been pushed back to next year. Automakers and regulators have been exercising an abundance of caution about autonomous driving systems in the wake of two fatal accidents involving Tesla’s Autopilot feature.
In fact, NHTSA sent an inquiry letter to Comma.ai, the company that developed the Comma One, an aftermarket semi-autonomous driving system set to debut late this year or in early 2017. The questions so enraged its developer, George Hotz, that he shut the company down rather than go through the routine procedure.
In an angry tweet, he said he “would much rather spend my life building amazing tech than dealing with regulators and lawyers.” Hotz noted that the letter was the first time he’d heard from NHTSA, the agency tasked with making America’s roads safe, and described it as a “threat.”
Tesla, meanwhile, has doubled down on the technology, saying that all of its new vehicles are fully capable of autonomous driving, but that the system won’t be activated until it has cleared all of the necessary regulatory hurdles.
One of those hurdles may be coming in the form of a push by Consumer Watchdog, a California-based advocacy group, to get California to expedite a rule banning “misleading advertising that leaves the dangerous – and sometimes fatal – impression that a car is more capable of driving itself than is actually the case.”
It is a direct slap at Tesla, which has endured criticism from several corners that it has overhyped the capability of its Autopilot feature, contributing to the two aforementioned fatalities. The group referred the DMV to its new video documenting how Tesla promoted its vehicles’ “Autopilot” feature, which the group says left the false impression the cars were self-driving.
In a letter to DMV Director Jean Shiomoto, Consumer Watchdog’s Privacy Project Director John Simpson wrote:
“Tesla, with its promotion of its so-called ‘Autopilot’ feature, is a prime example of the deadly consequences of such unjustified hype. Chairman Elon Musk has repeatedly extolled the Tesla’s self-driving virtues to clearly leave the impression that the vehicle is autonomous.”
The group notes that the DMV’s proposed vehicle regulations, drafted in September, would protect consumers with a section providing that “a vehicle cannot be advertised as autonomous in California unless it meets the definition of ‘autonomous’ specified in Vehicle Code 38750 and the autonomous vehicle regulations.”
However, the group believes the regular vetting process will take too long, endangering lives in the interim, so it is asking for an expedited effort.
“That is too long to wait to stop Tesla and its CEO from risking even more lives by falsely promoting Autopilot technology as self-driving. Currently there is nothing to stop the sort of hype spouted by Elon Musk with its potentially deadly consequences,” Consumer Watchdog’s letter said. “DMV should extract the advertising regulatory language from the rest of the draft autonomous vehicle regulations and start a formal rule-making to enact that section immediately.”