Following the deadly crash in Arizona, Uber has taken its first major step toward resuming self-driving car testing by publishing a lengthy report on the company’s safety efforts.
On Friday, Uber submitted a 70-page safety report to the US National Highway Traffic Safety Administration, outlining the next steps for its self-driving car program in Pennsylvania. The report emphasized safety measures that Uber will take to prevent crashes and casualties.
Some of the safety measures detailed in the report include having two backup drivers sit in the front seat to take control of the autonomous vehicle in case it runs into problems on the street. Uber is calling these backup drivers “mission specialists.” The company will also conduct rigorous screening of applicants and will remotely monitor the backup drivers.
CEO Dara Khosrowshahi said in the report:
“We are deeply regretful for the crash in Tempe, Arizona, this March. In the hours following, we grounded our self-driving fleets in every city they were operating.”
“In the months since, we have undertaken a top-to-bottom review of ATG’s safety approaches, system development, and culture,” he added.
Just last week, Uber disclosed that there were several issues with its technology, safety systems, and employee training. Missy Cummings, director of the Humans and Autonomy Laboratory at Duke University, said that there is a large gap between Uber’s good intentions and the quantitative data to back up the company’s safety claims.
Cummings said, “The report was basically content-free in providing any direct evidence.”
“It’s sufficiently vague to keep them from revealing any true weaknesses. They really don’t want any of their competitors to know where they are.”
Uber has come under severe scrutiny since one of its autonomous cars struck and killed a pedestrian in Arizona back in March. Safety has since become an issue of particular concern for Uber’s autonomous car program.
“A person died because Uber was testing its vehicles under circumstances that appear to be irresponsible at best,” said Bryant Walker Smith, assistant professor at the University of South Carolina’s law school. “This [report] is a good start. But Uber in particular should go further.”
Cummings believes that Uber should disclose data showing that the system can perform specific tasks, like reacting in time when presented with an obstacle, or that the cars are able to identify objects in “difficult and ambiguous situations.”
“That wasn’t [Uber’s] problem with the woman pushing a bike across a road — she wasn’t erratic, she was very predictable, she was very slow moving,” Cummings said.
Noah Zych, head of system safety at Uber, said that before the fatal crash in Arizona, Uber put too much emphasis on the role of backup safety drivers in guiding the vehicle when it faced obstacles during road tests. The company is therefore planning to conduct more tests on closed tracks before returning the cars to public roads.
Meanwhile, Sanjay Baruah, an engineering professor at Washington University, believes that the growing number of autonomous cars could lead to more systematic safety regulations. This could force Uber to meet necessary safety standards before taking its autonomous cars out on the streets.
“There is a strong likelihood that if testing on public infrastructure continues and other bad things happen there will be a strong consensus for developing some standardization or documentation for safe practices in the industry,” he said.
Uber has recently filed a request to test its autonomous vehicles in Pennsylvania; however, the application has not yet been approved.
“It is our responsibility to ensure that we are developing and deploying this technology in a manner that does not introduce undue risk to the public. We must be confident our self-driving system is capable of operating safely on public roads long before it ever gets there,” Uber wrote in its safety report.