NTSB Investigation Into Deadly Uber Self-Driving Car Crash Reveals Lax Attitude Toward Safety

by
Mark Harris
from IEEE Spectrum

The Uber car that hit and killed Elaine Herzberg in Tempe, Ariz., in March 2018 could not recognize all pedestrians, and was being driven by an operator likely distracted by streaming video, according to documents released by the U.S. National Transportation Safety Board (NTSB) this week.

But while the technical failures and omissions in Uber's self-driving car program are shocking, the NTSB investigation also highlights safety failures that include the vehicle operator's lapses, lax corporate governance of the project, and limited public oversight.

This week, the NTSB released over 400 pages ahead of a 19 November meeting aimed at determining the official cause of the accident and reporting on its conclusions. The Board's technical review of Uber's autonomous vehicle technology reveals a cascade of poor design decisions that led to the car being unable to properly process and respond to Herzberg's presence as she crossed the roadway with her bicycle.

A radar on the modified Volvo XC90 SUV first detected Herzberg roughly six seconds before the impact, followed quickly by the car's laser-ranging lidar. However, the car's self-driving system could not classify an object as a pedestrian unless that object was near a crosswalk.

Illustration: NTSB/Uber ATG. 2017 Volvo XC90 showing the location of sensor components supporting the ATG developmental ADS.

For the next five seconds, the system alternated between classifying Herzberg as a vehicle, a bicycle, and an unknown object. Each inaccurate classification had dangerous consequences. When the car classified her as a vehicle or bicycle, it assumed she would be traveling in the same direction as the Uber vehicle but in the neighboring lane. When it classified her as an unknown object, it assumed she was static.

Worse still, each time the classification flipped, the car treated her as a brand new object. That meant it could not track her previous trajectory and calculate that a collision was likely, and thus did not even slow down. Tragically, Volvo's own City Safety automatic braking system had been disabled because its radars could have interfered with Uber's self-driving sensors.
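The tracking failure described above can be illustrated with a minimal sketch. Everything here is hypothetical (class names, positions, and logic are assumptions for illustration, not Uber's actual code), but it shows the core problem: a tracker that discards an object's history whenever its classification changes can never accumulate the positions needed to infer a crossing trajectory.

```python
# Hypothetical sketch of the failure mode the NTSB describes: a tracker
# that resets an object's history on every classification flip.

class TrackedObject:
    def __init__(self, label, position):
        self.label = label
        self.history = [position]  # past positions used to predict motion

    def update(self, label, position):
        if label != self.label:
            # Classification flip: object treated as brand new,
            # so all previously observed positions are discarded.
            self.label = label
            self.history = [position]
        else:
            self.history.append(position)

    def predicted_collision(self):
        # Needs at least two positions to infer any trajectory at all.
        return len(self.history) >= 2 and self.history[-1] != self.history[0]

# Detections alternate between classes, as in the crash:
obj = TrackedObject("vehicle", (0.0, 10.0))
for label, pos in [("bicycle", (1.0, 9.0)),
                   ("unknown", (2.0, 8.0)),
                   ("bicycle", (3.0, 7.0))]:
    obj.update(label, pos)
    # Each flip leaves only one stored position, so no collision
    # prediction is ever possible.
    print(label, len(obj.history), obj.predicted_collision())
```

Because the label changes on every update, the history never grows past a single point, and `predicted_collision()` stays false even as the object steadily crosses the vehicle's path.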

Illustration: NTSB. Aerial view of the crash location showing the path of the pedestrian as she crossed N. Mill Avenue, and the movement and speed of the vehicle at three points before impact. The pedestrian's path shows her position at the time of initial detection (5.6 seconds before impact) and at the times corresponding to the vehicle's positions.

By the time the XC90 was just a second away from Herzberg, the car finally realized that whatever was in front of it could not be avoided. At this point, it could have still slammed on the brakes to mitigate the impact. Instead, a system called "action suppression" kicked in.

This was a feature Uber engineers had implemented to avoid unnecessary extreme maneuvers in response to false alarms. It suppressed any planned braking for a full second, while simultaneously alerting and handing control back to its human safety driver. But it was too late. The driver began braking after the car had already hit Herzberg. She was thrown 23 meters (75 feet) by the impact and died of her injuries at the scene.
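A rough sketch of the "action suppression" behavior as the NTSB describes it follows. The function name, constant, and return values are illustrative assumptions, not Uber's implementation; the point is that a fixed one-second suppression window consumes all the reaction time when the hazard is only detected one second before impact.

```python
# Illustrative sketch of "action suppression" as described by the NTSB.
# All names and the exact control flow are assumptions; only the
# one-second suppression window comes from the investigation documents.

SUPPRESSION_WINDOW_S = 1.0  # planned emergency braking withheld this long

def braking_command(elapsed_since_detection_s):
    """Return the action taken, given seconds elapsed since the system
    decided a collision was unavoidable."""
    if elapsed_since_detection_s < SUPPRESSION_WINDOW_S:
        # Braking is suppressed; the human operator is alerted instead.
        return "alert_operator"
    return "emergency_brake"

# The car concluded the collision was unavoidable ~1 second before
# impact, so the window ran out exactly at the moment of impact:
print(braking_command(0.0))  # during the window: alert only
print(braking_command(1.0))  # braking first permitted at impact time
```

Under these assumptions, the first moment braking is permitted coincides with the impact itself, which matches the sequence the NTSB reconstructed: the safety driver began braking only after the car had already struck Herzberg.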

Four days after the crash, at the same time of night, Tempe police carried out a rather macabre re-enactment. While an officer dressed as Herzberg stood with a bicycle at the spot she was killed, another drove the actual crash vehicle slowly towards her. The driver was able to see the officer from at least 194 meters (638 feet) away.

The key duties of Uber's 254 human safety drivers in Tempe were to actively monitor the self-driving technology and the road ahead. Instead, recordings from cameras in the crash vehicle show that the driver spent much of the ill-fated trip looking at something placed near the vehicle's center console, and occasionally yawning or singing. The cameras show that she was looking away from the road for at least five seconds directly before the collision.

Police investigators later established that the driver had likely been streaming a television show on her personal smartphone. Prosecutors are reportedly still considering criminal charges against her.

Uber's Tempe facility, nicknamed "Ghost Town," did have strict prohibitions against using drugs, alcohol, or mobile devices while driving. The company also had a policy of randomly spot-checking logs and in-dash camera footage. However, Uber was unable to supply NTSB investigators with documents or logs showing if and when phone checks were performed. The company also admitted that it had never carried out any drug checks.

Originally, the company had required two safety drivers in its cars at all times, with operators encouraged to report colleagues who violated its safety rules. In October 2017, it switched to having just one.

The investigation also revealed that Uber didn't have a comprehensive policy on vigilance and fatigue. In fact, the NTSB found that Uber's self-driving car division "did not have a standalone operational safety division or safety manager. Additionally, [it] did not have a formal safety plan, a standardized operations procedure (SOP) or guiding document for safety."

Instead, engineers and drivers were encouraged to follow Uber's core values or norms, which include phrases such as: "We have a bias for action and accountability"; "We look for the toughest challenges, and we push"; and, "Sometimes we fail, but failure makes us smarter."

NTSB investigators found that the state of Arizona had a similarly relaxed attitude toward safety. A 2015 executive order from Governor Doug Ducey established a Self-Driving Vehicle Oversight Committee. That committee met only twice, with one of its representatives telling NTSB investigators that "the committee decided that many of the [laws enacted in other states] stifled innovation and did not substantially increase safety. Further, it felt that as long as the companies were abiding by the executive order and existing statutes, further actions were unnecessary."

When investigators inquired whether the committee, the Arizona Department of Transportation, or the Arizona Department of Public Safety had sought any information from autonomous driving companies to monitor the safety of their operations, they were told that none had been collected.

As it turns out, the fatal collision was far from the first crash involving Uber's 40 self-driving cars in Tempe. Between September 2016 and March 2018, the NTSB learned, there had been 37 other crashes and incidents involving Uber's test vehicles in autonomous mode. Most were minor rear-end fender-benders, but on one occasion, a test vehicle drove into a bicycle lane bollard. Another time, a safety driver was forced to take control of the car to avoid a head-on collision. The result: the car struck a parked vehicle.
