Report: Self-driving Uber involved in fatal crash sensed pedestrian but did not react

In March there was a fatal crash involving a self-driving Uber vehicle being tested in Tempe, Arizona. The car was traveling at night and drove directly into a woman walking her bike across the road. Video of the accident showed a dark road and a human backup driver who seemed distracted by something else in the car.

A few days after the accident, USA Today published a story quoting Raj Rajkumar, an expert on self-driving cars at Carnegie Mellon University. “Clearly there’s a problem, because the radar also should have picked her up throughout,” Rajkumar said. He speculated that there might have been a problem with the hardware or the software.

Today, a site called The Information published a report stating that the car did detect the woman crossing the road, but that its software was tuned in such a way that it did not react to her, incorrectly classifying her as a false positive. From Gizmodo:

According to two sources “briefed about the matter,” The Information reports that Uber’s internal investigation found that the cameras, Lidar, and radar on the self-driving test vehicle all did their jobs correctly. Unfortunately, the system that determines which objects around the car can be safely ignored had reportedly been tuned in a way that caused it to ignore a passing pedestrian.

Why would Uber tune the software to ignore objects moving across the road? According to The Information, the answer is pressure from the competition:

There’s a reason Uber would tune its system to be less cautious about objects around the car: It is trying to develop a self-driving car that is comfortable to ride in. By contrast, people who have recently ridden in Waymo and General Motors’ Cruise vehicles say the ride can be jerky, with sudden pumping of the brakes even though there’s no threat in sight. That’s often the result of vehicles reacting to false positives, when the sensor systems think they see moving objects that aren’t really there, or to objects that probably aren’t going to be a problem but are treated as such. Uber’s perspective has been that a self-driving car prototype that constantly slams the brakes and comes to hard stops is also dangerous.
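To make that tradeoff concrete, here is a minimal, hypothetical Python sketch of the kind of filtering step the report describes. None of this reflects Uber’s actual software; the Detection structure, the should_brake function, and the 0.8 threshold are all invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One object reported by the perception stack (hypothetical)."""
    label: str         # e.g. "pedestrian", "vehicle", "unknown"
    confidence: float  # 0.0-1.0: how sure the system is the object is real

# Raising this threshold suppresses more false positives (less phantom
# braking) but also raises the odds of discarding a real obstacle.
# The value here is invented purely for illustration.
IGNORE_BELOW_CONFIDENCE = 0.8

def should_brake(detections: list[Detection]) -> bool:
    """Brake only for detections the filter treats as real threats."""
    return any(d.confidence >= IGNORE_BELOW_CONFIDENCE for d in detections)

# A real pedestrian the classifier is unsure about gets filtered out
# exactly the way sensor noise would be:
print(should_brake([Detection("pedestrian", 0.6)]))  # False: no reaction
print(should_brake([Detection("vehicle", 0.95)]))    # True: brakes
```

The point is that a single sensitivity setting couples the two failure modes: whatever ride smoothness is gained by ignoring low-confidence detections is paid for in real detections that risk being ignored too.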

Uber hasn’t confirmed any of this. In a statement, the company told Gizmodo it is cooperating with the National Transportation Safety Board (NTSB) and has hired a former NTSB chairman to review the safety of its self-driving vehicle program. Reuters reports the NTSB is expected to release its preliminary report on the crash in the next few weeks.

Even if the software in the car was tuned improperly, there’s still the question of why the backup driver failed to react. Video showed the driver was distracted immediately before the crash, looking down at something inside the car.

Also, as I noted here, several people took video of the same roadway in Tempe at night, and their footage seems much brighter and better lit than the accident video released by Uber. In the Uber video, the pedestrian appears in the headlights just a second or so before the crash, leaving very little time for a human driver to react.

But in other videos of the same street uploaded to YouTube, it seems a driver who was paying attention to the road would have been able to see the pedestrian much sooner.

I’d still like to know why that Uber dashcam video appears so much darker.
