Wednesday I wrote about a fatal crash in Tempe, Arizona involving an autonomous Uber vehicle. At the time, authorities were saying the company didn’t appear to be at fault and that even a human-driven car might have hit the victim under the same circumstances. But since then there have been a couple of follow-up stories suggesting the car and the driver should have been able to see more than they did. Today, USA Today published a story quoting a self-driving car expert who says the Uber’s LIDAR should have picked up the pedestrian in the road:
“The car’s LiDAR (light detection and ranging laser system) should have picked the pedestrian up far before it hit her,” says Raj Rajkumar, who leads the autonomous vehicle research team at Carnegie Mellon University.
“Clearly there’s a problem, because the radar also should have picked her up throughout, she was moving,” he says. “Maybe it’s the sensors not working correctly or the hardware that processes it, or the software.”…
Self-driving cars detect their surroundings with cameras, radar and LiDAR. Rajkumar says that while the car’s cameras appear to be of little value in the dark, its radar and LiDAR did not seem to behave as designed.
Uber declined to comment on Rajkumar’s observations and referred questions to the National Transportation Safety Board, which is investigating the incident along with Tempe, Ariz. police.
The NY Times also published a piece today arguing that Uber’s autonomous cars are not performing nearly as well as those of competitors like Google.
Waymo, formerly the self-driving car project of Google, said that in tests on roads in California last year, its cars went an average of nearly 5,600 miles before the driver had to take control from the computer to steer out of trouble. As of March, Uber was struggling to meet its target of 13 miles per “intervention” in Arizona, according to 100 pages of company documents obtained by The New York Times and two people familiar with the company’s operations in the Phoenix area but not permitted to speak publicly about it.
So there might have been something wrong with the car’s sensors, and Uber’s autonomous fleet doesn’t seem to be as advanced as its competitors’. But what about the human driver?
In my previous post, I said I wasn’t sure I could have stopped because the road was so dark. From the time I first saw the victim’s shoes in the road, there was just over a second before the car struck her. That’s not a lot of time to stop a car going 40 mph. Today, Ars Technica points to several videos uploaded by people driving the same street in Tempe where the accident took place, and those videos don’t seem nearly as dark as the video I posted two days ago:
In the video, Herzberg’s feet become visible only about 1.4 seconds before the final frame of the video. Prior to that point, she appears shrouded in shadow.
But then people in the Tempe area started making their own videos—videos that give a dramatically different impression of that section of roadway.
In this clip, the scene of the accident comes up at about 33 seconds.
For comparison, here’s a screenshot from the Uber dash video I posted Wednesday. Notice you can just barely see the victim’s feet to the right of the dashed line:
And here’s a screencap from the video above at roughly the same point along the road:
It’s quite a dramatic difference. There’s another example posted at Ars Technica. That one isn’t as bright as the one above, but it’s still substantially brighter than the Uber video. So I’m going to revise my previous assessment and say that, based on the two videos posted at Ars Technica, I believe the human ‘backup driver’ could have seen the woman crossing the street sooner if she’d been paying attention. But as I noted Wednesday, the driver was looking down at some sort of device for about 5 seconds immediately before the crash.
Before authorities let Uber or the backup driver off the hook, they need to run their own video test and, if the Uber video looks significantly darker, find out why. They also need to determine whether the LIDAR was working and, if it was, why it didn’t detect the pedestrian in the road.