An autonomous car being tested by Uber hit and killed a pedestrian Sunday in Tempe, Arizona. This is believed to be the first time a fully autonomous vehicle has killed someone, though not the first time one has been involved in an accident. Uber responded by suspending all of its autonomous car testing in the U.S. and Canada. From the Washington Post:

The vehicle was in autonomous mode at the time of the crash, though a driver was behind the wheel, Tempe police said in a statement. The crash occurred about 10 p.m. Sunday in the area of Curry Road and Mill Avenue, a busy intersection with multiple lanes in every direction.

Police said the vehicle was northbound on Curry Road when a woman, identified as 49-year-old Elaine Herzberg, crossing from the west side of the street, was struck. She died at a hospital, the department said.

Missy Cummings, a robotics expert at Duke University who has been critical of the swift rollout of driverless technology across the country, said the computer-vision systems for self-driving cars are “deeply flawed” and can be “incredibly brittle,” particularly in unfamiliar circumstances…

Tempe police said Herzberg was “walking outside of the crosswalk” when she was struck.

“Just because you map an area doesn’t mean your computer system is necessarily going to pick up a pedestrian, particularly one that wasn’t in a cross walk,” Cummings said.

An Uber self-driving car was involved in a crash in Tempe one year ago. At the time, police determined the Uber car had the right of way and the human driver was at fault for failing to yield. Nevertheless, Uber suspended its testing program in the wake of the crash.

That suspension appears to have lasted only one weekend before Uber resumed testing. Other autonomous vehicles have also been involved in crashes; two years ago, a Google autonomous car hit a bus in the Bay Area.

And in May of 2016, the driver of a Tesla Model S was killed while the car was in its semi-autonomous Autopilot mode. A review by the National Highway Traffic Safety Administration later found the car’s systems were not responsible:

The investigation was set off by the accident that killed Joshua Brown, a 40-year-old from Ohio. His 2015 Tesla Model S was operating under its Autopilot system on a state highway in Florida when it crashed into a tractor-trailer that was crossing the road in front of his car.

Tesla has said its camera failed to recognize the white truck against a bright sky. But the agency essentially found that Mr. Brown was not paying attention to the road. It determined he set his car’s cruise control at 74 miles per hour about two minutes before the crash, and should have had at least seven seconds to notice the truck before crashing into it.

Neither Autopilot nor Mr. Brown hit the brakes. The agency said that although Autopilot did not prevent the accident, the system performed as it was designed and intended, and therefore did not have a defect.

There will be an investigation of this accident as well, but my first thought is to wonder why the human ‘backup driver’ didn’t stop the car and prevent this. Reliable self-driving cars and trucks may still be a couple of years away, but it’s worth pointing out that human drivers are responsible for tens of thousands of fatal accidents on U.S. roads every year; in 2016 alone, there were an estimated 40,200 traffic deaths. Ultimately, the question is whether the record of driverless cars turns out to be better or worse than that of the humans who would otherwise be at the wheel.