Self-driving Uber car kills pedestrian

Daisuke Wakabayashi, New York Times:

> Arizona officials saw opportunity when Uber and other companies began testing driverless cars a few years ago. Promising to keep oversight light, they invited the companies to test their robotic vehicles on the state’s roads.
>
> Then on Sunday night, an autonomous car operated by Uber — and with an emergency backup driver behind the wheel — struck and killed a woman on a street in Tempe, Ariz. It was believed to be the first pedestrian death associated with self-driving technology. The company quickly suspended testing in Tempe as well as in Pittsburgh, San Francisco and Toronto.

Ugh.

> A preliminary investigation showed that the vehicle was moving around 40 miles per hour when it struck Ms. Herzberg, who was walking with her bicycle on the street. He said it did not appear as though the car had slowed down before impact and that the Uber safety driver had shown no signs of impairment. The weather was clear and dry.

How was this vehicle allowed on the road? Something as big as a person with a bicycle, moving slowly (the woman was walking her bike), and the autonomous sensor system did not see it? Tragic. And more so because it never should have been allowed to happen.

UPDATE: Hat tip to Loop regular Drew Leavitt for this San Francisco Chronicle article, titled “Exclusive: Tempe police chief says early probe shows no fault by Uber.” From the article:

> Pushing a bicycle laden with plastic shopping bags, a woman abruptly walked from a center median into a lane of traffic and was struck by a self-driving Uber operating in autonomous mode.
>
> “The driver said it was like a flash, the person walked out in front of them,” said Sylvia Moir, police chief in Tempe, Ariz., the location for the first pedestrian fatality involving a self-driving car. “His first alert to the collision was the sound of the collision.”

Sounds like no driver could have avoided this accident. But:

> Traveling at 38 mph in a 35 mph zone on Sunday night, the Uber self-driving car made no attempt to brake, according to the Police Department’s preliminary investigation.

The car was exceeding the speed limit (barely, true, but the autonomous programming allows this?) and it made no attempt to brake, which tells me the system did not detect the pedestrian. Seems to me an autonomous driving system should be able to react in milliseconds. Was this a blind spot in the system?
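
To put the reaction-time point in perspective, here's a minimal back-of-envelope sketch in Python. The numbers are my assumptions, not from either article: roughly 1.5 seconds of reaction time for an attentive human versus about 0.1 seconds for a computer, and a hard-braking deceleration of about 7 m/s² on dry pavement.

```python
# Back-of-envelope stopping distance at 38 mph: distance covered during the
# reaction time, plus braking distance (v^2 / 2a). All figures are assumptions
# for illustration, not data from the investigation.

MPH_TO_MPS = 0.44704  # miles per hour -> meters per second

def stopping_distance_m(speed_mph: float, reaction_s: float,
                        decel_mps2: float = 7.0) -> float:
    """Reaction distance plus braking distance, in meters."""
    v = speed_mph * MPH_TO_MPS
    return v * reaction_s + v * v / (2.0 * decel_mps2)

for label, reaction_s in [("attentive human, ~1.5 s", 1.5),
                          ("computer, ~0.1 s", 0.1)]:
    print(f"{label}: {stopping_distance_m(38.0, reaction_s):.1f} m to stop")
```

Under those assumptions, machine-speed reaction cuts the total stopping distance from roughly 46 meters to roughly 22 meters. A pedestrian stepping out inside that shorter distance still can't be avoided, but the gap shows why "made no attempt to brake" is the troubling detail here: even braking that starts too late to prevent impact reduces the speed of the collision.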

All this said, I do get that no system is perfect, and that an autonomous system has the chance to be much, much better than the human driver it replaces. But when an accident like this happens, it makes me feel like a flaw has been exposed, and that an opportunity to improve is ours for the taking.