20180324 update: For now, I’ve found these two posts by Brad Templeton to be very insightful; they cover some of the issues I wanted to write about, but in much more detail! Have a read: 03/20 “New facts and questions on Uber robocar fatality” & 03/21 “It certainly looks bad for Uber”. I may still add more as more facts of the case emerge, especially when Uber starts to voluntarily provide (or is compelled to provide) more of its internal technical data. I hope Uber won’t try to sweep this fatality under the carpet. We’ll see.
I just read some news reports and watched the video of the Uber self-driving SUV fatal accident. (WARNING: The video contains disturbing images. Viewer discretion is advised.) I know I do not have full information yet, so I will share my views (for now, semi-technical and semi-informed) on this Uber self-driving fatal accident as best I can. And in the coming days, when I have time, I hope to keep updating this post as more technical and police investigative information becomes available.
A bit of background first. In February 2013 (more than five years ago now), I was already interested in driverless technologies and interviewed U of T Professor Emeritus C.C. Kelly Gotlieb, the “Father of Computing in Canada”, about many topics, including the Google driverless car and questions like who’s to blame when an accident happens. Sadly, we now have a fatal accident on hand to talk about.
From the AP report “Experts: Uber self-driving system should have spotted woman”, this Uber self-driving SUV uses LIDAR laser sensor technology to “see”. (Note: LIDAR stands for Light Detection and Ranging; it “measures distance to a target by illuminating the target with pulsed laser light”, so it can see perfectly well even in total darkness because it relies on laser light rather than ambient light.) I make this observation about LIDAR in direct response to this sentence of the news report: “The lights on the SUV didn’t illuminate 49-year-old Elaine Herzberg on Sunday night until a second or two before impact, raising questions about whether the vehicle could have stopped in time.” And in response to the fact that the Uber safety driver was NOT paying attention to the road when the vehicle struck and killed 49-year-old Elaine Herzberg!
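To make the time-of-flight idea behind LIDAR concrete, here is a minimal sketch of how a pulsed-laser ranging measurement works in principle (my own illustration of the general technique, not anything specific to Uber’s system; the function name and numbers are made up for the example):

```python
# Time-of-flight ranging: a laser pulse travels out to the target and back,
# so the distance is half the round-trip time multiplied by the speed of light.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def lidar_range(round_trip_seconds: float) -> float:
    """Distance (in metres) to a target from a pulsed-laser round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after roughly 0.4 microseconds implies a target
# about 60 metres away -- well within the range experts say should
# have been enough to detect a pedestrian, darkness or not.
print(round(lidar_range(0.4e-6), 1))  # -> 60.0
```

The key point for this accident: because the measurement depends only on the laser pulse itself, darkness does not degrade it the way it degrades a camera or the human eye.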
Let me quote from the AP report “Experts: Uber self-driving system should have spotted woman”,
““The victim did not come out of nowhere. She’s moving on a dark road, but it’s an open road, so Lidar (laser) and radar should have detected and classified her” as a human, said Bryant Walker Smith, a University of South Carolina law professor who studies autonomous vehicles.
Smith said the video may not show the complete picture, but “this is strongly suggestive of multiple failures of Uber and its system, its automated system, and its safety driver.”
Sam Abuelsamid, an analyst for Navigant Research who also follows autonomous vehicles, said laser and radar systems can see in the dark much better than humans or cameras and that Herzberg was well within the range.
“It absolutely should have been able to pick her up,” he said. “From what I see in the video it sure looks like the car is at fault, not the pedestrian.”
Smith said that from what he observed on the video, the Uber driver appears to be relying too much on the self-driving system by not looking up at the road.
“The safety driver is clearly relying on the fact that the car is driving itself. It’s the old adage that if everyone is responsible no one is responsible,” Smith said. “This is everything gone wrong that these systems, if responsibly implemented, are supposed to prevent.”
The experts were unsure if the test vehicle was equipped with a video monitor that the backup driver may have been viewing.
Uber immediately suspended all road-testing of such autos in the Phoenix area, Pittsburgh, San Francisco and Toronto. The National Transportation Safety Board, which makes recommendations for preventing crashes, is investigating the crash.”
I will try to come back to this article and add more details and updates in the coming days when I have more time. We’ll see.
For now, here is the particular segment of my five-year-old 2013 interview with Prof. Gotlieb, in which we discussed: “Google [and by extension, any other company’s] driverless car gets into an accident, who’s to blame? And who can you sue? The person who wrote the program? Google, who authorized the car? The car manufacturer? The person who is in the car? Or all of the above? […] Lots of questions to be asked when failures happen.”