Uber Self-Driving Crash

post by jefftk (jkaufman) · 2019-11-07T15:00:01.625Z

Content warning: discussion of death

A year and a half ago an Uber self-driving car hit and killed Elaine Herzberg. I wrote at the time:

The dashcam video from the Uber crash has been released. It's really bad. The pedestrian is slowly walking their bike left to right across a two-lane street with streetlights, and manages to get to the right side of the right lane before being hit. The car doesn't slow down at all. A human driver would have vision with more dynamic range than this camera, and it looks to me like they would have seen the pedestrian about 2s out, which is time to slow down dramatically even if not stop entirely. But that doesn't matter here, because this car has LIDAR, which generates its own light. I'm expecting that when the full sensor data is released it will be very clear that the system had all the information it needed to stop in time.

This is the sort of situation where LIDAR should shine, equivalent to a driver on an open road in broad daylight. That the car took no action here means things are very wrong with their system. If it were a company I trusted more than Uber, I would say "at least two things going wrong, like not being able to identify a person pushing a bike and then not being cautious enough about unknown input", but with Uber I think they may just be aggressively pushing out immature tech.
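
A rough back-of-the-envelope check on that "2s out" estimate, in Python. All of the numbers here (roughly 40 mph, a 0.75s reaction time, hard braking around 7 m/s²) are my own assumptions for illustration, not figures from the NTSB report:

```python
# Back-of-the-envelope: if a driver sees the pedestrian ~2 s before impact,
# how much speed can they shed? All numbers are illustrative assumptions,
# not figures from the NTSB report.

MPH_TO_MS = 0.44704

speed = 40.0 * MPH_TO_MS      # assumed travel speed, ~17.9 m/s
time_to_impact = 2.0          # seconds of visibility, per the estimate above
reaction_time = 0.75          # assumed perception-reaction time
decel = 7.0                   # assumed hard braking on dry pavement, m/s^2

braking_time = max(0.0, time_to_impact - reaction_time)
impact_speed = max(0.0, speed - decel * braking_time)

print(f"speed at impact: {impact_speed:.1f} m/s "
      f"({impact_speed / MPH_TO_MS:.0f} mph)")
# ~9 m/s (~20 mph): still a collision, but at roughly half the original
# speed; "slow down dramatically even if not stop entirely."
```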

On Tuesday the NTSB released their report (pdf), and it's clear that the system could easily have avoided this accident if it had been better designed. Major issues include:

- Because she was crossing outside a crosswalk, the system never classified Herzberg as a pedestrian at all; it cycled between "vehicle", "bicycle", and "other".
- Every time the classification changed, the object's tracking history was discarded, so the system never used her past motion to predict that she was moving into the car's lane (a rough sketch of this failure mode follows below).
- When the system finally concluded, about 1.2 seconds before impact, that a collision was imminent, it suppressed action for a full second to avoid false-alarm emergency maneuvers.
- Even then it was not designed to brake hard on its own; it relied on the safety driver to intervene, and the Volvo's factory automatic emergency braking was disabled whenever the self-driving system was engaged.
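
The tracking-history issue is worth spelling out, since it explains how the car could drive straight into someone it had been detecting for several seconds. Here is a minimal sketch of that failure mode, with hypothetical data structures of my own, not anything from the report or Uber's code:

```python
# Illustrative sketch only: hypothetical structures, not Uber's actual code.
# If the track is reset whenever the classifier changes its mind, the velocity
# estimate never accumulates enough history to predict a crossing path.

from dataclasses import dataclass, field


@dataclass
class Track:
    positions: list = field(default_factory=list)  # (t, x, y) observations

    def velocity(self):
        """Estimate velocity from the first and last observations."""
        if len(self.positions) < 2:
            return None  # not enough history to predict a path
        (t0, x0, y0), (t1, x1, y1) = self.positions[0], self.positions[-1]
        dt = t1 - t0
        return ((x1 - x0) / dt, (y1 - y0) / dt)


def update(track, label, prev_label, obs, reset_on_reclassify):
    # The flawed behavior: a changed classification throws away the history,
    # so the object is treated as if it had just appeared.
    if reset_on_reclassify and label != prev_label:
        track = Track()
    track.positions.append(obs)
    return track


# An object drifting toward the lane, reclassified on almost every frame.
labels = ["vehicle", "other", "bicycle", "other", "bicycle"]
observations = [(t, 10.0 - t, 0.0) for t in range(5)]  # x decreasing: crossing toward the car

flawed, fixed = Track(), Track()
for i, (label, obs) in enumerate(zip(labels, observations)):
    prev = labels[i - 1] if i else label
    flawed = update(flawed, label, prev, obs, reset_on_reclassify=True)
    fixed = update(fixed, label, prev, obs, reset_on_reclassify=False)

print("history reset on reclassification:", flawed.velocity())  # None
print("history kept:                     ", fixed.velocity())   # (-1.0, 0.0)
```

With the reset behavior, the object's estimated velocity is always unknown, so no trajectory can be predicted; keeping the history makes the leftward drift toward the lane obvious.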

This is incredibly bad, applying "quick, get it working even if it's kind of a hack" programming in a field where failure has real consequences. Self-driving cars have the potential to prevent hundreds of thousands of deaths a year, but this sort of reckless approach does not help.

(Disclosure: I work at Google, which is owned by Alphabet, which owns Waymo, which also operates driverless cars. I'm speaking only for myself, and don't know anything more than the general public does about Waymo.)

Comment via: facebook

Comments

comment by drethelin · 2019-12-03T00:52:09.603Z

It is especially the case that the car should prioritize braking under uncertainty when it's on a lonely road with no one behind.
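
A toy version of the policy this comment is gesturing at, with every threshold and name a hypothetical placeholder of mine:

```python
# Hypothetical decision rule, not any real system's logic: when the classifier
# is unsure what an object is, default to braking, and brake harder when the
# cost of braking is low (no one following closely behind).

def braking_command(object_confidence, time_to_collision_s, follower_gap_m):
    """Return a brake level in [0, 1] (0 = no braking, 1 = full braking)."""
    if object_confidence >= 0.8 or time_to_collision_s > 4.0:
        return 0.0  # confident classification, or no imminent conflict
    if follower_gap_m is None or follower_gap_m > 50.0:
        return 1.0  # uncertain object ahead, empty road behind: just brake
    return 0.5      # someone close behind: brake, but less abruptly


# Uncertain object two seconds out, nobody behind: brake hard.
print(braking_command(object_confidence=0.4,
                      time_to_collision_s=2.0,
                      follower_gap_m=None))  # 1.0
```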