An investigation is underway into the Uber self-driving car accident that killed a pedestrian in Tempe, Arizona. Tempe police said the car was in autonomous mode at the time of the crash, was traveling about 40 MPH, and showed no signs of slowing down.
Update on Tuesday, March 20: Tempe police chief Sylvia Moir, according to the San Francisco Chronicle, said the accident might have been unavoidable. [From viewing the videos,] “it’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway,” Moir said.
Uber certainly has a recording of the accident, so the complete story will eventually come out. Uber’s self-driving cars, like many others, are designed with redundancy in mind: if one sensor fails or misses something, another is supposed to cover for it.
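To make the redundancy idea concrete, here’s a minimal sketch in Python (illustrative only, with made-up sensor checks; this is not Uber’s actual architecture). It assumes an “any sensor fires” policy, where a single working sensor is enough to trigger braking:

    from typing import Callable

    def should_brake(sensor_checks: list[Callable[[], bool]]) -> bool:
        """Return True if any working sensor reports an obstacle."""
        for check in sensor_checks:
            try:
                if check():          # this sensor sees an obstacle
                    return True
            except RuntimeError:     # sensor failed; fall back to the next one
                continue
        return False

    # Hypothetical readings: LIDAR sees the pedestrian, the others miss her
    lidar = lambda: True
    radar = lambda: False
    camera = lambda: False

    print(should_brake([lidar, radar, camera]))  # -> True

Under a design like this, every sensor (or the logic that fuses them) would have to fail before the car missed an obstacle entirely.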
Clearly that didn’t happen here. Uber’s self-driving cars, as the sensor breakdown below shows, carry multiple sensors that can detect pedestrians at night. But, again, police said the car showed no signs of slowing down. So it appears Uber’s system never detected 49-year-old Elaine Herzberg, who was walking her bicycle across the street, outside the crosswalk, when she was struck.
There was a human safety driver in the car to take over the wheel if necessary. But that didn’t happen, either. Rafaela Vasquez, 44, showed no signs of being impaired, police said.
So was it a complete system failure? Did multiple redundancies fail? Was it a problem with the software’s safety parameters? Only time will tell, of course. Self-driving car expert Brad Templeton speculates that this may have been an edge case Uber was not prepared for. Templeton writes that “a person walking a bike across a non-crosswalk is an unusual thing compared to what you normally see. As such, Uber’s perception system may not be as capable of identifying and modeling that. It is something which may not have been tested as much in simulator.”
He continues, “Note that with the most advanced teams, they are doing very extensive testing of all the variations they can think of for situations like this in simulator. So this should be something the systems have seen before, at least virtually.”
Engineers can’t account for every possible edge case, but that’s where machine learning should enter the picture: a self-driving car should learn how to react in unforeseen situations by analyzing vast amounts of data. Regardless of how this accident happened, it will ignite controversy around self-driving car regulations, especially around testing without human safety drivers.
Sensors in Uber self-driving cars
Here’s a look at the sensor setup used in Uber’s self-driving Volvo XC90 SUV.
LIDAR: A light detection and ranging (LIDAR) system from Velodyne is mounted on top of Uber’s self-driving cars. It produces a 360-degree, 3D image of the car’s surroundings multiple times per second by firing laser pulses at an extremely high rate and measuring how long each pulse takes to bounce back from a surface. LIDAR is quite adept at detecting both static and moving objects, day or night, but it has limitations in adverse weather such as fog, rain, and snow.
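The ranging math itself is simple. Here’s the back-of-the-envelope calculation in Python (a sketch of the principle, not Velodyne’s firmware): the pulse travels to the object and back, so the distance is half the round-trip time multiplied by the speed of light:

    SPEED_OF_LIGHT = 299_792_458  # meters per second

    def lidar_distance(round_trip_seconds: float) -> float:
        """Distance to the reflecting surface, in meters."""
        return SPEED_OF_LIGHT * round_trip_seconds / 2

    # A pulse that comes back after 200 nanoseconds hit something ~30 m away
    print(f"{lidar_distance(200e-9):.1f} m")  # -> 30.0 m

Repeating this measurement hundreds of thousands of times per second across a spinning array of lasers is what builds the 360-degree point cloud.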
RADAR: Uber uses radio detection and ranging (RADAR) for another 360-degree view of its surroundings. RADAR sends out radio waves at a specific frequency and listens for the reflections: the delay tells it where cars and obstacles are positioned, and the frequency shift of the returning wave (the Doppler effect) tells it how fast they are moving. RADAR isn’t degraded by bad weather the way LIDAR is, but it doesn’t capture the size and shape of objects as accurately.
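To illustrate the speed measurement, here’s the standard Doppler relationship in Python (a sketch with invented numbers, not Uber’s code): for a radar at carrier frequency f0, a target approaching at speed v shifts the return frequency by roughly 2 * v * f0 / c, so the shift can be inverted to recover the speed:

    SPEED_OF_LIGHT = 299_792_458  # meters per second

    def doppler_speed(freq_shift_hz: float, carrier_hz: float) -> float:
        """Relative target speed in m/s (positive = approaching)."""
        return freq_shift_hz * SPEED_OF_LIGHT / (2 * carrier_hz)

    # A 77 GHz radar observing a ~4.6 kHz shift: target closing at ~9 m/s
    print(f"{doppler_speed(4_620, 77e9):.1f} m/s")  # -> 9.0 m/s

The 77 GHz band is standard for automotive radar, though the example numbers here are made up.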
Cameras: According to Uber, its self-driving cars use short- and long-range optical cameras. Front-facing cameras cover both the near and far field, watching for braking vehicles, crossing pedestrians, traffic lights, and signage. Side- and rear-facing cameras work together to build a continuous view of the vehicle’s surroundings.
Antennae: Roof-mounted antennae provide GPS positioning and wireless data connectivity. Uber pre-maps its routes to build a high-resolution 3D map of each area; the cars then compare what their sensors see in real time against that map to pinpoint where they are.
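As a toy example of that map comparison (illustrative only; real localization uses far denser data and probabilistic filtering, and these landmark names are hypothetical), imagine matching a couple of sensed landmarks against their mapped positions and using the average displacement as a position correction:

    # Landmark positions from the pre-built 3D map (x, y in meters)
    mapped = {"pole_17": (12.0, 4.0), "sign_03": (20.0, -3.0)}

    # Where the same landmarks appear relative to where the car THINKS it is
    sensed = {"pole_17": (11.4, 4.2), "sign_03": (19.4, -2.8)}

    # Average offset between mapped and sensed positions = position correction
    dx = sum(mapped[k][0] - sensed[k][0] for k in mapped) / len(mapped)
    dy = sum(mapped[k][1] - sensed[k][1] for k in mapped) / len(mapped)
    print(f"shift position estimate by ({dx:+.1f}, {dy:+.1f}) m")  # -> (+0.6, -0.2) m

This kind of map matching is why a pre-built map lets the car localize itself far more precisely than GPS alone.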
Human Safety Driver: Human safety drivers are the last line of defense in the event a self-driving system fails. Unfortunately, the human failed in this case, too. Self-driving car tests have been underway for a while now, but some states, including Arizona and California, are starting to allow companies to test self-driving cars without a person behind the steering wheel.