How Uber Self-Driving Cars See the World

By Steve Crowe | March 19, 2018


An investigation is underway into the Uber self-driving car crash that killed a pedestrian in Tempe, Arizona. Tempe police said the car was in autonomous mode at the time of the crash, was traveling about 40 mph, and showed no signs of slowing down.

Update on Tuesday, March 20: Tempe police chief Sylvia Moir, according to the San Francisco Chronicle, said the accident might have been unavoidable. “[From viewing the videos,] it’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway,” Moir said.

Uber certainly has a recording of the accident, so the complete story will eventually come out. Uber’s self-driving cars, like many others, are designed with redundancy in mind. This means if one sensor fails, another kicks in to do the job.
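
As a rough illustration of that redundancy principle, the sketch below shows how obstacle reports from independent sensors could be fused so that any one healthy sensor is enough to trigger braking. This is a minimal, hypothetical example, not Uber’s actual software; the sensor names and the 30-meter braking threshold are made up for illustration.

```python
# Minimal sketch of redundant obstacle detection (hypothetical, not Uber's pipeline).
# Each sensor independently reports the range to the nearest obstacle in meters,
# or None if the sensor has failed or sees nothing.
from typing import Dict, Optional

def fuse_obstacle_reports(reports: Dict[str, Optional[float]],
                          brake_range_m: float = 30.0) -> bool:
    """Return True if any working sensor reports an obstacle inside braking range."""
    for sensor, range_m in reports.items():
        if range_m is None:           # this sensor failed or returned nothing
            continue
        if range_m <= brake_range_m:  # another sensor "kicks in" to do the job
            return True
    return False

# Example: the camera misses a pedestrian at night, but LIDAR and RADAR still see her.
reports = {"camera": None, "lidar": 25.4, "radar": 26.1}
print(fuse_obstacle_reports(reports))  # True -> the car should begin braking
```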

Clearly that didn’t happen here. Uber’s self-driving cars, as you can see in the graphic below, have multiple sensors that can detect pedestrians at night. But, again, police said the car showed no signs of slowing down. So it appears Uber’s system never detected 49-year-old Elaine Herzberg, who was walking her bicycle across the street, outside of a crosswalk, when she was struck.

There was a human safety driver in the car to take over the wheel if necessary, but that didn’t happen, either. The safety driver, 44-year-old Rafaela Vasquez, showed no signs of being impaired, police said.

So was it a complete system failure? Did multiple redundancies fail? Was it a problem with the software’s safety parameters? Only time will tell. Self-driving car expert Brad Templeton speculates that this may have been an edge case Uber was not prepared for. Templeton writes that “a person walking a bike across a non-crosswalk is an unusual thing compared to what you normally see. As such, Uber’s perception system may not be as capable of identifying and modeling that. It is something which may not have been tested as much in simulator.”

He continues, “Note that with the most advanced teams, they are doing very extensive testing of all the variations they can think of for situations like this in simulator. So this should be something the systems have seen before, at least virtually.”

Engineers can’t account for every possible edge case, but that’s where machine learning should enter the picture. Uber’s self-driving cars should learn how to react in unforeseen situations by analyzing vast amounts of data. Regardless of how this accident happened, it will ignite controversy around self-driving car regulations, especially around tests conducted without human safety drivers.

Sensors in Uber self-driving cars

Here’s a look at the sensor setup used in Uber’s self-driving Volvo XC90 SUV.

Sensor suite in Uber’s self-driving Volvo XC90, which was involved in a fatal accident. (Credit: Uber ATG)

LIDAR: A light detection and ranging (LIDAR) system from Velodyne is mounted on top of Uber’s self-driving cars. It produces a 360-degree, 3D image of the car’s surroundings multiple times per second by firing laser pulses at an extremely high rate and measuring how long each pulse takes to bounce back from the surface it hits. LIDAR is quite adept at detecting static and moving objects, day or night, but it has limitations in adverse weather such as fog, rain, and snow.
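
The time-of-flight arithmetic behind that measurement is straightforward: the pulse’s round-trip time, multiplied by the speed of light and halved, gives the distance to the reflecting surface. Here is a minimal sketch with an illustrative return time (not a figure from Velodyne or Uber):

```python
# Time-of-flight range calculation used by LIDAR (illustrative values only).
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def range_from_return_time(round_trip_seconds: float) -> float:
    """Distance to the reflecting surface: half the round trip at the speed of light."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that returns after about 200 nanoseconds hit something roughly 30 m away.
print(round(range_from_return_time(200e-9), 1))  # ~30.0
```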

RADAR: Uber uses radio detection and ranging (RADAR) for another 360-degree view of the car’s surroundings. RADAR sends out radio waves at a specific frequency and listens for the signal that bounces back, which reveals where cars and obstacles are positioned and how fast they are traveling. RADAR isn’t degraded by weather the way LIDAR is, but it doesn’t capture the size and shape of objects as accurately.
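
RADAR ranging uses the same round-trip timing as the LIDAR sketch above, just with radio waves, and the Doppler shift of the returned signal additionally gives the target’s speed along the line of sight. The sketch below uses the standard textbook formulas; the 77 GHz carrier frequency is a typical automotive radar value assumed for illustration, not a detail confirmed by Uber.

```python
# Standard radar relationships (illustrative): range from round-trip time,
# radial speed from Doppler shift. The 77 GHz carrier is an assumed typical
# automotive radar frequency, not Uber's published spec.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def radar_range(round_trip_seconds: float) -> float:
    """Distance to the target: half the round trip at the speed of light."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

def radial_speed(doppler_shift_hz: float, carrier_hz: float = 77e9) -> float:
    """Closing speed of the target along the beam (positive = approaching)."""
    return doppler_shift_hz * SPEED_OF_LIGHT / (2.0 * carrier_hz)

print(round(radar_range(400e-9), 1))    # ~60.0 m away
print(round(radial_speed(5137.0), 1))   # ~10.0 m/s closing speed
```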

Cameras: According to Uber, its self-driving cars use short- and long-range optical cameras. Front-facing cameras cover both the near and far field, watching for braking vehicles, crossing pedestrians, traffic lights, and signage. Side- and rear-facing cameras work together to build a continuous view of the vehicle’s surroundings.
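
To give a feel for how a camera frame gets scanned for pedestrians, the sketch below uses OpenCV’s classic HOG person detector. This is only an illustrative stand-in, far simpler than anything in a production autonomy stack, and the image filename is hypothetical.

```python
# Classic HOG + linear-SVM pedestrian detector from OpenCV -- an illustrative
# stand-in, not Uber's vision system.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

frame = cv2.imread("front_camera_frame.jpg")  # hypothetical front-camera frame
boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8),
                                      padding=(8, 8), scale=1.05)

for (x, y, w, h) in boxes:
    # Each box is a pedestrian candidate; a real stack would fuse these with
    # LIDAR and RADAR returns before deciding whether to brake.
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("front_camera_frame_annotated.jpg", frame)
```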

Antennae: Roof-mounted antennae provide GPS positioning and wireless data capabilities. Uber pre-maps its routes to build a high-resolution 3D map of the area, and its self-driving cars then compare what their sensors see in real time against that pre-built map.
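
That “compare against the map” step is essentially a matching problem: landmarks observed by the sensors are aligned with landmarks stored in the pre-built map to refine a rough GPS fix. The toy sketch below uses made-up 2D coordinates to show the idea; real systems align dense 3D point clouds rather than a handful of points.

```python
# Toy map-matching sketch (hypothetical coordinates, not Uber's actual maps):
# estimate the position error as the average mismatch between observed and
# mapped landmarks, then correct the rough GPS fix by that amount.

map_landmarks = [(10.0, 5.0), (12.0, 7.5), (15.0, 5.5)]  # from the pre-built 3D map
observed      = [(10.4, 5.3), (12.4, 7.8), (15.4, 5.8)]  # same landmarks as seen by
                                                         # the sensors, via the GPS fix

def mean_offset(observed, mapped):
    """Average displacement between matched observation/map pairs."""
    dx = sum(o[0] - m[0] for o, m in zip(observed, mapped)) / len(mapped)
    dy = sum(o[1] - m[1] for o, m in zip(observed, mapped)) / len(mapped)
    return dx, dy

dx, dy = mean_offset(observed, map_landmarks)
gps_fix = (100.0, 200.0)                        # rough position from the antennae
corrected = (gps_fix[0] - dx, gps_fix[1] - dy)  # shift the fix so the landmarks line up
print(corrected)  # roughly (99.6, 199.7)
```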

Human Safety Driver: Human safety drivers are the last line of defense in the event a self-driving system fails. Unfortunately, the human failed in this case, too. Self-driving car tests have been underway for a while now, but some states, including Arizona and California, are starting to allow companies to test self-driving cars without a person behind the steering wheel.

About The Author

Steve Crowe

Steve Crowe is Editor of The Robot Report. He joined WTWH Media in January 2018 after spending four-plus years as Managing Editor of Robotics Trends Media. He can be reached at scrowe@wtwhmedia.com

Comments

  1. Sara Stevenson says

    October 21, 2018 at 4:22 am

    How does the lidar etc. impact the people, animals, etc. it detects? If I’m living in a busy city with many self-driving cars, I’m likely to be scanned many, many times a day. Are there any health implications?



