How Uber Self-Driving Cars See the World

By Steve Crowe | March 19, 2018


An investigation is underway into the Uber self-driving car accident that killed a pedestrian in Tempe, Arizona. Tempe police said the car was in autonomous mode at the time of the crash, was traveling at about 40 mph, and showed no signs of slowing down.

Update, Tuesday, March 20: According to the San Francisco Chronicle, Tempe police chief Sylvia Moir said the accident might have been unavoidable. From viewing the videos, “it’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway,” Moir said.

Uber certainly has a recording of the accident, so the complete story will eventually come out. Uber’s self-driving cars, like many others, are designed with redundancy in mind: if one sensor fails, another kicks in to do the job.
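
To make the redundancy idea concrete, here is a loose sketch of a fallback chain in Python. The sensor names and the arbitration logic are hypothetical; neither the article nor Uber describes how the software actually weighs its sensors.

```python
from typing import Callable, Optional

def detect_obstacle(sensors: list[Callable[[], Optional[float]]]) -> Optional[float]:
    """Return the first non-None range reading from a prioritized sensor chain."""
    for read in sensors:
        reading = read()
        if reading is not None:
            return reading
    return None  # every sensor came up empty; the safety driver is the last resort

# Hypothetical readings: lidar fails, so radar's measurement is used instead.
lidar = lambda: None          # simulated sensor failure
radar = lambda: 12.5          # meters to the nearest obstacle
camera = lambda: 12.8
print(detect_obstacle([lidar, radar, camera]))  # 12.5
```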

Clearly that didn’t happen here. Uber’s self-driving cars, as you can see in the graphic below, have multiple sensors that can detect pedestrians at night. But, again, police said the self-driving car showed no signs of slowing down. So it appears Uber’s system never detected 49-year-old Elaine Herzberg, who was walking her bicycle across the street, outside of a crosswalk, when she was struck.

There was a human safety driver in the car to take the wheel if necessary, but that didn’t happen, either. The driver, 44-year-old Rafaela Vasquez, showed no signs of being impaired, police said.

So was it a complete system failure? Did multiple redundancies fail? Was it a problem with the software’s safety parameters? Only time will tell, of course. Self-driving car expert Brad Templeton speculates that, perhaps, this was an edge case Uber was not prepared for. Templeton writes that “a person walking a bike across a non-crosswalk is an unusual thing compared to what you normally see. As such, Uber’s perception system may not be as capable of identifying and modeling that. It is something which may not have been tested as much in simulator.”

He continues, “Note that with the most advanced teams, they are doing very extensive testing of all the variations they can think of for situations like this in simulator. So this should be something the systems have seen before, at least virtually.”

Engineers can’t account for every edge case, but that’s where machine learning should enter the picture: Uber’s self-driving cars should learn how to react in unforeseen situations by analyzing vast amounts of data. Regardless of how this accident happened, it will ignite controversy around self-driving car regulations, especially around testing without human safety drivers.

Sensors in Uber self-driving cars

Here’s a look at the sensor setup used in Uber’s self-driving Volvo XC90 SUV.

Sensor suite in Uber’s self-driving Volvo XC90, which was involved in a fatal accident. (Credit: Uber ATG)

LIDAR: A light detection and ranging (LIDAR) system from Velodyne is mounted on top of Uber’s self-driving cars. It produces a 360-degree, 3D image of the car’s surroundings multiple times per second by firing laser pulses at an extremely high rate and measuring how long each pulse takes to reflect off a surface. LIDAR is quite adept at detecting static and moving objects, day or night, but it has limitations in adverse weather such as fog, rain, and snow.
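
The distance measurement follows directly from that round-trip timing. Below is a minimal sketch of the time-of-flight calculation, illustrative only, not Velodyne’s or Uber’s actual code.

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def range_from_time_of_flight(round_trip_seconds: float) -> float:
    """Distance to a surface from a laser pulse's round-trip travel time.

    The pulse travels out and back, so the one-way distance
    is half the total path: d = c * t / 2.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after 200 nanoseconds hit a surface roughly 30 m away.
print(range_from_time_of_flight(200e-9))  # ~29.98 meters
```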

RADAR: Uber uses radio detection and ranging (RADAR) for another 360-degree view of its surroundings. RADAR sends out radio waves at a specific frequency and listens for the reflected signal, which reveals where cars and obstacles are positioned and how fast they are moving. RADAR isn’t degraded by weather the way LIDAR is, but it doesn’t capture the size and shape of objects as accurately.
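
The speed estimate comes from the Doppler shift between the transmitted and reflected signals. Here is a minimal sketch of that relationship, assuming a 77 GHz carrier, a common automotive radar band but not a confirmed spec of Uber’s sensors.

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second
CARRIER_HZ = 77e9               # assumed automotive radar carrier frequency

def speed_from_doppler(doppler_shift_hz: float) -> float:
    """Radial speed of a target from the observed Doppler shift.

    The reflection doubles the shift, so v = f_d * c / (2 * f0).
    A positive value means the target is closing in.
    """
    return doppler_shift_hz * SPEED_OF_LIGHT / (2.0 * CARRIER_HZ)

# A +10 kHz shift at 77 GHz corresponds to ~19.5 m/s (about 70 km/h).
print(speed_from_doppler(10_000.0))
```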

Cameras: According to Uber, its self-driving cars use short- and long-range optical cameras. Front-facing cameras cover both the near and far field, watching for braking vehicles, crossing pedestrians, traffic lights, and signage. Side- and rear-facing cameras work together to construct a continuous view of the vehicle’s surroundings.

Antennae: Roof-mounted antennae provide GPS positioning and wireless data connectivity. Uber pre-maps its routes to build a high-resolution 3D map of the area; its self-driving cars then compare what they see in real time against that map.
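
At its simplest, comparing live observations against a pre-built map is a nearest-neighbor correction. The toy sketch below snaps a noisy GPS fix onto a hypothetical route map; real localization stacks fuse LIDAR, inertial, and odometry data, so this only illustrates the idea.

```python
import math

# Hypothetical pre-mapped route waypoints as (x, y) coordinates in meters.
ROUTE_MAP = [(0.0, 0.0), (10.0, 0.5), (20.0, 1.0), (30.0, 1.2)]

def snap_to_map(gps_fix: tuple[float, float]) -> tuple[float, float]:
    """Return the mapped waypoint nearest to a raw GPS fix."""
    return min(ROUTE_MAP, key=lambda waypoint: math.dist(waypoint, gps_fix))

# A fix with a couple of meters of GPS error snaps back onto the route.
print(snap_to_map((19.0, 3.0)))  # (20.0, 1.0)
```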

Human Safety Driver: Human safety drivers are the last line of defense in the event a self-driving system fails. Unfortunately, the human failed in this case, too. Self-driving car tests have been underway for a while now, but some states, including Arizona and California, are starting to allow companies to test self-driving cars without a person behind the steering wheel.

About The Author

Steve Crowe

Steve Crowe is Executive Editor, Robotics, WTWH Media, and chair of the Robotics Summit & Expo and RoboBusiness. He is also co-host of The Robot Report Podcast, the top-rated podcast for the robotics industry. He joined WTWH Media in January 2018 after spending four-plus years as Managing Editor of Robotics Trends Media. He can be reached at scrowe@wtwhmedia.com.

