The Robot Report


How Uber Self-Driving Cars See the World

By Steve Crowe | March 19, 2018


An investigation is underway into the Uber self-driving car accident that killed a pedestrian in Tempe, Arizona. Tempe police said the car was in autonomous mode at the time of the crash, was traveling about 40 mph, and showed no signs of slowing down.

Update, Tuesday, March 20: Tempe police chief Sylvia Moir, according to the San Francisco Chronicle, said the accident might have been unavoidable. “[From viewing the videos,] it’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway,” Moir said.

Uber certainly has a recording of the accident, so the complete story will eventually come out. Uber’s self-driving cars, like many others, are designed with redundancy in mind. This means if one sensor fails, another kicks in to do the job.
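
Here is a minimal sketch of that fallback idea, assuming a simple two-sensor setup. This is illustrative Python, not Uber’s actual software; the sensor names, staleness threshold, and data structure are all assumptions.

    # Illustrative only -- if the primary sensor's reading is missing or
    # stale, fall back to the secondary one before declaring a fault.
    from typing import Optional

    class SensorReading:
        def __init__(self, distance_m: float, age_s: float):
            self.distance_m = distance_m  # distance to nearest obstacle, meters
            self.age_s = age_s            # time since the reading was taken, seconds

    MAX_AGE_S = 0.2  # assumed staleness threshold, seconds

    def nearest_obstacle(lidar: Optional[SensorReading],
                         radar: Optional[SensorReading]) -> Optional[float]:
        """Prefer LIDAR; fall back to RADAR if LIDAR is missing or stale."""
        for reading in (lidar, radar):
            if reading is not None and reading.age_s <= MAX_AGE_S:
                return reading.distance_m
        return None  # both sensors failed: trigger a safe stop / alert the driver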

Clearly that didn’t happen here. Uber’s self-driving cars, as you can see in the graphic below, carry multiple sensors that can detect pedestrians at night. But, again, police said the self-driving car showed no signs of slowing down. So it appears Uber’s system never detected 49-year-old Elaine Herzberg, who was walking her bicycle across the street, outside of a crosswalk, when she was struck.

There was a human safety driver in the car to take the wheel if necessary, but that intervention didn’t happen, either. The safety driver, 44-year-old Rafaela Vasquez, showed no signs of being impaired, police said.

So was it a complete system failure? Did multiple redundancies fail? Was it a problem with the software’s safety parameters? Only time will tell, of course. Self-driving car expert Brad Templeton speculates that this was perhaps an edge case Uber was not prepared for. Templeton writes that “a person walking a bike across a non-crosswalk is an unusual thing compared to what you normally see. As such, Uber’s perception system may not be as capable of identifying and modeling that. It is something which may not have been tested as much in simulator.”

He continues, “Note that with the most advanced teams, they are doing very extensive testing of all the variations they can think of for situations like this in simulator. So this should be something the systems have seen before, at least virtually.”

Engineers can’t account for every possible edge case, but that’s where machine learning should enter the picture: Uber’s self-driving cars should learn how to react in unforeseen situations by analyzing vast amounts of data. Regardless of how this accident happened, it will ignite controversy around self-driving car regulations, especially around testing without human safety drivers.

Sensors in Uber self-driving cars

Here’s a look at the sensor setup used in Uber’s self-driving Volvo XC90 SUV.

[Figure: Sensor suite in Uber’s self-driving Volvo XC90, the model involved in the fatal accident. (Credit: Uber ATG)]

LIDAR: A light detection and ranging (LIDAR) system from Velodyne is mounted on top of Uber’s self-driving cars. It produces a 360-degree, 3D image of the car’s surroundings multiple times per second by firing laser pulses at an extremely high rate and measuring how long each pulse takes to reflect off a surface. LIDAR is adept at detecting static and moving objects, day or night, but it has limitations in adverse weather such as fog, rain, and snow.
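
The underlying math is simple time-of-flight. A minimal sketch, assuming a measured round-trip time (not drawn from Uber or Velodyne code):

    SPEED_OF_LIGHT = 299_792_458.0  # meters per second

    def lidar_range_m(round_trip_time_s: float) -> float:
        """Distance to a reflecting surface from a laser pulse's round-trip
        time. The pulse travels out and back, hence the division by two."""
        return SPEED_OF_LIGHT * round_trip_time_s / 2.0

    # Example: a pulse that returns after 200 nanoseconds
    print(lidar_range_m(200e-9))  # ~29.98 meters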

RADAR: Uber uses radio detection and ranging (RADAR) for another 360-degree view of its surroundings. RADAR emits radio waves at a specific frequency and waits to receive the reflected signal, which reveals where vehicles and obstacles are positioned and how fast they are moving. RADAR isn’t degraded by weather the way LIDAR is, but it doesn’t capture the size and shape of objects as accurately.
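
Speed measurement comes from the Doppler shift of the returned signal. A hedged sketch of the relationship; the 76 GHz carrier below is a typical automotive radar band, not a confirmed Uber spec:

    SPEED_OF_LIGHT = 299_792_458.0  # meters per second

    def radial_speed_mps(doppler_shift_hz: float, carrier_hz: float = 76e9) -> float:
        """Target speed toward (+) or away from (-) the radar, in m/s.
        Two-way reflection: f_d = 2 * v * f0 / c, solved here for v."""
        return doppler_shift_hz * SPEED_OF_LIGHT / (2.0 * carrier_hz)

    # Example: a 5 kHz shift on a 76 GHz carrier is roughly 9.9 m/s (~35 km/h)
    print(radial_speed_mps(5e3))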

Cameras: According to Uber, its self-driving cars use short- and long-range optical cameras. Front-facing cameras cover both the near and far field, watching for braking vehicles, crossing pedestrians, traffic lights, and signage. Side- and rear-facing cameras work together to construct a continuous view of the vehicle’s surroundings.
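
To illustrate what a “continuous view” requires, here is a small sketch that checks whether a set of camera fields of view covers the full 360 degrees around the vehicle; the camera angles are illustrative, not Uber’s actual configuration:

    def covers_360(cameras, step_deg=1):
        """True if every heading 0..359 falls inside at least one camera's
        field of view, given as (center_heading_deg, fov_width_deg) pairs."""
        for heading in range(0, 360, step_deg):
            covered = False
            for center, width in cameras:
                # shortest angular distance from the camera center to this heading
                diff = abs((heading - center + 180) % 360 - 180)
                if diff <= width / 2:
                    covered = True
                    break
            if not covered:
                return False
        return True

    # Example: front, side, and rear cameras with 120-degree lenses
    print(covers_360([(0, 120), (90, 120), (180, 120), (270, 120)]))  # True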

Antennae: Roof-mounted antennae provide GPS positioning and wireless data connectivity. Uber pre-maps its routes to build a high-resolution 3D map of the area; its self-driving cars then compare what they see with that prebuilt map.
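
A toy version of that comparison, assuming landmarks as 2D points. Production systems use full scan matching (e.g., ICP); this only conveys the idea:

    import math

    def match_score(observed, map_points, tolerance_m=0.5):
        """Fraction of observed landmarks lying within tolerance of a mapped
        point -- a crude stand-in for real scan matching."""
        matched = 0
        for ox, oy in observed:
            nearest = min(math.hypot(ox - mx, oy - my) for mx, my in map_points)
            if nearest <= tolerance_m:
                matched += 1
        return matched / len(observed)

    # Example: two of three observed points line up with the prebuilt map
    prior_map = [(0.0, 0.0), (5.0, 0.0), (5.0, 5.0)]
    live_scan = [(0.1, -0.1), (5.2, 0.1), (9.0, 9.0)]
    print(match_score(live_scan, prior_map))  # 0.666...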

Human Safety Driver: Human safety drivers are the last line of defense if a self-driving system fails. Unfortunately, the human layer failed in this case, too. Self-driving car tests have been underway for a while, but some states, including Arizona and California, are starting to allow companies to test self-driving cars without a person behind the steering wheel.

About The Author

Steve Crowe

Steve Crowe is Editorial Director, Robotics, WTWH Media, and co-chair of the Robotics Summit & Expo. He joined WTWH Media in January 2018 after spending four-plus years as Managing Editor of Robotics Trends Media. He can be reached at [email protected]

Comments

  1. Sara Stevenson says

    October 21, 2018 at 4:22 am

    How does the LIDAR, etc., impact the people and animals it detects? If I’m living in a busy city with many self-driving cars, I’m likely to be scanned many, many times a day. Are there any health implications?


Related Articles

  • UN allows autonomous vehicles to drive up to 130 km/h
  • Cruise hits milestone by charging for robotaxi rides in SF
  • MIT CSAIL releases open-source simulator for autonomous vehicles
  • Nvidia patent helps autonomous cars detect emergency vehicles

