MIT researchers look under the road to aid self-driving cars

By Adam Conner-Simons | MIT News | February 24, 2020


Researchers have developed sensors to help self-driving cars better navigate in snow. Source: MIT CSAIL

Car companies and researchers have been feverishly working to improve the technologies behind self-driving cars. But so far, even the most high-tech vehicles still fail when it comes to safely navigating in rain and snow.

This is because these weather conditions wreak havoc on the most common sensing approaches, which usually rely on either lidar sensors or cameras. In snow, for example, cameras can no longer recognize lane markings and traffic signs, while falling rain, snow, or sleet scatters the laser pulses that lidar sensors depend on.


Parts of the localizing ground-penetrating radar system. Source: MIT CSAIL

MIT researchers have recently been wondering whether an entirely different approach might work. Specifically, what if we instead looked under the road?

Ground-penetrating radar

A team from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) developed a new system that uses an existing technology called “ground-penetrating radar” (GPR) to send electromagnetic pulses underground and measure the area’s specific combination of soil, rocks, and roots. The mapping process creates a unique fingerprint of sorts that the car can later use to localize itself when it returns to that particular stretch of road. Specifically, the CSAIL team used a form of GPR instrumentation developed at MIT Lincoln Laboratory called “localizing ground-penetrating radar,” or LGPR. (A startup called WaveSense is also aiming to commercialize the technology.)

“If you or I grabbed a shovel and dug it into the ground, all we’re going to see is a bunch of dirt,” said CSAIL Ph.D. student Teddy Ort, lead author on a new paper about the project that will be published in the IEEE Robotics and Automation Letters journal later this month. “But LGPR can quantify the specific elements there and compare that to the map it’s already created, so that it knows exactly where it is, without needing cameras or lasers.”
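To make the idea concrete, here is a minimal sketch of fingerprint-based localization in the spirit of what Ort describes: the car compares its current subsurface scan against a stored map of (position, signature) pairs and reports the best-matching position. The data layout and the correlation-based matcher are illustrative assumptions, not the actual MIT or Lincoln Laboratory implementation.

```python
import numpy as np

def localize(current_scan: np.ndarray, gpr_map: list) -> float:
    """Return the mapped position whose stored signature best matches the
    current scan. Each map entry is a (position, signature) pair, where a
    signature is a fixed-length vector of subsurface radar returns."""
    def similarity(a: np.ndarray, b: np.ndarray) -> float:
        # Normalized cross-correlation: 1.0 means a perfect match.
        a = (a - a.mean()) / (a.std() + 1e-9)
        b = (b - b.mean()) / (b.std() + 1e-9)
        return float(np.dot(a, b)) / len(a)

    # Pick the map position whose signature correlates best with the scan.
    return max(gpr_map, key=lambda entry: similarity(current_scan, entry[1]))[0]
```

In practice the search would presumably be restricted to a window around the vehicle’s last known position rather than sweeping the whole map on every update.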

Rain soaks into the ground

In tests, the team found that in snowy conditions the navigation system’s average margin of error was only about an inch, compared with its performance in clear weather. The researchers were surprised to find that LGPR had a bit more trouble in rainy conditions, though it was still off by an average of only 5.5 inches. This is because rain soaks more water into the ground, creating a larger disparity between the originally mapped LGPR reading and the current condition of the soil.

The researchers said LGPR’s robustness was further validated by the fact that, over a period of six months of testing, they never had to unexpectedly step in to take the wheel.

“Our work demonstrates that this approach is actually a practical way to help self-driving cars navigate poor weather without actually having to be able to ‘see’ in the traditional sense using laser scanners or cameras,” said MIT professor Daniela Rus, senior author on the new paper, which will also be presented in May at the International Conference on Robotics and Automation (ICRA) in Paris.

While researchers have only tested the system at low speeds on a closed country road, Ort said existing work from the Lincoln Laboratory suggests that the system could easily be extended to highways and other high-speed areas.

This is the first time that developers of self-driving systems have employed ground-penetrating radar, which has previously been used in fields such as construction planning, land mine detection, and even lunar exploration. The approach wouldn’t be able to work completely on its own, since it can’t detect things above ground. But its ability to localize in bad weather means it would couple nicely with lidar and vision approaches.
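To illustrate how that coupling might look, below is a hypothetical fusion step that weights a GPR position fix against a camera/lidar fix by each estimate’s uncertainty. Inverse-variance weighting is a standard textbook technique; the paper does not specify a fusion method, so this is purely a sketch.

```python
def fuse(pos_gpr: float, var_gpr: float, pos_optical: float, var_optical: float) -> float:
    """Combine two position estimates by inverse-variance weighting.
    In heavy snow or rain, var_optical grows, so the fused estimate
    leans toward the GPR fix; in clear weather the reverse holds."""
    w_gpr, w_optical = 1.0 / var_gpr, 1.0 / var_optical
    return (w_gpr * pos_gpr + w_optical * pos_optical) / (w_gpr + w_optical)
```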


The system is less precise in rainy conditions, as water affects the ground signature. Source: MIT CSAIL

“Before releasing autonomous vehicles on public streets, localization and navigation have to be totally reliable at all times,” said Roland Siegwart, a professor of autonomous systems at ETH Zurich, who was not involved in the project. “The CSAIL team’s innovative and novel concept has the potential to push autonomous vehicles much closer to real-world deployment.”


The current system is mounted on a hitch on the back of a car; further design advances would be needed to make it smaller and lighter. Source: MIT CSAIL

One major benefit of mapping out an area with LGPR is that underground maps tend to hold up better over time than maps created using vision or lidar, since features of an above-ground map are much more likely to change. LGPR maps also take up roughly 20% less space than the traditional 2D sensor maps that many companies use for their cars.

While the system represents an important advance, Ort said it’s far from road-ready. Future work will need to focus on designing mapping techniques that allow LGPR data sets to be stitched together to deal with multi-lane roads and intersections. In addition, the current hardware is bulky and six feet wide, so major design advances need to be made before it is small and light enough to fit into commercial vehicles.

Ort and Rus co-wrote the paper with CSAIL postdoctoral associate Igor Gilitschenski. The project was supported in part by MIT Lincoln Laboratory.
