The Robot Report

U.S. Army robots detect, share 3D changes in real time

By Steve Crowe | August 24, 2020

U.S. Army robots can detect physical changes in 3D and share the information in real time. | Credit: U.S. Army

Something is different, and you can’t quite put your finger on it. But your robot can.

Even small changes in your surroundings could indicate danger. Imagine a robot that could detect those changes and immediately send a warning to a display in your eyeglasses. That is what U.S. Army scientists are developing with sensors, robots, real-time change detection, and augmented reality wearables.

Army researchers demonstrated, in a real-world environment, the first human-robot team in which the robot detects physical changes in 3D and shares that information with a human in real time through augmented reality. The human can then evaluate the information and decide on follow-on action.
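
The article does not say how a detection travels from the robot to the headset. As a rough, hypothetical illustration only (every field name below is an assumption, not the system’s actual schema), a detected change could be packaged as a small serializable event for the AR display to render:

```python
# Hypothetical change-event message; the real system's schema is not
# published in this article. Field names are illustrative assumptions.
import json
import time
from dataclasses import asdict, dataclass, field

@dataclass
class ChangeEvent:
    """One detected environmental change, in the robot's map frame."""
    centroid: tuple      # (x, y, z) center of the changed region, meters
    extent: tuple        # bounding-box dimensions of the region, meters
    num_points: int      # number of LIDAR points that changed
    timestamp: float = field(default_factory=time.time)

# The robot would serialize each event and push it over its network link;
# the headset would deserialize it and draw an overlay at `centroid`.
event = ChangeEvent(centroid=(4.2, 4.3, 0.8), extent=(0.5, 0.5, 0.4),
                    num_points=300)
payload = json.dumps(asdict(event))
print(payload)
```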

“This could let robots inform their soldier teammates of changes in the environment that might be overlooked by or not perceptible to the soldier, giving them increased situational awareness and offset from potential adversaries,” said Dr. Christopher Reardon, a researcher at the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory. “This could detect anything from camouflaged enemy soldiers to IEDs.”

Part of the lab’s effort in contextual understanding through the Artificial Intelligence for Mobility and Maneuver Essential Research Program, this research explores how to provide contextual awareness to autonomous robotic ground platforms in maneuver and mobility scenarios. Researchers also participate with international coalition partners in the Technical Cooperation Program’s Contested Urban Environment Strategic Challenge, or TTCP CUESC, events to test and evaluate human-robot teaming technologies.

Most academic research in the use of mixed reality interfaces for human-robot teaming does not enter real-world environments, but rather uses external instrumentation in a lab to manage the calculations necessary to share information between a human and robot. Likewise, most engineering efforts to provide humans with mixed-reality interfaces do not examine teaming with autonomous mobile robots, Reardon said.

Reardon and his colleagues from the Army and the University of California, San Diego, published their research, “Enabling Situational Awareness via Augmented Reality of Autonomous Robot-Based Environmental Change Detection,” at the 12th International Conference on Virtual, Augmented, and Mixed Reality, part of the International Conference on Human-Computer Interaction.

The two Army robots used in the experiments are identically equipped, except for their LIDAR units: a Velodyne VLP-16 (left) and an Ouster OS1 (right). | Credit: U.S. Army

The research paired a small autonomous mobile ground robot from Clearpath Robotics, equipped with laser ranging sensors (LIDAR) to build a representation of the environment, with a human teammate wearing augmented reality glasses. As the robot patrolled the environment, it compared its current and previous readings to detect changes. Those changes were then instantly displayed in the human’s eyewear so the researchers could determine whether the human could interpret them.
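
The underlying comparison can be sketched in a few lines: flag any point in the current scan that has no nearby counterpart in the prior map. The following is a minimal nearest-neighbor illustration on synthetic data, assuming NumPy and SciPy; it is not the pipeline from the paper:

```python
# Minimal nearest-neighbor change detection on synthetic data; an
# illustration of the general idea, not the Army's implementation.
import numpy as np
from scipy.spatial import cKDTree

def detect_changes(reference: np.ndarray, current: np.ndarray,
                   threshold: float = 0.5) -> np.ndarray:
    """Return the points of `current` (N x 3) that lie farther than
    `threshold` meters from every point of `reference` (M x 3)."""
    dists, _ = cKDTree(reference).query(current, k=1)
    return current[dists > threshold]

# Toy scene: a 10 m x 10 m ground plane scanned at 10 cm spacing ...
xs, ys = np.meshgrid(np.arange(0, 10, 0.1), np.arange(0, 10, 0.1))
ground = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)])

# ... and a later pass in which a small object has appeared on it.
rng = np.random.default_rng(0)
obj = np.column_stack([rng.uniform(4.0, 4.5, 300),   # x
                       rng.uniform(4.0, 4.5, 300),   # y
                       rng.uniform(0.6, 1.0, 300)])  # z, above the plane
second_pass = np.vstack([ground, obj])

changed = detect_changes(ground, second_pass)
print(f"{len(changed)} points flagged as changed")  # the 300 object points
```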

In studying communication between the robot and human team, the researchers tested LIDAR sensors of different resolutions on the robot to collect measurements of the environment and detect changes. When those changes were shared with the human through augmented reality, the researchers found that human teammates could interpret changes that even the lower-resolution LIDARs detected. This indicates that, depending on the size of the changes expected in the environment, lighter, smaller, and less expensive sensors could perform just as well, and run faster in the process.
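
Continuing the sketch above, a lower-resolution sensor can be crudely emulated by randomly thinning both scans. The keep ratios here are illustrative and do not correspond to the Velodyne or Ouster units the researchers actually compared:

```python
# Emulate sparser LIDARs by randomly thinning the scans from the sketch
# above; keep ratios are illustrative, not the tested sensors' specs.
def subsample(points: np.ndarray, keep: float, seed: int = 0) -> np.ndarray:
    """Randomly keep a fraction of points to mimic a lower-resolution scan."""
    rng = np.random.default_rng(seed)
    return points[rng.random(len(points)) < keep]

# Even at a tenth of the original density, the object-sized change
# is still detected, just with proportionally fewer points.
for keep in (1.0, 0.5, 0.1):
    found = detect_changes(subsample(ground, keep, seed=0),
                           subsample(second_pass, keep, seed=1))
    print(f"keep {keep:4.0%}: {len(found)} changed points flagged")
```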

This capability has the potential to be incorporated into future soldier mixed-reality interfaces such as the Army’s Integrated Visual Augmentation System goggles, or IVAS.

“Incorporating mixed reality into soldiers’ eye protection is inevitable,” Reardon said. “This research aims to fill gaps by incorporating useful information from robot teammates into the soldier-worn visual augmentation ecosystem, while simultaneously making the robots better teammates to the soldier.”

Future studies will continue to explore how to strengthen the teaming between humans and autonomous agents by allowing the human to interact with the detected changes, which will give the robot more information about the context of each change (for example, whether it was made by adversaries, is a natural environmental change, or is a false positive), Reardon said. This will improve the autonomous context understanding and reasoning capabilities of the robotic platform, such as by enabling the robot to learn and predict what types of changes constitute a threat. In turn, providing this understanding to autonomy will help researchers learn how to improve teaming of soldiers with autonomous platforms.
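
One hypothetical way to close that loop, shown purely as a sketch (the descriptor and classifier below are assumptions, not the Army team’s design): store each human judgment as a labeled example over simple geometric features of the change, then label future detections by proximity to past ones.

```python
# Hypothetical feedback loop: the human labels detected changes, and the
# robot learns to guess labels for new ones. Descriptor and classifier
# are illustrative assumptions, not the Army team's design.
from collections import defaultdict
import numpy as np

def describe(change_points: np.ndarray) -> np.ndarray:
    """Crude descriptor of a change: bounding-box extent plus point count."""
    extent = change_points.max(axis=0) - change_points.min(axis=0)
    return np.append(extent, len(change_points))

class ChangeLabeler:
    """Nearest-centroid classifier over human-labeled past changes."""

    def __init__(self):
        self.examples = defaultdict(list)  # label -> list of descriptors

    def add_feedback(self, change_points: np.ndarray, label: str) -> None:
        """Record a human teammate's judgment of a detected change."""
        self.examples[label].append(describe(change_points))

    def predict(self, change_points: np.ndarray) -> str:
        """Label a new change by proximity to past feedback."""
        feat = describe(change_points)
        centroids = {lbl: np.mean(feats, axis=0)
                     for lbl, feats in self.examples.items()}
        return min(centroids, key=lambda lbl: np.linalg.norm(centroids[lbl] - feat))

# Example, reusing `obj` from the detection sketch above:
labeler = ChangeLabeler()
labeler.add_feedback(obj, "possible threat")
labeler.add_feedback(np.random.rand(40, 3) * 0.2, "natural change")
print(labeler.predict(obj))  # -> "possible threat"
```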

Editor’s Note: This article was republished from the U.S. Army CCDC Army Research Laboratory.

About The Author

Steve Crowe

Steve Crowe is Editor of The Robot Report and co-chair of the Robotics Summit & Expo. He joined WTWH Media in January 2018 after spending four-plus years as Managing Editor of Robotics Trends Media. He can be reached at [email protected]
