The Robot Report


Filter gives robots greater spatial perception for 6D object pose estimation

By The Robot Report Staff | July 14, 2019


Robots are good at making identical repetitive movements, such as a simple task on an assembly line. Pick up a cup. Turn it over. Put it down. But they lack the ability to perceive objects as those objects move through an environment. A human picks up a cup and puts it down in a random location, and the robot must retrieve it.

In a recent study of 6D object pose estimation, researchers at the University of Illinois at Urbana-Champaign, NVIDIA, the University of Washington, and Stanford University developed a filter that gives robots greater spatial perception, so they can manipulate objects and navigate through space more accurately.

While a 3D pose provides an object's location along the X, Y, and Z axes, its position relative to the camera, a 6D pose gives a much more complete picture.

“Much like describing an airplane in flight, the robot also needs to know the three dimensions of the object’s orientation – its yaw, pitch, and roll,” said Xinke Deng, doctoral student studying with Timothy Bretl, an associate professor in the Dept. of Aerospace Engineering at U of I.

And in real-life environments, all six of those dimensions are constantly changing.
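A 6D pose (three translation components plus three rotation angles) fully determines a rigid transform. A minimal sketch of the idea, assuming a Z-Y-X Euler convention, which the article does not specify:

```python
import numpy as np

def pose6d_to_matrix(x, y, z, yaw, pitch, roll):
    """Build a 4x4 rigid transform from a 6D pose: 3D translation (x, y, z)
    plus 3D orientation (yaw, pitch, roll), all relative to the camera frame.
    Illustrative only; angle conventions vary between systems."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    # Compose rotation in Z-Y-X (yaw-pitch-roll) order
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx   # orientation: the "extra" three dimensions
    T[:3, 3] = [x, y, z]       # translation: the familiar three
    return T
```

Dropping the three rotation angles recovers an ordinary 3D pose, which is exactly the information a robot is missing when it only knows where an object is, not which way it is facing.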

“We want a robot to keep tracking an object as it moves from one location to another,” Deng said.

Deng explained that the work was done to improve computer vision. He and his colleagues developed a filter to help robots analyze spatial data. The filter looks at each particle, or piece of image information collected by cameras aimed at an object, to help reduce judgment errors.

Overview of the PoseRBPF framework for 6D object pose tracking. The method leverages a Rao-Blackwellized particle filter and an auto-encoder network to estimate the 3D translation and a full distribution of the 3D rotation of a target object from a video sequence. | Credit: University of Illinois Department of Aerospace Engineering

“In an image-based 6D pose estimation framework, a particle filter uses a lot of samples to estimate the position and orientation,” Deng said. “Every particle is like a hypothesis, a guess about the position and orientation that we want to estimate. The particle filter uses observation to compute the value of importance of the information from the other particles. The filter eliminates the incorrect estimations.

“Our program can estimate not just a single pose but also the uncertainty distribution of the orientation of an object,” Deng said. “Previously, there hasn’t been a system to estimate the full distribution of the orientation of the object. This gives important uncertainty information for robot manipulation.”
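Deng's description maps onto the generic particle-filter loop: each particle is a hypothesis, hypotheses are weighted by how well they explain the observation, and low-weight (incorrect) hypotheses are eliminated by resampling. A toy sketch over 3D position only; the Gaussian likelihood here is an assumption for illustration, not the paper's learned observation model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Each particle is a hypothesis about the object's 3D position
# (orientation is omitted here for brevity).
n = 500
particles = rng.normal(loc=0.0, scale=1.0, size=(n, 3))

def importance_weights(particles, observation, sigma=0.2):
    """Weight each hypothesis by how well it explains the observation
    (toy Gaussian likelihood, normalized to sum to 1)."""
    d2 = np.sum((particles - observation) ** 2, axis=1)
    w = np.exp(-d2 / (2 * sigma ** 2))
    return w / w.sum()

observation = np.array([0.5, -0.3, 1.0])   # e.g., a detected object position
weights = importance_weights(particles, observation)

# Resampling in proportion to weight eliminates the incorrect estimations.
idx = rng.choice(n, size=n, p=weights)
particles = particles[idx]
```

In PoseRBPF the likelihood comes from comparing auto-encoder embeddings of the observed image against precomputed codes, but the weight-then-resample structure is the same.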

The study uses 6D object pose tracking in the Rao-Blackwellized particle filtering framework, where the 3D rotation and the 3D translation of an object are separated. This allows the researchers’ approach, called PoseRBPF, to efficiently estimate the 3D translation of an object along with the full distribution over the 3D rotation. As a result, PoseRBPF can track objects with arbitrary symmetries while still maintaining adequate posterior distributions.
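The Rao-Blackwellized split can be sketched as particles that each carry a sampled translation plus a full discrete distribution over rotations; that per-particle rotation distribution is what lets the filter represent symmetric objects, whose true orientation is inherently ambiguous. A minimal illustration, in which the bin count and random values are placeholders rather than the paper's rotation codebook:

```python
import numpy as np

rng = np.random.default_rng(1)

# Each particle: one sampled 3D translation, plus a full discrete
# distribution over rotation bins (rather than a single rotation sample).
n_particles, n_rotation_bins = 100, 72   # bin count is illustrative

translations = rng.normal(size=(n_particles, 3))

# One rotation distribution per particle; each row is normalized to sum to 1.
rot_logits = rng.normal(size=(n_particles, n_rotation_bins))
rot_dists = np.exp(rot_logits)
rot_dists /= rot_dists.sum(axis=1, keepdims=True)

# A symmetric object (e.g., a bowl) yields a flat or multi-modal rotation
# distribution; a particle represents that directly instead of collapsing
# to one orientation guess. Entropy summarizes per-particle uncertainty.
entropy = -(rot_dists * np.log(rot_dists)).sum(axis=1)
```

This factorization is why the filter stays efficient: translation is handled by sampling, while the rotation posterior is computed in closed form per particle instead of being sampled as well.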

“Our approach achieves state-of-the-art results on two 6D pose estimation benchmarks,” Deng said.

Editor’s Note: This article was republished from The Grainger College of Engineering at the University of Illinois at Urbana-Champaign.


Copyright © 2022 WTWH Media LLC. All Rights Reserved. The material on this site may not be reproduced, distributed, transmitted, cached or otherwise used, except with the prior written permission of WTWH Media