The Robot Report

GIST researchers teach robots to identify partially hidden objects

By Brianna Wessling | April 22, 2022

| Source: Gwangju Institute of Science and Technology

A research team at the Gwangju Institute of Science and Technology (GIST) in Korea created a method to allow artificial intelligence (AI) vision systems to better identify objects that are cluttered together and may not be entirely visible. 

Kyoobin Lee, an associate professor at GIST, Ph.D. student Seunghyeok Back, and their research team at the university set out to create an AI system that could identify and sort objects in cluttered scenes. The team quickly found that robotic vision systems require large datasets of objects to be able to identify objects that aren't fully visible.

“We expect a robot to recognize and manipulate objects they have not encountered before or been trained to recognize,” Back said. “In reality, however, we need to manually collect and label data one by one as the generalizability of deep neural networks depends highly on the quality and quantity of the training dataset.”

Typically, when AI is presented with a scene of many cluttered objects, the system will try to identify each item based only on the parts of it that are visible. Lee and Back decided to take a different approach, and instead taught the AI to recognize the geometry of each object, so that it can infer what parts of the object it can’t see. 

Teaching the AI to do this required a much smaller dataset of 45,000 photorealistic synthetic images containing depth information. When the AI is presented with a scene, it begins to understand it by picking out an object of interest and then segmenting the object into a "visible mask" and an "amodal mask."

Comparing existing methods and the researchers' method of identifying partially obscured objects. | Source: Gwangju Institute of Science and Technology
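The visible/amodal distinction can be illustrated with a toy example. This sketch is illustrative only: the actual method is a learned deep network, and the scene, masks, and labels below are made up.

```python
import numpy as np

# Toy 6x6 scene: object A (the target) is partially occluded by
# object B, which sits in front of it.
# 0 = background, 1 = object A (visible), 2 = object B.
scene = np.array([
    [0, 0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0, 0],
    [0, 1, 2, 2, 2, 0],
    [0, 1, 2, 2, 2, 0],
    [0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0],
])

# Hypothetical ground truth: A's full (amodal) extent, including the
# cells hidden behind B. In the paper's setting this is what the
# network learns to predict; here it is simply given.
amodal_mask = np.array([
    [0, 0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0, 0],
    [0, 1, 1, 1, 0, 0],
    [0, 1, 1, 1, 0, 0],
    [0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0],
], dtype=bool)

# Visible mask: only the parts of A the camera actually sees.
visible_mask = (scene == 1)

# Occluded (invisible) region = amodal extent minus visible pixels.
invisible_mask = amodal_mask & ~visible_mask

print(visible_mask.sum(), amodal_mask.sum(), invisible_mask.sum())  # 5 9 4
```

The point of the amodal mask is exactly that last line: the system reasons about 9 pixels of object even though only 5 are visible.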

After segmenting the scene, the AI applies a hierarchical occlusion modeling (HOM) scheme, which ranks possible combinations of occluded features by how likely they are to be present. The team tested its HOM scheme against three benchmarks and found that it achieved state-of-the-art performance.
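The ranking step can be sketched roughly as follows. Note this is a stand-in only: the real HOM scheme is a learned network component, and the hypotheses and scores here are fabricated for illustration.

```python
# Illustrative stand-in for HOM's ranking step: score candidate
# occlusion hypotheses and sort them from most to least likely.
# In the actual system these scores come from a trained model.
candidates = [
    {"hypothesis": "handle hidden behind box",  "score": 0.91},
    {"hypothesis": "object fully visible",      "score": 0.34},
    {"hypothesis": "base hidden under a cloth", "score": 0.58},
]

# Highest-scoring hypothesis first.
ranked = sorted(candidates, key=lambda c: c["score"], reverse=True)

for c in ranked:
    print(f'{c["score"]:.2f}  {c["hypothesis"]}')
```

The top-ranked hypothesis then determines which hidden regions the system infers for the object.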

“Previous methods are limited to either detecting only specific types of objects or detecting only the visible regions without explicitly reasoning over occluded areas,” Back said. “By contrast, our method can infer the hidden regions of occluded objects like a human vision system. This enables a reduction in data collection efforts while improving performance in a complex environment.”

The research team's paper was accepted at the 2022 IEEE International Conference on Robotics and Automation (ICRA).

About The Author

Brianna Wessling

Brianna Wessling is an Associate Editor, Robotics, WTWH Media. She joined WTWH Media in November 2021, after graduating from the University of Kansas with degrees in Journalism and English. She covers a wide range of robotics topics, but specializes in women in robotics, robotics in healthcare, and space robotics.

She can be reached at [email protected]


