MIT teaches robotic hands to reorient 2,000 objects in one hand

By Brianna Wessling | November 15, 2021

CSAIL’s program teaches robotic hands to manipulate objects facing upwards and downwards.

Most robotic hands don’t look or operate like human hands. They’re designed for limited, specific purposes, and need to have intimate knowledge of the object they’re handling. 

Researchers at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) are working to create a framework for a realistic robotic hand that can reorient more than 2,000 different objects without prior knowledge of each object's shape.

There are two core frameworks at play in CSAIL’s program: student-teacher learning and a gravity curriculum.

Student-teacher learning is a training method in which researchers give a teacher network specific information about an object and its environment. The teacher learns information that a robot wouldn’t easily be able to gather in the real world, like the specific velocity of an object. 

The teacher then gives this information to the student in the form of observations that a robot could make in the real world, like depth images of an object and joint positions of the robot. The student network then has the ability to learn from those observations and apply those techniques to a number of objects. 
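
As a rough illustration of how this kind of teacher-student distillation might look in code, here is a minimal sketch. It is not the authors' implementation: the network sizes, optimizer settings, and data are hypothetical placeholders, and the teacher is assumed to have already been trained (for example, with reinforcement learning) on privileged simulator state.

```python
# Hedged sketch of teacher-student distillation (not CSAIL's code).
# The teacher sees privileged simulator state (e.g., object velocity);
# the student sees only real-world-observable inputs (e.g., depth
# features and joint positions) and learns to imitate the teacher.
import torch
import torch.nn as nn

PRIV_DIM, OBS_DIM, ACT_DIM = 13, 64, 20   # hypothetical dimensions

teacher = nn.Sequential(nn.Linear(PRIV_DIM, 128), nn.ReLU(), nn.Linear(128, ACT_DIM))
student = nn.Sequential(nn.Linear(OBS_DIM, 128), nn.ReLU(), nn.Linear(128, ACT_DIM))
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

def distill_step(priv_state, observation):
    """One supervised update: the student regresses the teacher's action."""
    with torch.no_grad():
        target_action = teacher(priv_state)   # teacher assumed pre-trained
    pred_action = student(observation)
    loss = nn.functional.mse_loss(pred_action, target_action)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example call with random placeholder batches
print(distill_step(torch.randn(32, PRIV_DIM), torch.randn(32, OBS_DIM)))
```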

It was important to CSAIL’s team that the robot could handle objects with its hand facing either upwards or downwards, which required extra training. Robots struggle to handle objects when they must counteract gravity without the support of a palm beneath the object.

The scientists taught the robot to counteract gravity gradually. First, the robot learned in a simulation without gravity. Then, the researchers incrementally introduced gravity, giving the simulated hand time to learn how to hold objects even when gravity is at play.
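
A gravity curriculum of this kind can be as simple as a schedule that ramps the simulator's gravity from zero toward Earth gravity over training. The sketch below is illustrative only; the ramp length and the commented-out simulator call are assumptions, not details from the paper.

```python
# Hedged sketch of a gravity curriculum (illustrative, not CSAIL's code).
# Training starts at zero gravity and linearly ramps toward Earth gravity,
# so the policy first learns reorientation without having to fight drops.
EARTH_G = -9.81
RAMP_STEPS = 1_000_000        # hypothetical schedule length

def gravity_at(step: int) -> float:
    """Return the simulated gravity (m/s^2, z-axis) for a training step."""
    fraction = min(step / RAMP_STEPS, 1.0)
    return fraction * EARTH_G

# A simulator update each iteration might look like (hypothetical API):
# sim.set_gravity((0.0, 0.0, gravity_at(global_step)))
for step in (0, 250_000, 500_000, 1_000_000, 2_000_000):
    print(step, round(gravity_at(step), 2))
```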

With these frameworks, researchers found that their program was able to learn strategies for holding and manipulating objects that don’t rely on knowing the specific shape of the object. 

“We initially thought that visual perception algorithms for inferring shape while the robot manipulates the object was going to be the primary challenge,” said MIT professor Pulkit Agrawal, an author on the paper about the research. “To the contrary, our results show that one can learn robust control strategies that are shape agnostic. This suggests that visual perception may be far less important for manipulation than what we are used to thinking, and simpler perceptual processing strategies might suffice.”

CSAIL isn’t the first research lab to try to create anthropomorphic robot hands that operate like human ones. In 2019, OpenAI developed a program that trained a robot hand to solve a Rubik’s Cube. 

While other developers had already trained robots to solve Rubik’s Cubes in seconds, OpenAI aimed to train one to do so without prior knowledge of all possible orientations and combinations. OpenAI researchers hoped that teaching a robot to solve a Rubik’s Cube would help it develop dexterity that could carry over to handling a variety of objects. However, OpenAI recently disbanded its robotics research team, citing the lack of datasets large enough to effectively train reinforcement learning models.

CSAIL’s program was most effective with simple, round objects such as marbles, achieving a nearly 100% success rate. Not surprisingly, it struggled most with complex objects, such as a spoon or scissors, where the success rate dropped to about 30%.

CSAIL’s program operated entirely within simulated scenarios, but the researchers are optimistic the work can be applied to real robotic hands in the future. 

CSAIL’s full research can be found here. 

About The Author

Brianna Wessling

Brianna Wessling is an Associate Editor, Robotics, at WTWH Media. She joined WTWH Media in November 2021 and is a recent graduate of the University of Kansas. She can be reached at [email protected]
