The Robot Report

This robot hand can manipulate objects without seeing them

By Brianna Wessling | May 9, 2023

Researchers at Columbia Engineering have designed a robotic hand that combines an advanced sense of touch with motor-learning algorithms, allowing it to manipulate objects without relying on vision.

The robotic hand, equipped with five fingers and 15 independently actuated joints, can execute arbitrarily large rotations of an irregularly shaped grasped object while maintaining a stable, secure hold, all without any visual feedback. This is a difficult manipulation task for robots because it requires a subset of fingers to constantly reposition while the remaining fingers keep the object stable.

“While our demonstration was on a proof-of-concept task, meant to illustrate the capabilities of the hand, we believe that this level of dexterity will open up entirely new applications for robotic manipulation in the real world,” Matei Ciocarlie, associate professor in the Departments of Mechanical Engineering and Computer Science, said. “Some of the more immediate uses might be in logistics and material handling, helping ease up supply chain problems like the ones that have plagued our economy in recent years, and in advanced manufacturing and assembly in factories.”

While robots out in the world are performing more and more complex manipulation tasks, the vast majority, if not all, rely on vision to do so. Columbia University’s robot hand is immune to poor lighting, occlusion, and similar problems that can prevent vision-dependent robots from working properly.

For this research, Ciocarlie’s team built on previous work with Ioannis Kymissis, a professor of electrical engineering, in which they developed a new generation of optics-based tactile robot fingers. These fingers achieved contact localization with sub-millimeter precision while providing complete coverage of a complex, multi-curved surface.

Each finger on the new robotic hand is equipped with the team’s touch-sensing technology. The team, led by Gagan Khandate, a doctoral researcher in Ciocarlie’s lab, tested the hand’s ability to perform complex manipulation tasks using new methods for motor learning.

In particular, the team used deep reinforcement learning, augmented with new algorithms it developed for effective exploration of possible motor strategies. These motor-learning algorithms trained exclusively on tactile and proprioceptive data. Using these methods, the robot completed about a year of practice in only hours of real time.
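To illustrate the core idea of vision-free motor learning, here is a toy sketch: a hypothetical in-hand rotation environment whose observations contain only simulated tactile contact forces and joint angles (proprioception), never the object pose, searched with simple random exploration over linear policies. Everything here is an illustrative assumption; the environment, reward, and search method are invented stand-ins, not the Columbia team's simulator or their deep reinforcement learning algorithms.

```python
import random
import math

class TactileRotationEnv:
    """Toy stand-in for in-hand rotation. The 'hand' observes only
    tactile contact forces and joint angles -- never the hidden
    object orientation. Entirely hypothetical."""

    def __init__(self, n_fingers=5, seed=0):
        self.n = n_fingers
        self.reset()

    def reset(self):
        self.angle = 0.0               # hidden object orientation
        self.joints = [0.0] * self.n   # proprioceptive state
        return self._obs()

    def _obs(self):
        # Tactile: one simulated contact force per finger;
        # proprioceptive: the joint angles themselves.
        tactile = [max(0.0, math.cos(self.angle - j)) for j in self.joints]
        return tactile + self.joints

    def step(self, action):
        # action: per-finger joint deltas; the object rotates with
        # the average finger motion.
        for i, a in enumerate(action):
            self.joints[i] += a
        self.angle += sum(action) / self.n
        reward = sum(action) / self.n        # reward rotation progress
        # Penalize losing the grasp: fingers spread too far apart.
        if max(self.joints) - min(self.joints) > 1.0:
            reward -= 1.0
        return self._obs(), reward

def evaluate(env, policy, steps=20):
    """Run one episode and return cumulative reward."""
    obs = env.reset()
    total = 0.0
    for _ in range(steps):
        obs, r = env.step(policy(obs))
        total += r
    return total

def train(episodes=200, seed=1):
    """Crude exploration of motor strategies: random search over
    linear policies mapping observations to clipped joint deltas."""
    rng = random.Random(seed)
    env = TactileRotationEnv()
    dim = 2 * env.n
    best_score = -float("inf")
    for _ in range(episodes):
        w = [[rng.gauss(0, 0.1) for _ in range(dim)] for _ in range(env.n)]
        policy = lambda obs, w=w: [
            max(-0.1, min(0.1, sum(wi * o for wi, o in zip(row, obs))))
            for row in w
        ]
        best_score = max(best_score, evaluate(env, policy))
    return best_score
```

The point of the sketch is the observation space: the policy input contains no object pose or camera data, so any competence it gains comes from touch and proprioception alone, which mirrors the constraint the researchers imposed on their far more capable learner.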

The Columbia Engineering team then transferred the manipulation skills learned in simulation to the real robot hand, which could then achieve a high level of dexterity. 

“In this study, we’ve shown that robot hands can also be highly dexterous based on touch sensing alone. Once we also add visual feedback into the mix along with touch, we hope to be able to achieve even more dexterity, and one day start approaching the replication of the human hand,” Ciocarlie said. 

About The Author

Brianna Wessling

Brianna Wessling is an Associate Editor, Robotics, WTWH Media. She joined WTWH Media in November 2021, after graduating from the University of Kansas with degrees in Journalism and English. She covers a wide range of robotics topics, but specializes in women in robotics, autonomous vehicles, and space robotics.

She can be reached at bwessling@wtwhmedia.com

Copyright © 2025 WTWH Media LLC. All Rights Reserved. The material on this site may not be reproduced, distributed, transmitted, cached or otherwise used, except with the prior written permission of WTWH Media