The Robot Report


Dual-armed robot learns to perform bimanual tasks from simulation

By Brianna Wessling | August 24, 2023

Video (lifting task): https://www.therobotreport.com/wp-content/uploads/2023/08/lifting_task.mp4

Researchers at the University of Bristol, based at the Bristol Robotics Laboratory, have designed a Bi-Touch system that allows robots to carry out manual tasks by learning what to do from a digital helper trained in simulation. The system helps a bimanual robot display tactile sensitivity close to human-level dexterity, using AI to inform its actions.

The research team developed a tactile dual-arm robotic system that learns bimanual skills through deep reinforcement learning (Deep-RL). This kind of learning teaches robots by letting them learn from trial and error, much as a dog is trained with rewards and punishments.
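The trial-and-error idea can be sketched with a toy example (hypothetical code, not the team's implementation): an agent repeatedly tries grip forces on a fragile object, is rewarded for a successful lift and penalized for dropping or crushing it, and gradually converges on the right force.

```python
import random

# Toy illustration of trial-and-error (reinforcement) learning: the agent
# picks a discrete grip force and learns from reward (lift succeeds) or
# penalty (object slips or breaks). Purely hypothetical, for intuition only.
FORCES = [1, 2, 3, 4, 5]          # discrete grip-force choices
q = {f: 0.0 for f in FORCES}      # learned value estimate for each choice
ALPHA, EPSILON = 0.1, 0.2         # learning rate, exploration rate

def reward(force):
    """+1 only for the 'just right' force; too weak or too strong is penalized."""
    if force < 3:
        return -1.0               # too weak: object slips and drops
    if force > 3:
        return -1.0               # too strong: fragile object breaks
    return 1.0

random.seed(0)
for _ in range(500):
    # Explore occasionally; otherwise exploit the best-known action.
    if random.random() < EPSILON:
        a = random.choice(FORCES)
    else:
        a = max(q, key=q.get)
    q[a] += ALPHA * (reward(a) - q[a])   # nudge estimate toward observed reward

best = max(q, key=q.get)   # after training, the agent prefers force 3
```

The same reward-driven loop, scaled up to deep networks and continuous tactile observations, is what Deep-RL does for the bimanual tasks described here.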

The team started by building a virtual environment containing two robot arms equipped with tactile sensors. Next, they designed reward functions and a goal-update mechanism to encourage the robot agents to learn the bimanual tasks. They then built a real-world tactile dual-arm robot system on which to deploy the trained agents.
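As a rough sketch of those two ingredients (the function names and shaping terms here are illustrative assumptions, not taken from the paper), a reward function might score progress toward a lift goal while penalizing excessive contact force, and a goal-update might raise the target once the current goal is reached:

```python
# Hypothetical reward function and goal-update mechanism for a bimanual
# lifting task. The real system's reward design may differ; this only
# illustrates the kind of signals such a setup uses.

def lift_reward(object_height, goal_height, contact_force, max_force=5.0):
    """Dense reward: closer to the goal height is better; crushing is penalized."""
    r = -abs(goal_height - object_height)
    if contact_force > max_force:
        r -= (contact_force - max_force)   # penalize damaging grip pressure
    return r

def update_goal(object_height, goal_height, step=0.02, tol=0.005):
    """Once the object is within tolerance of the goal, raise the goal height."""
    if abs(goal_height - object_height) < tol:
        return goal_height + step
    return goal_height
```

A goal-update like this keeps the task challenging as the agent improves, instead of letting it stop learning once the first target height is reached.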

“With our Bi-Touch system, we can easily train AI agents in a virtual world within a couple of hours to achieve bimanual tasks [tailored to] the touch. And more importantly, we can directly apply these agents from the virtual world to the real world without further training,” lead author Yijiong Lin from the University of Bristol’s Faculty of Engineering, said. “The tactile bimanual agent can solve tasks even under unexpected perturbations and manipulate delicate objects in a gentle way.”

In robotic manipulation, for example, the robot learns to make decisions by attempting various behaviors to achieve designated tasks, such as lifting objects without dropping or breaking them. When the robot succeeds, it gets a reward; when it fails, it learns what not to do.

Over time, it figures out the best ways to grasp objects through these rewards and punishments. The AI agent is visually blind during this learning, relying only on tactile feedback and proprioceptive feedback, the body's ability to sense movement, action, and location.
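A minimal sketch of what such a vision-free observation might look like (the structure and names are assumptions for illustration; the paper's actual observation space may differ):

```python
# The agent's input contains only tactile readings from each arm's fingertip
# sensor plus proprioception (joint positions) -- no camera images at all.

def build_observation(left_tactile, right_tactile, left_joints, right_joints):
    """Concatenate tactile and proprioceptive signals into one flat vector."""
    obs = []
    obs.extend(left_tactile)    # e.g. per-taxel pressure readings, left arm
    obs.extend(right_tactile)   # per-taxel pressure readings, right arm
    obs.extend(left_joints)     # left-arm joint positions (proprioception)
    obs.extend(right_joints)    # right-arm joint positions
    return obs                  # note: no visual features anywhere

obs = build_observation([0.1, 0.0], [0.2, 0.3], [0.5, -0.4, 1.2], [0.6, -0.3, 1.1])
```

Restricting the observation this way is what makes the learned skills transferable to a real robot whose cameras are absent or unreliable: the policy only ever depends on signals both the simulation and the hardware can provide.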

“Our Bi-Touch system showcases a promising approach with affordable software and hardware for learning bimanual [behaviors] with touch in simulation, which can be directly applied to the real world,” co-author Professor Nathan Lepora said. “Our developed tactile dual-arm robot simulation allows further research on more different tasks as the code will be open-source, which is ideal for developing other downstream tasks.”

Using this method, the researchers successfully enabled the dual-arm robot to safely lift items as fragile as a single Pringle chip. The development could prove useful in industries such as fruit picking and domestic service, and could eventually help recreate the sense of touch in artificial limbs.

The team’s research was published in IEEE Robotics and Automation Letters. 

About The Author

Brianna Wessling

Brianna Wessling is an Associate Editor, Robotics, WTWH Media. She joined WTWH Media in November 2021, after graduating from the University of Kansas with degrees in Journalism and English. She covers a wide range of robotics topics, but specializes in women in robotics, autonomous vehicles, and space robotics.

She can be reached at [email protected]


