The Robot Report

UC San Diego, MIT researchers create Open-TeleVision, an immersive teleoperation system

By Brianna Wessling | July 7, 2024

A Unitree H1 robot performing unloading tasks using Open-TeleVision. | Source: MIT, UC San Diego

Teleoperation can be a powerful method, not only for performing complex tasks, but also for collecting on-robot data. This data is essential for robot learning from demonstrations, as teleoperation offers accurate and precise examples, plus natural and smooth trajectories for imitation learning. These qualities allow the learned policies to generalize to new environments, configurations, and tasks.

Thanks to large-scale, real-robot data, learning-based robotic manipulation has advanced to a new level in the past few years, but that doesn’t mean it’s without limitations. Currently, there are two major components in most teleoperation systems: actuation and perception. 

For actuation, many engineers use joint copying to puppeteer the robot, providing high control bandwidth and precision. However, this requires the operator and the robot to be physically in the same location, ruling out remote control. Each piece of the robot’s hardware also needs to be coupled with specific teleoperation hardware.

In addition, these systems are not yet able to operate multi-finger dexterous hands. 

The most straightforward way to handle perception is to observe the robot task space with the operator’s own eyes in a third-person or first-person view. Such an approach will inevitably result in part of the scene being occluded during teleoperation. The operator also cannot ensure the collected demonstration has captured the visual observation needed for policy learning. 

On top of that, for fine-grained manipulation tasks, it’s difficult for the teleoperator to look closely and intuitively at the object during manipulation. Displaying a third-person static camera viewer using passthrough in a virtual reality (VR) headset can result in similar challenges.

A team of researchers from the Massachusetts Institute of Technology and the University of California, San Diego, said it hopes to achieve a new level of intuitiveness and ease of use in teleoperation systems, ensuring high-quality, diverse, and scalable data. To do this, the team has proposed an immersive teleoperation system called Open-TeleVision. 


How does Open-TeleVision work?

The MIT and UC San Diego team said Open-TeleVision allows operators to actively perceive the robot’s surroundings in a stereoscopic manner. Open-TeleVision is a general framework that allows users to perform teleoperation with high precision. It applies across different VR devices, robots, and manipulators, and it is open-source.

The system mirrors the operator’s arm and hand movements on the robot. The team says this creates an immersive experience as if the operator’s mind is transmitted to a robot embodiment.

The researchers tested the system with two humanoid robots: the Unitree H1, which has multi-finger hands, and the Fourier GR1, which has parallel-jaw grippers. 

To validate Open-TeleVision, the team started by capturing the human operators’ hand poses and re-targeting them to control the hands or grippers. It relied on inverse kinematics to convert the operator’s hand root position into the position of the robot arm’s end effector.
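As a rough illustration of this kind of inverse-kinematics step (a generic damped-least-squares solver on a planar two-link arm, not the team’s actual retargeting pipeline), the core update that pulls an end effector toward a tracked hand position could be sketched as:

```python
import numpy as np

def fk(q, links=(0.3, 0.25)):
    """Forward kinematics of a planar 2-link arm: joint angles -> end-effector (x, y)."""
    l1, l2 = links
    return np.array([l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1]),
                     l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])])

def jacobian(q, links=(0.3, 0.25)):
    """Jacobian of the end-effector position with respect to the joint angles."""
    l1, l2 = links
    s1, s12 = np.sin(q[0]), np.sin(q[0] + q[1])
    c1, c12 = np.cos(q[0]), np.cos(q[0] + q[1])
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def ik_step(q, target, damping=1e-2):
    """One damped-least-squares update moving the end effector toward `target`."""
    err = target - fk(q)
    J = jacobian(q)
    # dq = J^T (J J^T + lambda^2 I)^-1 err  -- the damped pseudo-inverse step
    dq = J.T @ np.linalg.solve(J @ J.T + damping**2 * np.eye(2), err)
    return q + dq

# Track a hand position streamed from the VR device (hypothetical target):
q = np.array([0.2, 0.4])
target = np.array([0.35, 0.25])
for _ in range(50):
    q = ik_step(q, target)
residual = np.linalg.norm(target - fk(q))  # shrinks toward zero as IK converges
```

In a real teleoperation loop, the target would be updated every frame from the headset’s hand tracking, and the solver would run on the full arm kinematics rather than this toy two-link model.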

The team tested the effectiveness of the system by collecting data and training imitation-learning policies on four long-horizon precise tasks. These included can sorting, can insertion, folding, and unloading.

More dexterous robotic manipulation offers benefits

The researchers said their major contribution to allowing fine-grained manipulations comes from perception. Open-TeleVision incorporates VR systems with active visual feedback. 

To do this, the team used a single active stereo RGB camera placed on the robot’s head. The camera is mounted on an actuated neck with two or three degrees of freedom, mimicking human head movement to observe a large workspace.

During teleoperation, the camera moves along with the operator’s head, streaming real-time, egocentric 3D observations to the VR device. The human operator can see what the robot sees. The researchers said this first-person active sensing brings benefits for both teleoperation and policy learning.
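Mapping the headset pose to a two-degree-of-freedom neck means dropping roll and clamping to actuator limits. A minimal sketch of such a mapping, assuming a z-up frame, ZYX Euler order, and a hypothetical 60-degree actuator limit (none of which are specified in the article), might look like:

```python
import math

def neck_command(qw, qx, qy, qz, limit_deg=60.0):
    """Map a VR headset orientation quaternion (w, x, y, z) to (yaw, pitch)
    commands for a 2-DoF robot neck. Assumes a z-up frame and ZYX Euler order;
    roll is discarded because the neck cannot reproduce it."""
    # Yaw: rotation about the vertical (z) axis
    yaw = math.atan2(2.0 * (qw * qz + qx * qy), 1.0 - 2.0 * (qy * qy + qz * qz))
    # Pitch: look up/down, with the asin argument clamped for numerical safety
    pitch = math.asin(max(-1.0, min(1.0, 2.0 * (qw * qy - qz * qx))))
    limit = math.radians(limit_deg)  # hypothetical actuator travel limit
    clamp = lambda a: max(-limit, min(limit, a))
    return clamp(yaw), clamp(pitch)

# A headset turned 30 degrees to the left maps to a 30-degree neck yaw:
half = math.pi / 12  # half-angle of a 30-degree rotation about z
yaw, pitch = neck_command(math.cos(half), 0.0, 0.0, math.sin(half))
```

Streaming these commands each frame is what lets the robot’s camera follow the operator’s gaze.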

For teleoperation, the system provides a more intuitive mechanism for users to explore a broader view by moving the robot’s head, allowing them to attend to the important regions for detailed interactions. For imitation learning, the policy learns to actively move the robot’s head toward manipulation-related regions. It also reduces the number of pixels to process, enabling smooth, real-time, and precise closed-loop control.

In addition, the MIT and UC San Diego researchers highlighted the benefits of perception that come with streaming stereoscopic video for the robot view to human eyes. This gives the operator a better spatial understanding, which is crucial for completing tasks, they said.

The team also showed how training with stereo image frames can improve the performance of the policy.
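One common way to feed stereo frames to a policy is to stack the left and right images channel-wise into a single observation. The sketch below trains a toy behavior-cloning policy on synthetic data under that assumption; the linear policy, tiny frames, and ridge-regression fit are illustrative stand-ins, not the researchers’ actual architecture or training setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def stack_stereo(left, right):
    """Channel-wise stack of left/right RGB frames into one observation vector."""
    return np.concatenate([left, right], axis=-1).astype(np.float32).reshape(-1) / 255.0

# Hypothetical demonstration buffer: N teleoperated steps of (observation, action).
N, H, W, A = 200, 8, 8, 4  # tiny frames keep the sketch fast
obs = np.stack([
    stack_stereo(rng.integers(0, 256, (H, W, 3)), rng.integers(0, 256, (H, W, 3)))
    for _ in range(N)
])
expert = rng.normal(size=(obs.shape[1], A))
actions = obs @ expert  # synthetic "expert" actions standing in for teleop data

# Behavior cloning with a linear policy: ridge-regularized least squares.
lam = 1e-3
policy = np.linalg.solve(obs.T @ obs + lam * np.eye(obs.shape[1]), obs.T @ actions)
train_err = np.abs(obs @ policy - actions).max()  # small on the demonstrations
```

The stereo stacking is the point of interest here: because both eyes’ views enter the observation, the learned policy can exploit the same disparity cues that give the human operator depth perception.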

A key benefit of the system is that it enables an operator to remotely control robots via the Internet. One of the authors, MIT’s Ge Yang on the East Coast, was able to teleoperate the H1 robot at UC San Diego on the West Coast.

About The Author

Brianna Wessling

Brianna Wessling is an Associate Editor, Robotics, WTWH Media. She joined WTWH Media in November 2021, after graduating from the University of Kansas with degrees in Journalism and English. She covers a wide range of robotics topics, but specializes in women in robotics, autonomous vehicles, and space robotics.

She can be reached at [email protected]

Copyright © 2025 WTWH Media LLC. All Rights Reserved. The material on this site may not be reproduced, distributed, transmitted, cached or otherwise used, except with the prior written permission of WTWH Media
