The Robot Report

Inside NVIDIA’s new robotics research lab

By Steve Crowe | January 15, 2019


NVIDIA CEO Jensen Huang (left) and Senior Director of Robotics Research Dieter Fox at NVIDIA’s robotics lab.

The Robot Report named NVIDIA a must-watch robotics company in 2019 due to its new Jetson AGX Xavier Module that it hopes will become the go-to brain for next-generation robots. Now there’s even more reason to keep an eye on NVIDIA’s robotics moves: the Santa Clara, Calif.-based chipmaker just opened its first full-blown robotics research lab.

Located in Seattle just a short walk from the University of Washington, NVIDIA’s robotics lab is tasked with driving breakthrough research to enable next-generation collaborative robots that operate robustly and safely among people. NVIDIA’s robotics lab is led by Dieter Fox, senior director of robotics research at NVIDIA and professor in the UW Paul G. Allen School of Computer Science and Engineering.

“All of this is working toward enabling the next generation of smart manipulators that can also operate in open-ended environments where not everything is designed specifically for them,” said Fox. “By pulling together recent advances in perception, control, learning and simulation, we can help the research community solve some of the greatest challenges in robotics.”

The 13,000-square-foot lab will be home to 50 roboticists, consisting of 20 NVIDIA researchers plus visiting faculty and interns from around the world. NVIDIA wants robots to be able to naturally perform tasks alongside people in real-world, unstructured environments. To do that, the robots need to be able to understand what a person wants to do and figure out how to help achieve a goal.

The idea for NVIDIA’s robotics lab came in the summer of 2017 in Hawaii. Fox and NVIDIA CEO Jensen Huang met at CVPR, an annual computer vision conference, and discussed exciting areas and difficult open problems in robotics.

“NVIDIA dedicates itself to solving the very difficult challenges that computing can solve. And robotics is unquestionably one of the final frontiers of artificial intelligence. It requires the convergence of so many types of technologies,” Huang told The Robot Report. “We wanted to dedicate ourselves to make a contribution to the field of robotics. Along the way it’s going to spin off all kinds of great computer science and AI knowledge. We really hope the technology that will be created will allow industries from healthcare to manufacturing to transportation and logistics to make a great advance.”

NVIDIA said about a dozen projects are currently underway, and the company will open-source its research papers. Fox said NVIDIA is primarily interested, early on at least, in sharing its software developments with the robotics community. “Some of the core techniques you see in the kitchen demo will be wrapped up into really robust components,” Fox said.

We attended the official opening of NVIDIA’s robotics research lab. Here’s a peek inside.

Mobile manipulator in the kitchen

NVIDIA’s mobile manipulator includes a Franka Emika Panda cobot on a Segway RMP 210 UGV. (Credit: NVIDIA)

The main test area inside NVIDIA’s robotics lab is a kitchen the company purchased from IKEA. A mobile manipulator, consisting of a Franka Emika Panda cobot arm on a Segway RMP 210 UGV, will try its hand at increasingly difficult tasks, ranging from retrieving objects from cabinets to learning how to clean the dining table to helping a person cook a meal.

During the open house, the mobile manipulator consistently fetched objects and put them in a drawer, opening and closing the drawer with its gripper. Fox admitted this first task is somewhat easy. The robot uses deep learning to detect specific objects solely based on its own simulation and doesn’t require any manual data labeling. The robot uses the NVIDIA Jetson platform for navigation and performs real-time inference for processing and manipulation on NVIDIA TITAN GPUs. The deep learning-based perception system was trained using the cuDNN-accelerated PyTorch deep learning framework.
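NVIDIA has not published the training code behind this demo, but the core idea of simulation-only training can be sketched in a few lines of numpy. The “renderer” below is a hypothetical stand-in for a real simulator: because the simulator places the object itself, every image comes with a ground-truth label for free, which is why no manual data labeling is needed.

```python
import numpy as np

rng = np.random.default_rng(0)

def render_synthetic_sample(img_size=64, obj_size=10):
    """Render one randomized training image with a free ground-truth label."""
    # Domain randomization: a random background texture.
    image = rng.uniform(0.0, 0.5, size=(img_size, img_size))
    # Random object placement and appearance (a bright square stands in
    # for a rendered object in a real simulator).
    x = int(rng.integers(0, img_size - obj_size))
    y = int(rng.integers(0, img_size - obj_size))
    image[y:y + obj_size, x:x + obj_size] = rng.uniform(0.7, 1.0)
    # Because the "simulator" placed the object, the bounding-box label
    # comes for free -- no manual annotation.
    return image, (x, y, obj_size, obj_size)

images, labels = zip(*[render_synthetic_sample() for _ in range(8)])
images = np.stack(images)
labels = np.array(labels)
```

In a real pipeline, batches like this would feed a PyTorch detector; the point here is only that the label generation is automatic.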

Fox also made it clear why NVIDIA chose to test a mobile manipulator in a kitchen. “The idea to choose the kitchen was not because we think the kitchen is going to be the killer app in the home,” said Fox. “It was really just a stand-in for these other domains.” A kitchen is a structured environment, but Fox said it is easy to introduce new variables to the robot in the form of more complex tasks, such as dealing with unknown objects or assisting a person who is cooking a meal.

Deep Object Pose Estimation

NVIDIA Deep Object Pose Estimation (DOPE) system. (Credit: NVIDIA)

NVIDIA introduced its Deep Object Pose Estimation (DOPE) system in October 2018 and it was on display in Seattle. With NVIDIA’s algorithm and a single image, a robot can infer the 3D pose of an object for the purpose of grasping and manipulation. DOPE was trained solely on synthetic data.

One of the key challenges of synthetic data is bridging the reality gap, so that networks trained on synthetic data operate correctly on real-world data. NVIDIA said its one-shot deep neural network, albeit on a limited basis, has accomplished that. The system estimates an object’s pose in two steps. First, the deep neural network estimates belief maps of 2D keypoints of all the objects in the image coordinate system. Next, peaks from these belief maps are fed to a standard perspective-n-point (PnP) algorithm to estimate the 6-DoF pose of each object instance.
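The first of those two steps can be illustrated with a small numpy sketch (a toy illustration, not NVIDIA’s code): scan a belief map for local maxima above a threshold. In the full pipeline, those 2D keypoints, paired with the object’s known 3D model points, would be handed to a standard PnP solver such as OpenCV’s solvePnP to recover the 6-DoF pose.

```python
import numpy as np

def belief_map_peaks(belief, threshold=0.5):
    """Return (row, col) coordinates of local maxima in a 2D belief map.

    Step 1 of a DOPE-style pipeline: the network outputs one belief map
    per object keypoint; its peaks are that keypoint's 2D image
    coordinates. (Step 2, not shown, feeds these 2D-3D correspondences
    to a PnP solver to estimate pose.)
    """
    h, w = belief.shape
    peaks = []
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            v = belief[r, c]
            if v < threshold:
                continue
            # Keep the pixel only if it dominates its 3x3 neighborhood.
            if v >= belief[r - 1:r + 2, c - 1:c + 2].max():
                peaks.append((r, c))
    return peaks

# Toy belief map: two Gaussian bumps whose centers we want to recover.
yy, xx = np.mgrid[0:32, 0:32]
belief = (np.exp(-((yy - 8) ** 2 + (xx - 10) ** 2) / 4.0)
          + np.exp(-((yy - 20) ** 2 + (xx - 25) ** 2) / 4.0))
peaks = belief_map_peaks(belief)
```

A production system would use a vectorized max-pooling comparison rather than Python loops, but the logic is the same.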

Read our interview about the DOPE system with Stan Birchfield, a Principal Research Scientist at NVIDIA.

ReFlex TakkTile 2 gripper from RightHand Robotics.

Tactile sensing

NVIDIA had two demos showcasing tactile sensing, which is a missing element for commercialized robotic grippers. One demo featured a ReFlex TakkTile 2 gripper from RightHand Robotics, which recently raised $23 million for its piece-picking technology. The ReFlex TakkTile 2 is a ROS-compatible robotic gripper with three fingers. The gripper has three bending DOFs and one coupled rotational DOF. Sensing capabilities include normal pressure sensors, rotational proximal joint encoders, and fingertip IMUs.

The other demo, run by NVIDIA senior robotics researcher Karl Van Wyk, featured SynTouch tactile sensors retrofitted onto an Allegro robotic hand from South Korea-based Wonik Robotics and a KUKA LBR iiwa cobot. “It almost feels like a pet!” said Huang as he gently touched the robotic fingers, causing them to pull back. “It’s surprisingly therapeutic. Can I have one?”

Van Wyk said tactile sensors are starting to trickle out of research labs and into the real world. “There is a lot of hardening and integration that needs to happen to get them to hold up in the real world, but we’re making a lot of progress there. The world we live in is designed for us, not robots.”

The KUKA LBR iiwa wasn’t using any vision to sense its environment. “The robot can’t see that we’re around it, but we want it to be constantly sensing and reacting to its environment. The arm has torque sensing in all of the joints, so it can feel that I’m pushing on it and react to that. It doesn’t need to see me to react to me.

“We have a 16-motor hand with three primary fingers and an opposable thumb, so it’s like our hands. The reason you want a more complicated gripper like this is you want to eventually be able to manipulate objects in your hands like we do on a daily basis. It is very useful and makes solving physical tasks more efficient. The SynTouch sensors measure what’s going on when we’re touching and manipulating something. Keying off those sensors is important for control. If we can feel the object, we can re-adjust the grip and the finger location.”
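Van Wyk’s point about keying off the tactile sensors suggests a pressure-servoing loop: each finger closes when it feels too little contact pressure and backs off when it feels too much. The sketch below is purely illustrative; the target pressure, gain, and toy contact model are invented for the example and are far simpler than any real controller.

```python
def adjust_grip(pressures, positions, target=0.6, gain=0.1):
    """One control tick: nudge each finger toward a target contact pressure."""
    new_positions = []
    for p, q in zip(pressures, positions):
        q += gain * (target - p)      # proportional correction
        q = min(1.0, max(0.0, q))     # joint limits (0 = open, 1 = closed)
        new_positions.append(q)
    return new_positions

# Toy simulation: assume contact pressure rises linearly with closure (p = q),
# so the loop should settle where pressure equals the target.
positions = [0.0, 0.0, 0.0]
for _ in range(200):
    positions = adjust_grip(pressures=positions, positions=positions)
```

A real system would add slip detection and torque feedback on top of this, but the “feel, then re-adjust” structure is the same.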

Human-robot interaction

Huang tests a control system that enables a robot to mimic human movements. (Credit: NVIDIA)

Another interesting demo was NVIDIA’s “Proprioception Robot,” which is the work of Dr. Madeline Gannon, a multidisciplinary designer nicknamed the “Robot Whisperer” who is inventing better ways to communicate with robots. Using a two-armed ABB YuMi and a Microsoft Kinect on the floor underneath the robot, the system would mimic the movements of the human in front of it.

“With YuMi, you don’t need a roboticist to program a robot. Using NVIDIA’s motion generation algorithms, we can have engaging experiences with lifelike robots.”

You might have heard of Gannon’s recent work at the World Economic Forum in September 2018. She installed 10 industrial robot arms in a row, linking them through a single central controller. Depth sensors at the bases of the robots let them track and respond to the movements of people passing by.
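A minimal version of the tracking-and-mimicry loop described above might take one tracked wrist position per depth-camera frame, low-pass filter it to smooth skeleton-tracking jitter, and clamp the result into the arm’s reachable workspace. Everything below (class name, frame limits, smoothing factor) is hypothetical and for illustration only.

```python
import numpy as np

class MimicController:
    """Map a tracked human wrist position to a robot end-effector target."""

    def __init__(self, workspace_min, workspace_max, alpha=0.2):
        self.lo = np.asarray(workspace_min, dtype=float)
        self.hi = np.asarray(workspace_max, dtype=float)
        self.alpha = alpha   # smoothing factor in (0, 1]; smaller = smoother
        self.target = None

    def update(self, wrist_xyz):
        wrist_xyz = np.asarray(wrist_xyz, dtype=float)
        if self.target is None:
            self.target = wrist_xyz.copy()
        else:
            # Exponential moving average smooths out tracking jitter.
            self.target = (1 - self.alpha) * self.target + self.alpha * wrist_xyz
        # Keep the commanded pose inside the arm's reachable workspace.
        return np.clip(self.target, self.lo, self.hi)

ctrl = MimicController(workspace_min=[0.0, 0.0, 0.0],
                       workspace_max=[0.8, 0.8, 0.8])
first = ctrl.update([0.5, 0.5, 0.5])
second = ctrl.update([2.0, 0.5, 0.5])   # a jittery, out-of-reach measurement
```

The second measurement is both smoothed (pulled toward the previous target) and clamped, so a tracking glitch never yanks the arm outside its workspace.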

“There are so many interesting things that we could spin off in our pursuit of a general AI robot,” said Huang. “For example, it’s very likely that in the near future you’ll have ‘exo-vehicles’ around you, whether it’s an exoskeleton or an exo-something that helps people who are disabled, or helps us be stronger than we are.”

About The Author

Steve Crowe

Steve Crowe is Executive Editor, Robotics, WTWH Media, and chair of the Robotics Summit & Expo and RoboBusiness. He is also co-host of The Robot Report Podcast, the top-rated podcast for the robotics industry. He joined WTWH Media in January 2018 after spending four-plus years as Managing Editor of Robotics Trends Media. He can be reached at scrowe@wtwhmedia.com.
