The Robot Report

VirtualHome simulator could teach robots about household tasks

By Steve Crowe | May 30, 2018

Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and the University of Toronto are developing a 3D simulator that could eventually teach robots how to complete household tasks like making coffee or setting the table.

“VirtualHome” is a system that simulates detailed household tasks and has artificial “agents” execute them. Using crowdsourcing, the team tested VirtualHome on more than 3,000 programs covering a range of activities, each broken down into subtasks the computer can understand.

[Paper: VirtualHome: Simulating Household Activities via Programs]

Unlike humans, robots need more explicit instructions to complete tasks; they cannot simply infer and reason their way through the missing steps. A simple task like making coffee, for example, would include steps like “open the cabinet” and “grab a cup.” Similarly, a task like “watch TV” might include steps like “walk to the TV,” “switch on the TV,” “walk to the sofa” and “sit on the sofa.”
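
To make the idea concrete, here is a minimal sketch of a chore written out as an explicit step list, in the spirit of the programs described above. The step names and the Step class are illustrative assumptions, not VirtualHome’s actual program format.

```python
# Illustrative sketch only: a high-level chore expressed as an ordered list of
# atomic steps an agent could execute one by one. Step names and the Step class
# are hypothetical, not VirtualHome's real program syntax.
from dataclasses import dataclass


@dataclass
class Step:
    action: str   # e.g. "walk", "switch_on", "sit"
    target: str   # the object the action applies to


# "Watch TV" spelled out the way a robot would need it:
watch_tv = [
    Step("walk", "television"),
    Step("switch_on", "television"),
    Step("walk", "sofa"),
    Step("sit", "sofa"),
]

for step in watch_tv:
    print(f"{step.action} -> {step.target}")
```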

VirtualHome creates a database of chores using natural language

To create detailed descriptions for each VirtualHome task, the researchers collected verbal descriptions of household activities and translated them into simple code. They then fed these programs to the VirtualHome 3D simulator, where a virtual agent executed them and turned them into videos.
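
The pipeline, roughly, is description in, program out, then simulated execution. The sketch below shows that flow with a toy translate() lookup and a stand-in Simulator class; both are hypothetical placeholders for illustration, not the project’s real API.

```python
# Hedged sketch of the description -> program -> simulated-execution pipeline.
# translate() and Simulator are illustrative stand-ins, not the real system.

def translate(description: str) -> list[tuple[str, str]]:
    """Toy lookup from a verbal description to an (action, object) program."""
    known_programs = {
        "make coffee": [
            ("open", "cabinet"),
            ("grab", "cup"),
            ("switch_on", "coffee_maker"),
            ("pour", "coffee"),
        ],
    }
    return known_programs.get(description.lower(), [])


class Simulator:
    """Stand-in for a 3D simulator that steps a virtual agent through a program."""

    def execute(self, program: list[tuple[str, str]]) -> list[str]:
        # In the real system each step would be animated; here we just log it.
        return [f"agent performs '{action}' on '{obj}'" for action, obj in program]


for frame in Simulator().execute(translate("make coffee")):
    print(frame)
```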

“Describing actions as computer programs has the advantage of providing clear and unambiguous descriptions of all the steps needed to complete a task,” said PhD student Xavier Puig, who was lead author on the paper. “These programs can instruct a robot or a virtual character, and can also be used as a representation for complex tasks with simpler actions.”


The researchers said this opens up the possibility of one day teaching robots to do such tasks. For now, however, the project has essentially created a large database of household tasks described in natural language. Amazon, which is developing a consumer robot for homes, could potentially use data like this to train its models to do more complex tasks.

Training robots by watching YouTube videos

Of the 3,000 programs collected, the team’s AI agent can already execute 1,000 distinct sets of actions across eight different scenes, including a living room, kitchen, dining room, bedroom, and home office.

“This line of work could facilitate true robotic personal assistants in the future,” said Qiao Wang, a research assistant in arts, media, and engineering at Arizona State University who was not involved in the research. “Instead of each task programmed by the manufacturer, the robot can learn tasks just by listening to or watching the specific person it accompanies. This allows the robot to do tasks in a personalized way, or even some day invoke an emotional connection as a result of this personalized learning process.”

In the future, the team hopes to train the robots using actual videos instead of Sims-style simulation videos, which would enable a robot to learn simply by watching a YouTube video. The team is also working on implementing a reward-learning system in which the agent gets positive feedback when it does tasks correctly.
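
As a rough illustration of that reward-learning idea, the sketch below gives an agent +1 when its chosen step matches the next step of a reference program and -1 otherwise. The environment, reference program, and scoring are assumptions made for illustration, not the team’s actual implementation.

```python
# Illustrative reward-learning loop: +1 when the agent's step matches the next
# step of a reference program, -1 otherwise. A random policy stands in for a
# trained one; everything here is an assumption for illustration.
import random

reference = [
    ("walk", "television"),
    ("switch_on", "television"),
    ("walk", "sofa"),
    ("sit", "sofa"),
]
candidate_steps = reference + [("open", "fridge")]  # one distractor action

total_reward = 0
for expected in reference:
    chosen = random.choice(candidate_steps)      # a learned policy would choose here
    total_reward += 1 if chosen == expected else -1

print("episode reward:", total_reward)
```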

“You can imagine a setting where robots are assisting with chores at home and can eventually anticipate personalized wants and needs, or impending action,” said Puig. “This could be especially helpful as an assistive technology for the elderly, or those who may have limited mobility.”


About The Author

Steve Crowe

Steve Crowe is Editorial Director, Robotics, WTWH Media, and co-chair of the Robotics Summit & Expo. He joined WTWH Media in January 2018 after spending four-plus years as Managing Editor of Robotics Trends Media. He can be reached at [email protected]
