The Robot Report

  • Home
  • News
  • Technologies
    • Batteries / Power Supplies
    • Cameras / Imaging / Vision
    • Controllers
    • End Effectors
    • Microprocessors / SoCs
    • Motion Control
    • Sensors
    • Soft Robotics
    • Software / Simulation
  • Development
    • Artificial Intelligence
    • Human Robot Interaction / Haptics
    • Mobility / Navigation
    • Research
  • Robots
    • AGVs
    • AMRs
    • Consumer
    • Collaborative Robots
    • Drones
    • Exoskeletons
    • Industrial
    • Self-Driving Vehicles
    • Unmanned Maritime Systems
  • Markets
    • Agriculture
    • Healthcare
    • Logistics
    • Manufacturing
    • Mining
    • Security
  • Financial
    • Investments
    • Mergers & Acquisitions
    • Earnings
  • Resources
    • Careers
    • COVID-19
    • Digital Issues
    • Publications
      • Collaborative Robotics Trends
      • Robotics Business Review
    • RBR50 Winners 2022
    • Search Robotics Database
    • Videos
    • Webinars / Digital Events
  • Events
    • RoboBusiness
    • Robotics Summit & Expo
    • Healthcare Robotics Engineering Forum
    • DeviceTalks
    • R&D 100
    • Robotics Weeks
  • Podcast
    • Episodes
    • Leave a voicemail

How Boston Dynamics’ robots learned to dance

By Steve Crowe | January 13, 2021


By now you’ve likely seen Boston Dynamics’ latest viral video in which its Atlas, Handle, and Spot robots dance to The Contours’ “Do You Love Me?” If you haven’t, or if you’d like to see it again, you can watch it here.

One of the more common questions I received after sharing the video was: how did the robots become such great dancers? Well, we now have more information about Atlas and Spot.

Adam Savage’s Tested released a video that dives into the RBR50 company’s Choreographer software. You can watch the video above. As a Boston Dynamics engineer explains in the video, Choreographer works much like video editing or animation software: Spot’s pre-programmed movements are dragged onto a timeline and then tweaked.

The video described some of the movements Spot can perform, including body, step, dynamic transition, and kneeling motions. Within each of those categories, there are sub-categories of movements that can be dragged onto the timeline.

One example they go over in the video is the “Running Man” move, which we’ve seen Spot do several times now. “Each move is one step,” said Boston Dynamics. “You adjust the variable for that one step, copy and paste it, drag it into your timeline, and then combine it with other things.”
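The workflow described above — pre-programmed moves with adjustable variables, copied and arranged on a timeline — can be sketched as a simple data model. This is a hypothetical illustration of the concept, not Boston Dynamics’ actual Choreographer software or the Spot SDK; all names and parameters here are invented.

```python
from dataclasses import dataclass, field, replace

@dataclass(frozen=True)
class Move:
    """One pre-programmed move with tweakable parameters (hypothetical)."""
    name: str
    category: str          # e.g. "body", "step", "dynamic transition", "kneel"
    duration_s: float
    params: dict = field(default_factory=dict)

@dataclass
class Timeline:
    """An ordered sequence of moves, like clips in a video editor."""
    moves: list = field(default_factory=list)

    def add(self, move, times=1):
        # "Copy and paste": append repeated copies of the same step.
        self.moves.extend([move] * times)
        return self

    def total_duration(self):
        return sum(m.duration_s for m in self.moves)

# Four identical "Running Man" steps, then one copy with a tweaked variable.
step = Move("running_man", "step", duration_s=0.5, params={"hip_swing_deg": 25})
routine = Timeline().add(step, times=4)
routine.add(replace(step, params={"hip_swing_deg": 35}))
print(routine.total_duration())  # 2.5
```

Because each `Move` is frozen, “adjusting the variable for that one step” means stamping out a modified copy with `replace()`, which mirrors the copy-tweak-paste loop the engineer describes.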

As for Atlas, learning to dance is even more complex. IEEE Spectrum has a great interview with Aaron Saunders, Boston Dynamics’ VP of Engineering, about the challenges and how human dancers were needed. From IEEE:

“We started by working with dancers and a choreographer to create an initial concept for the dance by composing and assembling a routine. One of the challenges, and probably the core challenge for Atlas in particular, was adjusting human dance moves so that they could be performed on the robot. To do that, we used simulation to rapidly iterate through movement concepts while soliciting feedback from the choreographer to reach behaviors that Atlas had the strength and speed to execute. It was very iterative — they would literally dance out what they wanted us to do, and the engineers would look at the screen and go ‘that would be easy’ or ‘that would be hard’ or ‘that scares me.’ And then we’d have a discussion, try different things in simulation, and make adjustments to find a compatible set of moves that we could execute on Atlas.

“Throughout the project, the time frame for creating those new dance moves got shorter and shorter as we built tools, and as an example, eventually we were able to use that toolchain to create one of Atlas’ ballet moves in just one day, the day before we filmed, and it worked. So it’s not hand-scripted or hand-coded, it’s about having a pipeline that lets you take a diverse set of motions, that you can describe through a variety of different inputs, and push them through and onto the robot.”

Saunders also described how Atlas’ dance moves are controlled: “Atlas’ current dance performance uses a mixture of what we like to call reflexive control, which is a combination of reacting to forces, online and offline trajectory optimization, and model predictive control. We leverage these techniques because they’re a reliable way of unlocking really high performance stuff, and we understand how to wield these tools really well. We haven’t found the end of the road in terms of what we can do with them.”
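Of the techniques Saunders names, model predictive control is easy to illustrate in miniature: at every timestep the controller optimizes a short-horizon trajectory, applies only the first action, then re-plans. The toy below steers a 1-D double integrator toward the origin by brute-force search over candidate acceleration sequences. It is a minimal sketch of the general MPC idea, assuming a point-mass model and a quadratic cost — nothing here reflects Atlas’ actual controller.

```python
import itertools

DT = 0.1                      # control timestep (s)
HORIZON = 5                   # lookahead steps
ACTIONS = (-1.0, 0.0, 1.0)    # candidate accelerations (m/s^2)

def rollout_cost(pos, vel, accels):
    """Roll a double-integrator model forward and accumulate a quadratic cost."""
    cost = 0.0
    for a in accels:
        vel += a * DT
        pos += vel * DT
        cost += pos**2 + 0.1 * vel**2 + 0.01 * a**2
    return cost

def mpc_step(pos, vel):
    """Pick the first action of the lowest-cost sequence over the horizon."""
    best = min(itertools.product(ACTIONS, repeat=HORIZON),
               key=lambda seq: rollout_cost(pos, vel, seq))
    return best[0]

# Receding-horizon loop: re-plan at every step, apply only the first action.
pos, vel = 1.0, 0.0
for _ in range(60):
    a = mpc_step(pos, vel)
    vel += a * DT
    pos += vel * DT
print(round(pos, 3), round(vel, 3))
```

The constant re-planning is what makes this family of controllers “a combination of reacting to forces” and optimization: any disturbance simply changes the state the next plan starts from.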

Boston Dynamics robots dancing

Arizona State University’s Heni Ben Amor, an assistant professor in the Ira A. Fulton Schools of Engineering, specializes in machine learning and human-robot interaction. He said the technology involved with Atlas dancing isn’t new. But the complexity, in this case, is unparalleled.

“In combination with the artistry of the choreography, this results in an impressive showcase of robot capabilities,” he said.

Ben Amor said mimicking the dancers’ choreography has an element of puppetry to it, but maintaining physical stability is a major part of the equation. “The robot has to think about how to actuate the motors so it can generate its own actions in space.”

“They aren’t making long-term autonomous decisions. They are making short-term decisions that enable them to reproduce what they’ve been shown in the space in which they are acting.”

“Even without any learning or AI, it is still difficult to overstate the quality of this achievement,” Ben Amor said. “A humanoid robot of this complexity, being controlled with this level of fluidity and grace, is unparalleled.”

Ben Amor’s research focuses on machine learning methods that enable physical human-robot interaction. “I envision that the next stage will be developing stunt robots capable of directly interacting with humans.”

“Spider-Man jumping in real life will bring robotics closer to humans,” Ben Amor said. “The day will come when the annual Boston Dynamics release will feature partnering with human dancers and performing graceful ballet lifts.”

About The Author

Steve Crowe

Steve Crowe is Editorial Director, Robotics, WTWH Media, and co-chair of the Robotics Summit & Expo. He joined WTWH Media in January 2018 after spending four-plus years as Managing Editor of Robotics Trends Media. He can be reached at [email protected]

Comments

  1. Pedro says

    January 14, 2021 at 3:24 pm

This is only kinematic algorithms combined with balance-system software

    Reply
  2. Jaime Oh says

    January 14, 2021 at 10:48 pm

I don’t understand how they jump. I know it’s obviously very complicated, but how is it possible that they jump in the air?

    Reply
  3. Sree says

    January 22, 2021 at 10:50 am

    Awsm👍👍

    Reply
  4. Jerry says

    March 23, 2021 at 1:42 pm

    Wow!

    Reply
  5. Teo says

    June 1, 2021 at 3:31 pm

    Wow

    Reply
  6. Paul C Tousley says

    October 31, 2021 at 1:27 am

Makes me wonder how far away they are from the robots being able to take in what they are seeing and convert that to movement. Cutting the puppeteer’s strings.

    Reply



Copyright © 2022 WTWH Media LLC. All Rights Reserved. The material on this site may not be reproduced, distributed, transmitted, cached or otherwise used, except with the prior written permission of WTWH Media
Privacy Policy | Advertising | About Us
