By now you’ve likely seen Boston Dynamics’ latest viral video in which its Atlas, Handle, and Spot robots dance to The Contours’ “Do You Love Me?” If you haven’t, or if you want to watch it again, watch it here.
One of the more common questions I received after sharing the video was: how did the robots become such great dancers? Well, we now have more information about Atlas and Spot.
Adam Savage and Tested released a video that dives into the RBR50 company's Choreographer software; you can watch it above. As a Boston Dynamics engineer explains in the video, Choreographer works much like video editing or animation software: you drag Spot's pre-programmed movements onto a timeline and tweak them there.
The video described some of the movements Spot can perform, including body, step, dynamic transition, and kneeling motions. Within each of those categories are sub-categories of movements that can be dragged onto the timeline.
One example they go over in the video is the “Running Man” move, which we’ve seen Spot do several times now. “Each move is one step,” said Boston Dynamics. “You adjust the variable for that one step, copy and paste it, drag it into your timeline, and then combine it with other things.”
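To make that workflow concrete, here is a minimal sketch of a timeline of parameterized moves. This is not Boston Dynamics' actual Choreographer API: the `Move` and `Timeline` classes, the slice-based timing, and the parameter names are all hypothetical stand-ins for the drag, tweak, copy-and-paste process described above.

```python
# Hypothetical sketch of a Choreographer-style timeline, NOT the real API.
from dataclasses import dataclass, field, replace
from typing import Dict, List

@dataclass(frozen=True)
class Move:
    name: str                    # e.g. "running_man" (hypothetical ID)
    category: str                # body, step, dynamic transition, kneel
    start_slice: int             # position on the timeline, in beat slices
    duration_slices: int         # how many slices the move occupies
    params: Dict[str, float] = field(default_factory=dict)

@dataclass
class Timeline:
    bpm: float
    moves: List[Move] = field(default_factory=list)

    def add(self, move: Move) -> None:
        self.moves.append(move)

    def copy_paste(self, move: Move, new_start: int) -> Move:
        """Duplicate one step and drag the copy to a new slot."""
        pasted = replace(move, start_slice=new_start)
        self.add(pasted)
        return pasted

# One "Running Man" step with its variables tweaked...
dance = Timeline(bpm=120.0)
step = Move("running_man", "step", start_slice=0, duration_slices=4,
            params={"speed": 0.8, "pre_move_cycles": 2.0})
dance.add(step)
# ...then copied, pasted, and combined with a body motion.
dance.copy_paste(step, new_start=4)
dance.add(Move("body_sway", "body", start_slice=0, duration_slices=8,
               params={"roll_amplitude": 0.1}))
```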
As for Atlas, learning to dance is even more complex. IEEE Spectrum has a great interview with Aaron Saunders, Boston Dynamics’ VP of Engineering, about the challenges and how human dancers were needed. From IEEE:
“We started by working with dancers and a choreographer to create an initial concept for the dance by composing and assembling a routine. One of the challenges, and probably the core challenge for Atlas in particular, was adjusting human dance moves so that they could be performed on the robot. To do that, we used simulation to rapidly iterate through movement concepts while soliciting feedback from the choreographer to reach behaviors that Atlas had the strength and speed to execute. It was very iterative — they would literally dance out what they wanted us to do, and the engineers would look at the screen and go ‘that would be easy’ or ‘that would be hard’ or ‘that scares me.’ And then we’d have a discussion, try different things in simulation, and make adjustments to find a compatible set of moves that we could execute on Atlas.
“Throughout the project, the time frame for creating those new dance moves got shorter and shorter as we built tools, and as an example, eventually we were able to use that toolchain to create one of Atlas’ ballet moves in just one day, the day before we filmed, and it worked. So it’s not hand-scripted or hand-coded, it’s about having a pipeline that lets you take a diverse set of motions, that you can describe through a variety of different inputs, and push them through and onto the robot.”
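The workflow Saunders describes, retargeting a human move, testing it in simulation, and adjusting until the robot can execute it, suggests a loop like the following sketch. The `retarget()` and `simulate()` functions, the limit constants, and the amplitude-scaling strategy are all assumptions for illustration; the real toolchain is not public.

```python
# Hedged sketch of the iterate-in-simulation loop described above.
TORQUE_LIMIT = 200.0  # N·m, assumed actuator strength limit
SPEED_LIMIT = 9.0     # rad/s, assumed joint-speed limit

def retarget(human_motion: dict, amplitude_scale: float) -> dict:
    """Map a human dance move onto the robot, optionally scaled down."""
    return {k: v * amplitude_scale for k, v in human_motion.items()}

def simulate(candidate: dict) -> dict:
    """Stand-in physics check; a real pipeline would run a full simulator."""
    return {"peak_torque": candidate["torque_demand"],
            "peak_joint_speed": candidate["speed_demand"],
            "fell": candidate["torque_demand"] > 250.0}

def find_feasible_variant(human_motion: dict) -> dict:
    """Scale the move down until simulation says the robot can execute it."""
    for scale in (1.0, 0.9, 0.8, 0.7):
        candidate = retarget(human_motion, scale)
        result = simulate(candidate)
        if (result["peak_torque"] <= TORQUE_LIMIT
                and result["peak_joint_speed"] <= SPEED_LIMIT
                and not result["fell"]):
            return candidate  # hand this version back to the choreographer
    raise ValueError("no feasible variant; rework the move with the dancers")

# A move that is too demanding at full human amplitude passes at 80%.
move = find_feasible_variant({"torque_demand": 240.0, "speed_demand": 10.0})
print(move)
```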
Saunders also described how Atlas’ dance moves are controlled: “Atlas’ current dance performance uses a mixture of what we like to call reflexive control, which is a combination of reacting to forces, online and offline trajectory optimization, and model predictive control. We leverage these techniques because they’re a reliable way of unlocking really high performance stuff, and we understand how to wield these tools really well. We haven’t found the end of the road in terms of what we can do with them.”
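Of those techniques, model predictive control is the easiest to illustrate compactly. The sketch below runs unconstrained MPC on a linearized inverted pendulum, a textbook stand-in for balance control: at every tick it re-solves a finite-horizon optimization and applies only the first control. Atlas' actual controllers are far richer; the dynamics, cost weights, and horizon here are assumed purely for illustration.

```python
# Minimal MPC sketch: balance a linearized inverted pendulum.
import numpy as np

dt = 0.01          # control period (s), assumed
g, L = 9.81, 1.0   # gravity, pendulum length (assumed)

# Linearized dynamics: x = [angle, angular velocity], u = base acceleration.
A = np.array([[1.0, dt], [g / L * dt, 1.0]])
B = np.array([[0.0], [-dt / L]])
Q = np.diag([100.0, 1.0])   # penalize tilt heavily
R = np.array([[0.1]])       # penalize effort lightly

def mpc_step(x: np.ndarray, horizon: int = 50) -> float:
    """Backward Riccati pass over the horizon; return the first control."""
    P = Q.copy()
    for _ in range(horizon):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return -(K @ x).item()

# Simulate recovery from a 0.1 rad push.
x = np.array([0.1, 0.0])
for _ in range(300):
    u = mpc_step(x)
    x = A @ x + (B * u).ravel()
print("final tilt (rad):", round(x[0], 4))
```

Re-solving at every tick is what makes the scheme "predictive": the controller constantly plans a short trajectory into the future but commits only to its first step.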
Arizona State University’s Heni Ben Amor, an assistant professor in the Ira A. Fulton Schools of Engineering, specializes in machine learning and human-robot interaction. He said the technology behind Atlas’ dancing isn’t new, but its complexity in this case is unparalleled.
“In combination with the artistry of the choreography, this results in an impressive showcase of robot capabilities,” he said.
Ben Amor said mimicking the dancers’ choreography has an element of puppetry to it, but maintaining physical stability is a major part of the equation. “The robot has to think about how to actuate the motors so it can generate its own actions in space.”
“They aren’t making long-term autonomous decisions. They are making short-term decisions that enable them to reproduce what they’ve been shown in the space in which they are acting.”
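The simplest version of the stability constraint Ben Amor describes is a static check: the robot's center of mass must project inside the polygon formed by the feet in contact with the ground. The sketch below implements that test with an assumed foot layout; real balance control is dynamic (zero-moment point, momentum, trajectory optimization), so this is only the idea in miniature.

```python
# Toy static-stability check: does the CoM project inside the support polygon?
from typing import List, Tuple

Point = Tuple[float, float]

def com_inside_support(com_xy: Point, polygon: List[Point]) -> bool:
    """Ray-casting point-in-polygon test for the CoM ground projection."""
    x, y = com_xy
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):                       # edge crosses the ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# A quadruped standing on four feet (positions in meters, assumed layout).
feet = [(0.3, 0.2), (0.3, -0.2), (-0.3, -0.2), (-0.3, 0.2)]
print(com_inside_support((0.0, 0.0), feet))   # True: balanced
print(com_inside_support((0.5, 0.0), feet))   # False: would tip forward
```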
“Even without any learning or AI, it is still difficult to overstate the quality of this achievement,” Ben Amor said. “A humanoid robot of this complexity, being controlled with this level of fluidity and grace, is unparalleled.”
Ben Amor’s research focuses on machine learning methods that enable physical human-robot interaction. “I envision that the next stage will be developing stunt robots capable of directly interacting with humans.”
“Spiderman jumping in real life will bring robotics closer to humans,” Ben Amor said. “The day will come when the annual Boston Dynamics release will feature partnering with human dancers and performing graceful ballet lifts.”
Pedro says
This is just kinematic algorithms combined with balance-control software.
Jaime Oh says
I don’t understand how they jump. I know it’s obviously very complicated, but how is it possible that they jump in the air?
Sree says
Awesome 👍👍
Jerry says
Wow!
Teo says
Wow
Paul C Tousley says
Makes me wonder how far they are from robots being able to take in what they’re seeing and convert that into movement. Cutting the puppeteer’s strings.