The DARPA Robotics Challenge Finals brought together some of the most sophisticated robots in the world. The challenge was to see how semi-autonomous robots would perform in a simulated disaster environment.
However, only three of the 23 teams managed to complete all eight tasks: driving and exiting a vehicle, opening and going through a door, locating and opening a valve, using a tool to cut a hole in a wall, removing an electrical plug from a socket and putting it in a different socket, traversing rubble, and climbing stairs.
So why all the struggles? DARPA deliberately degraded communications (low bandwidth, high latency, intermittent connection) during the challenge to see how a human-robot team could collaborate in a Fukushima-type disaster. And there was no standard for how a human-robot interface should work, so some interfaces worked better (much better) than others.
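To get a feel for why the degraded link mattered, here is a toy model of such a channel: limited bytes per second, added latency, and random dropouts. All numbers and names here are illustrative, not the actual DRC network parameters.

```python
import random

# Toy model of a degraded command link: limited bandwidth, fixed
# one-way latency, and a chance that any message is simply lost.
# These values are made up for illustration.
BANDWIDTH_BPS = 1000      # bytes per second available
LATENCY_S = 1.0           # one-way delay in seconds
DROP_PROB = 0.3           # chance a message is dropped entirely

def transmit(message, rng):
    """Return (arrival_delay_seconds, message), or None if dropped."""
    if rng.random() < DROP_PROB:
        return None                       # intermittent connection
    delay = LATENCY_S + len(message) / BANDWIDTH_BPS
    return delay, message

rng = random.Random(42)
# A terse high-level command fares far better than streaming raw sensor
# data: a 1 MB camera frame would add ~1000 s of serialization delay
# on this link, which is why teams favored compact state summaries.
print(transmit(b"open_valve", rng))
```

Under constraints like these, sending the robot short, high-level commands and letting it fill in the details locally beats teleoperating every joint.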
Here’s a look at how some of the teams handled the human-robot collaboration and level of autonomy:
The winning DRC-Hubo robot ran custom software, designed by Team KAIST to perform in a low-bandwidth environment, along with the Xenomai real-time operating system for Linux and a customized motion control framework.
The second-place finisher, Team IHMC, used a sliding scale of autonomy that allowed a human operator to take control when the robot seemed stumped or if the robot knew it would run into problems.
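A sliding scale of autonomy like Team IHMC's can be sketched roughly as follows: the robot executes steps it is confident about on its own and escalates uncertain steps to the operator. Everything here (the task structure, the confidence numbers, the threshold) is a hypothetical illustration, not IHMC's actual software.

```python
# Illustrative sketch of "sliding autonomy": act autonomously while
# confident, hand control to the human when confidence drops.
CONFIDENCE_THRESHOLD = 0.6  # made-up cutoff for illustration

def run_task(steps, operator_command):
    """Execute a task, deferring to the operator on low-confidence steps.

    steps: list of (action, confidence) pairs from a hypothetical planner.
    operator_command: callback that lets a human handle one action.
    """
    log = []
    for action, confidence in steps:
        if confidence >= CONFIDENCE_THRESHOLD:
            log.append(("auto", action))                     # robot handles it
        else:
            log.append(("human", operator_command(action)))  # escalate
    return log

# Example: a valve task where the grasp step is uncertain, so the
# operator teleoperates just that one step.
steps = [("approach_valve", 0.9),
         ("grasp_handle", 0.4),
         ("turn_valve", 0.8)]
log = run_task(steps, operator_command=lambda a: f"teleop:{a}")
```

The appeal of this design is that scarce operator attention (and scarce bandwidth) is spent only on the steps the robot cannot handle itself.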

Carnegie Mellon’s Tartan Rescue Team, which finished third, used a similar method to Team IHMC, according to team leader Tony Stentz. “The real advancement here is the robots and the humans working together to do something,” Stentz tells MIT Technology Review. “The robot does what the robot’s good at, and the human does what the human is good at.”
Team Nimbro, which finished fourth, went with more direct control for its Momaro robot: nine people controlled the robot across the different tasks. One human operator even used an Oculus Rift virtual reality headset and gesture tracking to control Momaro.
Team MIT, which finished seventh, designed its Helios robot to be highly autonomous. The team’s human operators could point to a lever, but Helios would execute its own game plan from there. If an issue arose, or if a human operator wanted to take over, that was certainly possible.
Many of the robots also struggled to grasp objects and use them properly, which MIT Technology Review attributes to deficiencies in machine vision and manipulation: “Robot sensors struggle to see shapes accurately in the kind of variable lighting found outside, and robot hands or grippers lack the delicate, compliant touch of human digits.”
“I think this is an opportunity for everybody to see how hard robotics really is,” says Marc Raibert, founder of Boston Dynamics.
DARPA program manager Dr. Gill Pratt was blunter still: “These things are incredibly dumb. They’re mostly just puppets.”