Our recent robotics trends webinar included three great speakers. Here’s the presentation from Ilia Baranov, a Senior Electrical Designer at Clearpath Robotics in Canada. He’s active within the open source robotics community and he is also the creator of the popular ROS 101 blog series. To watch the webinar on demand, simply visit: http://www.designworldonline.com/innovative-trends-in-robotics/
Today I’d like to talk about some trends in R&D robotics as we at Clearpath Robotics have seen them.
A little bit of background about Clearpath: we started about six years ago as four mechatronics engineering students from the University of Waterloo. After graduation, they were looking for something to do and decided to build mine-sweeping robots. After a little while, they realized that, unfortunately, nobody was willing to buy mine-sweeping robots from four students in a garage, so they pivoted to research and industrial products, and that has gone very well for us.
Clearpath Robotics currently employs over 100 people, both at our Kitchener facility, just outside of Toronto in Canada, and at a branch office in San Francisco. We provide hardware and software integration for robotics, both for research and for applications. Our motto is to automate the world’s dullest, dirtiest, and most dangerous jobs, the things we don’t really want humans to do.
We predominantly use ROS, the Robot Operating System. ROS lets the on-board computer, the different sensors, and our hardware platform communicate with each other in a very agnostic way. When a researcher receives our platform, they open the box, power on the robot, drive it out, and can immediately start doing their research.
On top of that, ROS lets many people contribute code back to the community. We ourselves have written over 200 open source software packages, but the community as a whole has written over 10,000. That allows us, and our customers, to use sensors without really knowing their inner workings; we can just plug them in and use them.
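To make that "plug it in and use it" idea concrete, here is a minimal sketch (mine, not from the talk) of a ROS node written with rospy. It reads whatever laser scanner happens to be publishing sensor_msgs/LaserScan messages and publishes velocity commands in return, without knowing anything about the scanner's make or model. The topic names /scan and /cmd_vel are common ROS conventions, but they vary from robot to robot, so treat them as assumptions.

```python
#!/usr/bin/env python
# Minimal sketch: stop the robot when the nearest laser return is close,
# otherwise creep forward. Topic names are assumptions, not universal.
import rospy
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

def scan_callback(scan):
    # Keep only returns inside the scanner's valid range.
    valid = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
    if not valid:
        return
    cmd = Twist()
    # Stop under half a metre, otherwise drive slowly forward.
    cmd.linear.x = 0.0 if min(valid) < 0.5 else 0.2
    cmd_pub.publish(cmd)

rospy.init_node('simple_avoider')
cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
rospy.Subscriber('/scan', LaserScan, scan_callback)
rospy.spin()
```

The same node works whether the scan comes from a SICK, a Hokuyo, or a simulated scanner in Gazebo, which is exactly the hardware-agnostic behaviour described above.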
One prime example of this is the Husky robot, which you can see on the right; that specific robot was built for the Canadian Space Agency. The agency was looking to develop a Mars rover analogue. Instead of spending several million dollars and several years developing a rover from scratch, they bought one of our units, mounted their sensors on it, and could immediately start doing their research at a fraction of the cost.
Where does this lead us? As an engineer, I take a somewhat simplistic view of the world: for pretty much any engineering solution out there, you have cost, flexibility, and performance, and you can pick any two of them.
When Clearpath Robotics started out, we were really focused on students, and students want low-cost, highly flexible robotics at the price of performance. For example, our TurtleBot is about a $2,000 development platform. It’s pretty slow and doesn’t have a very powerful processor, but it lets a student do many things.
Second, we expanded into the research market. Researchers really want high-performance, very flexible systems, and they’re not as cost sensitive, so they can spend a bit more money to get something that lasts five or ten years and really lets them dig into their research.
Lastly, Clearpath Robotics has been expanding into the industrial market. The industrial market is an interesting place because the performance demands are very high: very fast, very precise, very high uptime, very reliable, but customers want it at a low cost. Inevitably, the trade-off has been a very precise machine that only does one thing. Your automotive steel welder just welds steel; it doesn’t make pancakes.
That trade-off has always been the way, so what Clearpath Robotics is trying to do is add just a little more flexibility. As I’ll show in these research and industrial trends, most people are now starting to trade a little performance for a little more flexibility and getting a better overall product.
Now one thing to note: in the center there, you’d love to have a product that is cheap, very flexible, and very high performance, but in reality that usually doesn’t happen unless you wait several decades. A good example of this is laptops. Ten years ago, the laptop I’m using now would have been more expensive, slower, and less flexible. Researchers right now have a choice: they can wait a decade, or they can work with a company like us and get a solution targeted to exactly what they want to do right away.
Some of the specific trends we see: in hardware, a lot of robots are moving away from simple line following and simple bump-and-turn path finding, like a robotic vacuum cleaner. We’re starting to see very simple, cheap USB cameras being used for very complex vision-based navigation, lidar being used for mapping, and depth sensors.
Another really interesting one is depth sensors. When Microsoft put out the Kinect, they assumed people would buy it as a gaming peripheral, but in reality it has been a huge boon to the robotics market. Here is a roughly $400 sensor that does only one thing, producing 3D point clouds, but it does it so well and so cheaply that it beats out pretty much any $10,000-plus sensor out there. Because of that, Microsoft, Apple, Intel, and almost any large semiconductor player you can name have all started to put lots and lots of money into depth sensors.
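For a sense of how a depth sensor shows up on the software side, here is a small sketch (again mine, not from the talk) that listens to the 3D point cloud such a sensor publishes in ROS and reports how many valid points arrive per frame. The topic name /camera/depth/points follows the common openni/realsense driver convention and is an assumption here; your driver may use a different name.

```python
#!/usr/bin/env python
# Minimal sketch: count valid 3D points in each depth-camera frame.
# The topic name is a common convention, not guaranteed for every driver.
import rospy
from sensor_msgs.msg import PointCloud2
from sensor_msgs import point_cloud2

def cloud_callback(cloud):
    # Iterate over the x, y, z fields, skipping invalid (NaN) returns.
    points = list(point_cloud2.read_points(
        cloud, field_names=("x", "y", "z"), skip_nans=True))
    rospy.loginfo("received %d valid 3D points", len(points))

rospy.init_node('depth_listener')
rospy.Subscriber('/camera/depth/points', PointCloud2, cloud_callback)
rospy.spin()
```

Downstream nodes for obstacle avoidance or mapping consume exactly the same message type, which is why a cheap consumer depth camera can slot straight into an existing robotics stack.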
Another interesting trend is that, for the first time, computing is both fast enough and cheap enough to put into public-facing robots. A good example is the robot in the image, the Savioke robot. It goes around hotels delivering towels, and if you think about it, ten years ago it would have been incredible to see a robot navigate its own way around a hotel and find its way to a doorway, just for something as simple as delivering a towel.
Now the cost is low enough and the performance is high enough that if one of these robots gets lost or somebody decides to vandalize it, it’s not that big a loss. This is enabled almost entirely by general-purpose hardware: a depth sensor that isn’t good at just one thing but does many things, and the same goes for the computer and for the platforms themselves.
On the software side, as I mentioned, ROS has been a big thing. Having an open source framework where many robots can talk to each other, sensors can share data, and researchers can share data among themselves is what has made the open source solutions win out.
Closed-source frameworks, for example the old Microsoft Robotics Studio, weren’t very popular, because even if you develop something really novel you can’t really explain how it works; all of your sources are either closed or depend on some hidden library that nobody really understands.
In general, software is operating in much less structured spaces. Here you can see a Husky robot with a 3D Velodyne lidar going through an office space. There are no markers on the walls, no special tape, nothing. Just using its vision system, it can navigate its way around a very complex environment, whether that’s an office space or a warehouse.
Vision is a large area of research. Here’s one of our researchers using the Husky, and you can see it’s festooned with cameras facing in every direction. This researcher is trying to use camera vision alone for navigation, both underground and above ground, and he’s cross-checking his results against sensors like the GPS and the lidar on the front.
Another big one is human-robot interaction. I head up the team that maintains the PR2 robots; that’s the robot you see there. It’s about a $200,000 robot built for human interaction. In this case, for example, the man is paralyzed and the robot helps him with basic tasks. The research questions are: how do we make a robot understand the intent of a human being, and how do we make it behave properly around a person, not injuring them but still working at a reasonable speed?
That also shows manipulation. In the last frame, for example, the robot picks up a flexible towel, which seems like a very simple thing to a human but is extremely complex for a robot, so that’s very much cutting-edge research.
The last one is unstructured path planning: without telling the robot exactly how to get from point A to point B, I just expect to walk up to my PR2 and say, “Go to the kitchen and get me a coffee,” and it should be able to do that. That kind of cutting-edge research is what we’re seeing right now.
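The building block that makes this possible in ROS today is letting the robot plan its own route to a goal pose rather than scripting the path. As a hedged illustration, here is a small sketch using the standard ROS navigation stack’s move_base action interface; the “kitchen” coordinates are placeholders I made up, standing in for a pose you would have taught or looked up beforehand.

```python
#!/usr/bin/env python
# Minimal sketch: ask move_base (the standard ROS navigation stack) to plan
# its own path to a goal pose in the map frame. Coordinates are placeholders.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node('send_goal')
client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
client.wait_for_server()

goal = MoveBaseGoal()
goal.target_pose.header.frame_id = 'map'
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 3.0   # placeholder "kitchen" x, in metres
goal.target_pose.pose.position.y = 1.5   # placeholder "kitchen" y, in metres
goal.target_pose.pose.orientation.w = 1.0

client.send_goal(goal)
client.wait_for_result()
rospy.loginfo("navigation finished with state %d", client.get_state())
```

Turning the spoken request “go to the kitchen” into that goal pose, and “get me a coffee” into a manipulation plan, is the part that remains open research.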
Here’s an interesting thing about ROS. We did a search in the IEEE Xplore database, a large database of published papers, and you can see that ROS is gaining mindshare year over year at an unprecedented rate. It’s quickly becoming one of the most used pieces of software. Tools like MATLAB have historically been very popular, but again, because of their closed-source nature, they’ve started to lose a little share there.
Lastly, I want to harp once more on this idea of flexibility. When a researcher comes to us and says, “I want to do research in agriculture,” we’d use GPS and we might use stereo cameras; depth sensors don’t work well outdoors, so we can’t use those. We mix and match sensors until we find the ideal robot. Whereas if you’re working underground, you can’t use GPS, so you might want 3D vision, lidars, and so on.
This customization and flexibility is really where research robotics is going. Instead of one robot that does one specific thing, you have a robot that can do many things at once.
Overall, what are we seeing? The traditional robot on the left is the classic robot that follows a piece of tape on the floor. If somebody jumps in front of it, it just stops. It’s very simple, not very flexible, and quite expensive. That development is transitioning over to research robotics.
For example, we have a Baxter there and one of our Clearpath Ridgebacks, and the two robots communicate with each other. They collaborate so that if the Baxter needs to reach something outside its grasp, the Ridgeback moves it over; that’s the research phase. We’re hoping that all of that research will eventually feed back into manufacturing, so that instead of following tape on the floor, the robot drives to where it needs to be, picks up the tool, and does its work.