To achieve maximum throughput in robotic piece picking, it’s not enough to simply add machines to the line. Robotics designers, suppliers, integrators, and users need to identify the best combination of robot arms, sensors, and end effectors for a particular payload or task. In addition, many robots need the intelligence to choose the right gripper for each item. XYZ Robotics Inc. is an example of a company that has developed systems to address these needs.
Allston, Mass.-based XYZ’s piece-picking system uses machine vision, but it does not rely exclusively on artificial intelligence models. A combination of mechanical and machine learning approaches is necessary, said Peter Yu, chief technology officer at the startup.
“With both [approaches] and our tool changer, a robot can pick nearly anything, which is useful in logistics and manufacturing,” he told The Robot Report. “AI is important for tool selection. Changing from a large cup to a small cup or a bag cup gripper is a challenge from both the tool side and the AI side.”
XYZ Robotics’ grippers pick consumer electronics, apparel, cosmetics, and other objects for e-commerce order fulfillment. With AI guidance and the ability to change end-of-arm tooling, one robot can handle a wide variety of items with speed and accuracy.
“For example, if the SKU is a plastic bag, our system will know and choose a suction cup to pick it up,” Yu said. “But if it’s mesh or a thin pencil or screwdriver, there’s not much area, so the robot can choose a two-fingered gripper.”
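To make the idea concrete, here is a minimal sketch of that kind of tool-selection logic, with a rule layer mapping vision-predicted item properties to an end effector. The feature names, thresholds, and gripper labels are illustrative assumptions, not XYZ’s actual rules.

```python
# Hypothetical tool-selection sketch: feature names, thresholds, and
# gripper labels are illustrative, not XYZ Robotics' actual logic.
from dataclasses import dataclass

@dataclass
class ItemFeatures:
    surface: str          # e.g., "plastic_film", "mesh", "rigid"
    flat_area_cm2: float  # largest sealable flat region found by vision
    min_width_mm: float   # narrowest graspable dimension

def select_gripper(item: ItemFeatures) -> str:
    """Map vision-predicted item properties to an end effector."""
    if item.surface == "plastic_film":
        return "bag_cup"  # deformable bags need the bag cup
    if item.flat_area_cm2 >= 20.0:
        return "suction_cup_large"
    if item.flat_area_cm2 >= 4.0:
        return "suction_cup_small"
    # Mesh, thin pencils, screwdrivers: too little sealable area for suction.
    return "two_finger_gripper"

print(select_gripper(ItemFeatures("mesh", 0.5, 8.0)))  # two_finger_gripper
```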
Choosing the right grippers for piece picking
XYZ’s vision-guided tool changer can swap out end effectors in about half a second. “For vision, the time to identify the piece picking points is 0.1 sec. with VGA, and at 720p, it is 0.3 sec.,” Yu said. “In addition, the tool changer has locating pins so the robot knows the gripper is engaged in a specific orientation.”
XYZ Robotics uses a combination of standard and custom end effectors with its tool changer. “A suction cup may come from off-the-shelf vendors, but the bag cup is designed, made, and patented by us, as well as the tool changer,” said Yu.
“There are many kinds of grippers — some, like Schunk, are electric, and others are pneumatic, like SMC’s,” he added. “Usually, we use vacuum to do suction. We want to use the same source to drive everything. That’s why we have our grippers driven by vacuum, so we don’t need to add lines for electricity or compressed air.”
“We’re still working on a two-fingered soft vacuum gripper for picking items that are not graspable by suction cups, such as lipsticks, measuring spoons, and screwdrivers,” Yu said. “We want to provide a holistic piece-picking solution to our customers.”
Related content: See the February 2020 issue of The Robot Report for more on grippers and end-of-arm tooling.
Engineering reduces the need for big data
Machine learning typically needs large, clean data sets, and humans play a major role in annotating training data, Yu acknowledged.
“Everyone needs data and observation of that data,” he said. “Our overall approach is statistics-based machine learning, but we don’t throw raw data at the algorithms to train them. That’s an end-to-end approach: it’s cool, but it requires an order of magnitude more data.”
“This is where machine learning scientists come in: humans give some heuristics to the model, which then needs less data to train,” Yu said. “For example, for piece picking, the intuition is that we observe the geometric features needed to grasp. Context helps reduce complexity.”
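One way to read Yu’s point is that hand-supplied geometric features stand in for millions of raw-pixel examples: a tiny model over a few numbers needs far less training data than an end-to-end network over images. The sketch below is an assumption-laden illustration; the features and weights are invented, not XYZ’s pipeline.

```python
# Sketch: heuristic geometric features feed a small model instead of raw
# pixels. Feature choices are assumptions for illustration only.
import numpy as np

def geometric_features(depth_patch: np.ndarray) -> np.ndarray:
    """Reduce a depth patch around a grasp candidate to a few numbers."""
    flatness = float(depth_patch.std())  # flat regions seal better
    flat_fraction = float(
        (np.abs(depth_patch - depth_patch.mean()) < 0.002).mean()
    )
    height = float(depth_patch.mean())
    return np.array([flatness, flat_fraction, height])

def score_grasp(features: np.ndarray, weights: np.ndarray) -> float:
    """Tiny linear model: few parameters means far less training data."""
    return float(features @ weights)

patch = np.random.default_rng(0).normal(0.30, 0.001, size=(32, 32))
w = np.array([-50.0, 2.0, 0.1])  # in practice, learned from labeled picks
print(score_grasp(geometric_features(patch), w))
```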
How long does it take to train a piece picking robot on a novel object? “The algorithm generalizes on most novel objects,” Yu said. “But if certain objects need a specific tool or to be grasped in a certain way, we need to teach the robot, and it can learn pretty fast — one hour after inputting the data.”
“There are more ways, like self-supervised learning,” he added. “We throw in an item, and the robot tries different tools and puts the item at different locations. If we let the robot explore, it could take five minutes. Then the labels are fed into the training algorithm, and it takes an hour or so to get a new model.”
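A rough sketch of that exploration loop, under the assumption that pick success can be read from sensor feedback; `try_pick` and the tool list are stand-ins for real hardware and XYZ’s actual tooling.

```python
# Sketch of the self-supervised exploration loop Yu describes.
# try_pick() and TOOLS stand in for real hardware and training code.
import random

TOOLS = ["suction_cup_small", "suction_cup_large", "bag_cup", "two_finger_gripper"]

def try_pick(item_id: str, tool: str) -> bool:
    """Placeholder: execute a pick and report success from sensors."""
    return random.random() < 0.5  # real system: vacuum/force feedback

def explore(item_id: str, attempts_per_tool: int = 3) -> list:
    """Let the robot label an item itself: a few minutes of trials."""
    labels = []
    for tool in TOOLS:
        for _ in range(attempts_per_tool):
            labels.append((item_id, tool, try_pick(item_id, tool)))
    return labels

dataset = explore("sku-12345")
# retrain(dataset)  # then ~an hour of training yields an updated model
print(sum(ok for _, _, ok in dataset), "successful picks logged")
```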
By taking multiple approaches, robotic grasping can improve over time, said Yu. “When we started, 80% of items were graspable or suctionable by our tools,” he recalled. “It took an engineering effort to push from 90% to 99%. There was some innovative engineering going on, which means a lot when translated into accuracy, reliability, and speed.”
Solving the double-picking problem
“Accurately detecting double picking is a hard problem in industry,” Yu noted. “If two boxes with little texture are tightly packed together, it’s difficult for a vision system to see the seam in between. We take a combined approach using vision and weight sensing.”
“In the e-commerce space, a customer might otherwise get two iPhones. The rate of double picking is typically 1% of items,” explained Yu. “Using the combined verification approach, we can avoid 95% of that 1%, so the overall rate of double piece picking goes down to 0.05%.”
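The weight half of that verification might look like the following check, which flags any pick whose measured weight lands well above one unit’s expected weight; the SKU table and tolerance are assumptions for illustration. Note the arithmetic matches Yu’s figures: catching 95% of a 1% double-pick rate leaves 1% × (1 − 0.95) = 0.05%.

```python
# Sketch of the weight side of double-pick verification.
# The SKU weight table and tolerance are illustrative assumptions.
EXPECTED_WEIGHT_G = {"sku-phone": 172.0, "sku-box-small": 240.0}

def is_double_pick(sku: str, measured_g: float, rel_tol: float = 0.25) -> bool:
    """Flag a pick whose measured weight is far above one unit's weight.

    A reading near 2x the unit weight strongly suggests two items, even
    when vision cannot see the seam between tightly packed boxes.
    """
    expected = EXPECTED_WEIGHT_G[sku]
    return measured_g > expected * (1.0 + rel_tol)

print(is_double_pick("sku-phone", 344.0))  # ~2x unit weight -> True
print(is_double_pick("sku-phone", 175.0))  # within tolerance -> False
```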
Yu said that XYZ is constantly pushing its technology for more accuracy, reliability, and cost savings. “We keep pushing speed for the tool changer, and broadening the variety of items we want to pick is a big thing,” he said.
“Accuracy is important for reducing double picking, and we’re working on the motion of the robot to make it faster,” said Yu. “We’re pushing the system this year to do 1,200 picks per hour. To meet our goals, we have to incorporate everything.”
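Those targets imply a tight time budget: 1,200 picks per hour is a 3-second cycle, and the vision (0.1 to 0.3 sec.) and tool-change (about 0.5 sec.) times quoted earlier show how much of it is already spoken for. A back-of-the-envelope calculation:

```python
# Back-of-the-envelope cycle budget for 1,200 picks/hour (3.0 s per pick),
# using the vision and tool-change times quoted in the article.
CYCLE_S = 3600 / 1200   # 3.0 s per pick
vision_s = 0.3          # 720p perception (0.1 s at VGA)
tool_change_s = 0.5     # worst case: a tool swap is needed on this pick
motion_s = CYCLE_S - vision_s - tool_change_s
print(f"time left for arm motion: {motion_s:.1f} s")  # 2.2 s
```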
Remote assistance and 5G
Like other robotics companies, XYZ expects to offer remote assistance to help deal with rare cases. “Our method is machine learning-based,” Yu said. “Sometimes, if an item has an odd shape, a vision system will not work well. A robot can send a request for help to the cloud, and a human monitor can see what’s at the pick point.”
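In practice, that kind of escalation is often gated on model confidence. A minimal sketch, assuming a confidence threshold and a simple cloud-bound request queue (both invented for illustration):

```python
# Sketch of a remote-assistance fallback: escalate low-confidence picks.
# The threshold and queue "API" are assumptions for illustration.
import queue

help_requests: "queue.Queue[dict]" = queue.Queue()

def plan_pick(scene_id: str, confidence: float, threshold: float = 0.6) -> str:
    """Run the normal model; ask a remote human when confidence is low."""
    if confidence >= threshold:
        return "execute_pick"
    # Odd-shaped item: send the scene to the cloud for a human to label.
    help_requests.put({"scene": scene_id, "reason": "low_confidence"})
    return "await_human_pick_point"

print(plan_pick("scene-001", 0.92))  # execute_pick
print(plan_pick("scene-002", 0.35))  # await_human_pick_point
```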
Another aspect of remote assistance is the ability to train robots in less time, thanks to 5G. “If we know a certain item isn’t performing well, the system will collect data, retrain to improve grasping within half a day, and send the update to the robots,” said Yu. “We already bought a 5G access point in China and have started testing it. In general, speed improves tenfold, usually in urban areas. Most areas have only 4G access, where speeds may be affected a bit.”
Connecting with the rest of the warehouse
XYZ Robotics’ piece-picking and sorting system can connect with automated storage and retrieval systems (AS/RS) and place items on a shelf, on a conveyor belt, or in a box on an automated guided vehicle (AGV). Machine vision is also relevant to bar-code scanning, noted Yu.
“For warehouse applications, XYZ’s system is connected with the WCS [warehouse control system] to keep track of all the orders.”
“In terms of piece picking, the robot’s capability depends on visual data. Vision data is what is shared most: it is collected, and then the model is deployed to all the robots.”
“We do not share that much data with the AS/RS or AGVs, which are under the control of the WCS. The WCS controls all the components, including the robots.”
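For readers unfamiliar with WCS integration, the handshake typically amounts to structured task and confirmation messages. The field names below are invented for the sketch and are not XYZ’s interface:

```python
# Illustrative WCS handshake: field names are invented for this sketch.
import json

pick_task = {                    # sent by the WCS to the pick cell
    "order_id": "ORD-2048",
    "sku": "sku-12345",
    "source": "asrs_tote_17",
    "destination": "agv_box_3",
}

def report_result(task: dict, picked: bool, weight_g: float) -> str:
    """Pick cell -> WCS: confirm the task so order tracking stays in sync."""
    return json.dumps({
        "order_id": task["order_id"],
        "sku": task["sku"],
        "status": "picked" if picked else "exception",
        "measured_weight_g": weight_g,
    })

print(report_result(pick_task, picked=True, weight_g=171.5))
```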
“On the business side, we’re working on partnering with AS/RS systems and AGV vendors,” Yu said. “We’re also working with partners like SF Express and JD.com.”
Demand for vision-guided piece picking
Although many startups are chasing the piece-picking market, Yu said they need to focus on solving the right problems.
“When we go to customer sites and ask them what they need, there’s no product in the world that has really made a huge impact yet, because humans are still pretty good in terms of speed, accuracy, and variety,” he said. “There is rarely a huge deployment of piece-picking robots. My company feels that the competition is with human dexterity, not other companies.”
XYZ said it is gradually reducing the need for fine-tuning the model for specific products from specific customers. “Usually when we deploy, our normal model works out of the box,” Yu said. “But as we collect more data, we can better solve various rare situations.”