Veo Robotics takes data from multiple sensors to make industrial robots more collaborative, adding value and flexibility to human-machine collaboration in manufacturing.
Collaborative robot arms promise to help small and midsize enterprises by working alongside people more safely than traditional industrial automation, but there are tradeoffs. Cobots are smaller, slower, and less precise. Veo Robotics Inc. says it is “bringing perception and intelligence to industrial robots.”
Waltham, Mass.-based Veo Robotics adds multiple cameras and sensors to a workcell. It combines 3D sensing through time-of-flight cameras with computer vision and artificial intelligence so that larger robots can slow or stop around people but otherwise operate at full speed.
“Of 2.5 million industrial robots, only 13% are cobots,” said Patrick Sobalvarro, president and CEO of Veo Robotics.
“Robots aren’t good at judgment, dexterity, or general sensing, whereas humans are really excellent at that sort of thing,” he told Robotics Business Review during a recent site visit. “What you want is the best of both worlds.”
“Since I had worked at a computer vision company that tracked people, I thought, ‘Why not track the people and have them work together with these big robots?’” Sobalvarro recalled. “When I learned that sensor technology had come along enough to do this sort of thing, I went to Siemens Venture Capital as their first entrepreneur in residence.”
Veo Robotics raised $12 million in a Series A round last year. It was named a company to watch in the 2018 RBR50 report, a promising startup by CNBC, and one of “50 startups to watch” by Built in Boston.
Adding value with humans and robots
“Often in manufacturing, what you’re thinking about is, ‘What is the added value of a process step?’” said Sobalvarro. “So when you make something like a washing machine or a car, you’re bringing things together.”
“Anything you do in a factory that doesn’t actually lead to a transformation of the materials that go into producing the finished part, that can be seen as overhead or maybe even waste,” he said. “That might be moving things around or changing tools. So robots can take care of non-value-added tasks that are nonetheless necessary.”
“The robot can then work in parallel with the human, who is much better at doing the value-added tasks … something as simple as connecting a pair of cables,” he said. “That’s very hard for a robot to do, but a human can do it really easily.”
“Quality control, understanding ways in which a process can be improved — humans are great at that,” Sobalvarro said. “By bringing these two together and allowing people to benefit from the strengths and tirelessness of robots, their repeatability and precision, you get a much more fluid flow in a factory.”
Challenge of customization
“Three trends increase the value of human labor,” Sobalvarro noted. “Mass customization is one. We have lots of customers in durable goods.”
“Everything, from a refrigerator to a car to an aircraft engine, is being customized on a massive level,” he said. “That leads to some challenges for automation, because you can’t amortize the cost of fully automating the production of a part.”
“Another is the proliferation of products,” added Sobalvarro. “An Amazon distribution center can have 2 million to 400 million items.”
“The third is faster product cycles,” he said. “Innovations for energy efficiency or convenience [lead to] newer ways of building appliances, but they also introduce challenges for mass production. That challenge is best met by manufacturing flexibility, enabled with six-axis arms.”
“The slowest step on an assembly line affects the unit production rate,” said Sobalvarro. “Take, for example, putting an instrument panel into a car frame.”
“I joined Rethink Robotics as the first president there,” he recalled. “One of the first things I did there was get us out to a bunch of customers to learn what people were looking for.”
“Building a car takes 400 steps,” he said. “In such factories, a stop for faults can cost $50,000 per minute.”
The solution is to combine the adaptability of humans with the efficiency of robots, he said. Improvements in AI can enable robots to operate at greater speed and, most importantly, with greater safety.
Safety first with Veo Robotics
“Safe robotics requires perception, intelligence, and actuation,” explained Clara Vu, co-founder and vice president of engineering at Veo Robotics.
“There are three standards that we pay attention to because we’re enabling safe interaction of humans with large machinery,” said Sobalvarro. “On the robot side, there’s ISO 10218-1 and -2; then there’s ISO/TS 15066, which is still a technical specification. The U.S. version of that is R15.06.”
“Then there’s IEC 61508,” he said, referring to a European standard for functional safety. “We have people who serve on various working committees. It’s great that industrial automation is as safe as it is.”
“The Occupational Safety and Health Administration has recorded 39 injuries involving robots since 1984,” Sobalvarro said. “Compare that with 39,000 per year on the highways.”
“We’re not a safety company; we’re really a productivity and manufacturing flexibility company, and a software company at that,” Sobalvarro said. “But safety is core to the value that we provide, and we participate in all of those standards committees.”
“We partner with everybody,” Sobalvarro said. In its workshop, Veo Robotics has robotic arms from FANUC, KUKA, and Yaskawa.
Veo Robotics made its first public demonstration at the NVIDIA GPU Technology Conference (GTC).
“Companies are seeing value in time savings, going from 45 seconds to 24 seconds,” Sobalvarro said. “Our first product is a base camp, not a fortress.”
Perception and intelligence
“New kinds of workcells can enable new workflows,” Vu said. “By adding cameras and sensors to the work cell, we can gather custom time-of-flight data for depth perception.”
“The cameras are not on the robot for the same reason we don’t have eyeballs on our fingertips,” she said. “The eyes and brains don’t need to be on the robots in this case, and we have visibility of the entire workcell.”
“We use off-the-shelf chip sets for time of flight, but we needed a 10-meter range for cameras in an industrial setting,” said Vu. “Multiple cameras feed a computing platform, but we needed a higher level of reliability than what was available.”
“We make our own camera, not because we want to be a camera company — we’re a software company — but because we cannot buy a time-of-flight, safety-rated camera, and we need our systems to be safe. They have to have the necessary redundancy, range, and illumination. We have a set of requirements, and we’d love to see people bring such products to market.”
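The depth sensing Vu describes rests on a simple physical relationship: a time-of-flight camera emits light and measures how long it takes to bounce back, and depth is half the round trip multiplied by the speed of light. A minimal sketch (the function name and the example timing are illustrative, not from Veo):

```python
# Illustrative time-of-flight depth calculation.
# Depth = (speed of light * round-trip time) / 2, since the light
# travels to the object and back.

C = 299_792_458.0  # speed of light in m/s

def tof_depth_m(round_trip_s: float) -> float:
    """Depth in meters from a measured round-trip flight time in seconds."""
    return C * round_trip_s / 2.0

# The 10-meter range Vu mentions corresponds to a round trip of
# roughly 67 nanoseconds -- the timing precision a ToF sensor needs.
print(round(tof_depth_m(66.7e-9), 2))
```

The tiny round-trip times involved hint at why an off-the-shelf chipset alone isn’t enough for a safety-rated industrial camera: timing, illumination power, and redundancy all have to be engineered for that range.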
Veo Robotics takes data from multiple sensors, runs sensor fusion and 3D analysis on it, and classifies objects in the workspace. It then informs the robot whether it can safely move, reducing the need for safety cages.
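The overall control idea, slowing or stopping the robot as people get closer while otherwise running at full speed, resembles what the safety standards call speed and separation monitoring. A hedged sketch of that logic, with made-up zone thresholds (this is not Veo's implementation, just the general technique):

```python
# Hypothetical speed-and-separation monitoring sketch. Given the
# distance from the robot to the nearest detected person, command a
# speed fraction: full speed when clear, a linear slowdown in the
# warning zone, and a protective stop when someone is too close.
# Zone sizes here are arbitrary illustrations.

def allowed_speed(nearest_person_m: float,
                  full_speed: float = 1.0,
                  slow_zone_m: float = 2.0,
                  stop_zone_m: float = 0.5) -> float:
    """Return the commanded speed fraction for the robot."""
    if nearest_person_m <= stop_zone_m:
        return 0.0                      # protective stop
    if nearest_person_m >= slow_zone_m:
        return full_speed               # workcell clear: full speed
    # Scale linearly between the stop and slow boundaries.
    span = slow_zone_m - stop_zone_m
    return full_speed * (nearest_person_m - stop_zone_m) / span

print(allowed_speed(3.0))   # clear -> full speed
print(allowed_speed(0.4))   # too close -> stop
```

A real system must also account for the robot's stopping distance, sensor latency, and human walking speed when sizing the zones; that is what the classification and 3D analysis feed into.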
“Mobile robots represent a simpler problem of X and Y coordinates, but these robots have six degrees of freedom like human limbs and operate in three dimensions,” said Vu. “A robot is only one part of a workcell.”
“The hardware is getting cheaper, thanks to driverless car development,” Vu said. “Our requirements for computer vision are the inverse of gaming. We’re benefiting from parallelized CPUs and GPUs.”
Engineers as customers
Companies are considering collaborative robots in three situations, Vu explained: where they are adding automation, where they need flexibility, and where they are building new production lines.
“Venture capital investors often ask what market we serve,” Sobalvarro said. “Our customers aren’t the people who design car models or worry about air conditioners; they’re the people who make the factories that make those things.”
“Our customers are engineers, who are very creative,” he added. “They’re constantly modifying their processes and can account for variations in scale and improve quality. There’s all kinds of ways in which these factories are constantly changing. We help them be more flexible.”
“Industrial automation in general benefits from a very capable systems integrator channel,” Sobalvarro said. “They are often specialized and can improve process steps.”
“Both of those groups, whether they’re internal manufacturing or automation engineers or they’re external integrators, would buy our products and install them into factories,” he said.
“We have a vision for human-machine optimization and the transformation of factories,” he added. “We want safe interactions between humans and all kinds of machines.”