Covariant announced today that it has raised $40 million in Series B funding, which brings its total funding to date to $67 million. The Berkeley, Calif.-based company said it plans to use the money to introduce “AI Robotics” to new industries, accelerate its partnerships, and grow its staff.
“As the coronavirus crisis has exposed serious frailty in the global supply chain, we’re seeing more demand than ever for our AI Robotics solutions,” stated Peter Chen, co-founder and CEO of Covariant. “Our customers are eager to invest in AI and scale it across their supply chains to meet growing demands and more stringent requirements. This latest funding round, along with our recent partnerships, will allow us to scale quickly across multiple industries.”
“We’ve been working with a single focus on a universal AI for robotics,” Chen told The Robot Report. “Lots of companies are building robots, but they need a platform that can adapt and solve lots of manipulation challenges. We’re starting with high-automation warehouses as a beachhead market. Logistics and supply chain operations are dealing with labor shortages and repetitive work, and the state of the art is just not good enough.”
Covariant was founded in 2017 by AI researchers and roboticists from the University of California, Berkeley, and OpenAI. It has been developing the Covariant Brain, which it described as a “universal AI that enables robots to see, reason, and act autonomously in the real world.” The company is focusing on giving robots the ability to manipulate objects they have not seen before and to operate in new environments.
Covariant achieves one-hour mean unassisted operating time
In January 2020, Covariant launched from stealth and said that its AI Robotics workstations could run in production for more than an hour without any human intervention.
“When we founded Covariant, our goal was to make AI Robotics work autonomously in the real world,” said Pieter Abbeel, co-founder, chief scientist, and president of Covariant, as well as director of the Berkeley Robot Learning Lab and co-director of the Berkeley AI Research Lab. “Having reached that milestone, we see a huge benefit in expanding our universal AI to new use cases, customer environments, and industries.”
“Everybody talks about artificial intelligence, but even with state-of-the-art cameras, open libraries for object recognition and neural network training, and integration, systems are not robust or versatile enough for long-tail use cases,” Chen said. “We looked at the literature and realized that we need more AI research to push the boundaries of what’s possible.”
“We have developed everything in house, from image-analysis libraries and models to motion planning,” he said. “Covariant has about 50 people, and we don’t use ROS [the Robot Operating System], even though most engineers start with it. It’s easier to iterate quickly and keep code robust when you haven’t inherited a big code base. It’s our engineering culture to have more control.”
Covariant demonstrates progress in AI, robotics
“Third-party benchmarks for robotic manipulation are similar to those for self-driving cars,” said Chen. “It’s not like a 30-to-60-second demo on a website. Does a system need a lot of babysitting? Can it continually adjust to new scenarios?”
“In the middle of last year, ABB hosted a competition with 26 different use cases [involving] picking requirements across industries, from groceries to apparel and pharmaceuticals,” he recalled. “It invited 20 companies and told us half of the use cases in advance.”
“[The judges] then showed up at your place — on your home turf — and gave the remaining challenges,” Chen said. “The most striking thing that came out of that competition was that most companies hit deadlocks, like failing to pick an apple from the corner of a bin, or made expensive errors, like picking up two iPhones at once.”
“Covariant was the only one that could solve all 26 cases autonomously,” he added. “If you look at traditional automation settings like car manufacturing, robots barely need human help. However, the degree of variability in e-commerce means that an operator might need to intervene every 10 minutes or so and can oversee only a few robots rather than 10 to 20.”
Pandemic poses challenges to fundraising, testing
Index Ventures led Covariant’s Series B round, with participation from existing investor Amplify Partners and new investors including Radical Ventures. Mike Volpi, a partner at Index Ventures, will be joining the company’s board of directors.
“Making the deal during the COVID-19 pandemic was definitely a challenge,” said Chen. “We started the process one or two weeks before it hit, and a big part of the fundraising happened during the crisis. Not being able to talk to people in person was a challenge, and there has been a lot of uncertainty about how markets would be affected.
“Fortunately, the fundamentals of what we do were so strong that people understand the need for more advanced, autonomous robotics,” he said. “It was extremely evident with supply chain vulnerabilities and the need for spacing [human workers].”
Covariant has been able to conduct a lot of software testing remotely or in simulation, Chen noted.
Covariant grows partnerships, sets sights on markets
In February, Covariant announced a partnership with leading industrial automation supplier ABB. The following month, it joined forces with Knapp, a major supplier of intralogistics systems.
“There’s more to it than motion when we think of how the brain operates,” said Chen. “It’s not just the software and the hardware arm. We need to create a more functional system, incorporating gripper design, the conveyor, and all of the surroundings. We’re designing our software to be friendly for our partners, so they can build complete robotic work cells and use that same software for different applications.”
“We have two kinds of customers — distribution center operations and more pure e-commerce fulfillment settings,” said Chen. “They have different characteristics, such as higher throughput. In e-commerce, robots typically need to perceive and manipulate a wider range of objects. We’ve been testing with customer sites that are still running.”