Toyota’s New AI Hire Discusses Research at CES 2016

By Eugene Demaitre | January 14, 2016

LAS VEGAS — What do Google and Toyota have in common? Both are interested in artificial intelligence and in the people working on it. Both are spending billions of dollars developing autonomous vehicles, as well as wider applications for AI. And the two tech titans have James Kuffner in common.

Kuffner, an adjunct associate professor at Carnegie Mellon University and the former director of robotics engineering at Google Inc., discussed the potential for artificial intelligence, machine learning, and cloud computing here at the 2016 Consumer Electronics Show.

Steve Crowe, managing editor of Robotics Trends (sister site to Robotics Business Review), introduced Kuffner’s “Smart Robots” session during last week’s Robotics Conference at CES.

Kuffner preferred to discuss his past research rather than his recent move from Google to the new $1 billion Toyota Research Institute. However, Kuffner’s presentation on the state of AI hinted at why Toyota Motor Corp. wanted him.

Historical precedents

Kuffner studied robotics and got his Ph.D. at Stanford University’s Department of Computer Science. He first worked on robotic arms and then developed planning algorithms for humanoid robots.

[Photo: James Kuffner talked about his robotics and AI research at the Robotics Conference at CES.]

The refinement and mass adoption of robotics are following a predictable path, said Kuffner. He noted parallels in the development of automobiles, personal computers, and cellphones. Like those technologies, Kuffner said, robots are steadily improving, thanks to Moore’s Law driving gains in search and deep learning.

He also cited milestones such as Honda’s self-contained P2 humanoid in 1996, IBM’s Deep Blue supercomputer defeating chess champion Garry Kasparov in 1997, and today’s cheaper hardware.

A conference attendee asked about Qualcomm and IBM’s work on neuromorphic chips, which are supposed to mimic the human brain.

“Novel hardware could be helpful for quick evaluation of deep neural networks,” Kuffner responded. “Power-efficient, high-performance, embedded compute infrastructure is needed for large deployments of robots in cars.”

The Toyota Research Institute, which has locations near Stanford University and the Massachusetts Institute of Technology, is working on combining cars, communications, and computing, as well as on household robotics.

“Home robots could become ubiquitous and affordable between 2020 and 2030,” Kuffner estimated. “Robots are tools with the potential to help humans live better lives.”

One foot in front of the other

“In the classical model of computation, you have input, a program, and its output,” Kuffner explained. “In robotics, it’s more of a loop between sensing, planning with a digital model, and acting via motor commands in real time.”
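
As a rough illustration of that loop, here is a minimal Python sketch. The sensor, planner, and actuator objects are hypothetical placeholders rather than any specific robot’s API, and the 20 Hz rate is arbitrary.

    import time

    CONTROL_PERIOD_S = 0.05  # 20 Hz loop; purely illustrative

    def control_loop(sensors, planner, actuators):
        """Classical computation runs input -> program -> output once;
        a robot instead closes the loop continuously in real time."""
        world_model = {}
        while True:
            start = time.monotonic()

            # Sense: update the digital model of the world from measurements.
            world_model.update(sensors.read())

            # Plan: compute the next action against the current model.
            action = planner.next_action(world_model)

            # Act: issue motor commands, then wait out the rest of the cycle.
            actuators.execute(action)
            time.sleep(max(0.0, CONTROL_PERIOD_S - (time.monotonic() - start)))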

Unlike factories, which are structured, predictable environments, modern robots need to be able to move around safely in dynamic environments with previously unknown obstacles, he said.

According to Kuffner, planning for unstructured environments requires the following:

  • Online perception and modeling — one can’t cheat with a priori models
  • Online motion computation — can’t use predefined trajectories
  • Dynamic tasking — can’t rely on predefined sequences

The “research snowball” is a major challenge with autonomous systems, Kuffner said. Trajectory planning is important because autonomous manipulation with a robotic arm, for example, must account for the complicated geometry of the configuration space to avoid collisions.
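
Configuration-space planning of this kind is commonly handled with sampling-based methods such as the rapidly exploring random trees (RRTs) that Kuffner helped develop. The toy sketch below grows an RRT in a flat 2-D configuration space with a made-up disc obstacle; a real planner would check collisions against the robot’s actual geometry.

    import math
    import random

    STEP = 0.1  # maximum extension per tree growth step

    def collision_free(q):
        """Hypothetical collision check: avoid one disc-shaped obstacle."""
        return math.hypot(q[0] - 0.5, q[1] - 0.5) > 0.2

    def steer(q_near, q_rand):
        """Move from q_near toward q_rand by at most STEP."""
        dx, dy = q_rand[0] - q_near[0], q_rand[1] - q_near[1]
        dist = math.hypot(dx, dy) or 1e-9
        scale = min(1.0, STEP / dist)
        return (q_near[0] + dx * scale, q_near[1] + dy * scale)

    def rrt(start, goal, iterations=5000, goal_tol=0.05):
        """Grow a tree rooted at start; return a path if the goal is reached."""
        parents = {start: None}
        for _ in range(iterations):
            q_rand = (random.random(), random.random())
            q_near = min(parents, key=lambda q: math.dist(q, q_rand))
            q_new = steer(q_near, q_rand)
            if q_new in parents or not collision_free(q_new):
                continue
            parents[q_new] = q_near
            if math.dist(q_new, goal) < goal_tol:
                path = [q_new]
                while parents[path[-1]] is not None:
                    path.append(parents[path[-1]])
                return list(reversed(path))
        return None  # no collision-free path found within the budget

    path = rrt(start=(0.05, 0.05), goal=(0.95, 0.95))

The same idea extends to high-dimensional arm configurations, where reasoning explicitly about the free space becomes impractical.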

Kuffner described research in which thousands of candidate grasps of objects were automatically simulated to evaluate which were stable. He also mentioned HERB (the Home Exploring Robot Butler), a collaboration between CMU and Intel Corp. that used the OpenRAVE software and the Robot Operating System.
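
That grasp work can be pictured as a brute-force sampling loop: generate many candidate grasps, score each one, and keep the stable ones. The scoring function below is a made-up stand-in for the physics simulation used in the actual research.

    import random

    def simulated_stability(approach_angle, grip_width):
        """Hypothetical stand-in for a physics-based grasp metric; real
        systems score candidates under contact and perturbation models."""
        return 1.0 - 10.0 * abs(grip_width - 0.04) - abs(approach_angle) / 3.1416

    def sample_stable_grasps(n_samples=5000, threshold=0.7):
        """Evaluate thousands of candidate grasps and keep the stable ones."""
        stable = []
        for _ in range(n_samples):
            angle = random.uniform(-3.1416, 3.1416)
            width = random.uniform(0.0, 0.08)
            score = simulated_stability(angle, width)
            if score >= threshold:
                stable.append((score, angle, width))
        return sorted(stable, reverse=True)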

Such real-time planning has implications for the navigation and safety of self-driving vehicles. Humans can scan the roadway constantly and anticipate changing conditions, but robots must understand three-dimensional space and be able to search and recalculate direction in real time without falling back on preprogrammed situations.

Safety features such as lane assist and adaptive cruise control “are creeping into lots of cars you can buy today, and the pace is increasing,” Kuffner said.

Toyota Taps More Robotics Talent

“We wish to enhance the safety of automobiles with the ultimate goal of creating a car that is incapable of causing a crash, regardless of the skill or condition of the driver,” said Gill Pratt, CEO of the Toyota Research Institute.

[Photo: Toyota introduced its new researchers in Las Vegas.]

Pratt said that CMU and Google Robotics alumnus James Kuffner will be working on cloud robotics at Toyota. He announced other new hires as part of Toyota’s push into AI, autonomous vehicles, and robotics:
  • Eric Krotkov, former DARPA program manager — chief operating officer
  • Larry Jackel, unmanned ground vehicle researcher at Bell Laboratories and DARPA program manager — machine learning
  • John Leonard, professor of mechanical and ocean engineering at MIT — autonomous driving
  • Hiroshi Okajima, project general manager of Toyota’s R&D Management Division — executive liaison officer
  • Brian Storey, professor of mechanical engineering, Olin College of Engineering — accelerating scientific discovery
  • Russ Tedrake, associate professor in MIT’s Department of Electrical Engineering and Computer Science — simulation and control

Toyota also identified about 30 research projects and project teams, which will begin working in Palo Alto, Calif., and Cambridge, Mass.

In addition, Toyota announced the following members of the research institute’s advisory board:

  • John Roos, former CEO of Wilson Sonsini and former U.S. ambassador to Japan, general partner at Geodesic Capital, and senior advisor at Centerview Partners — chairman
  • Rodney Brooks, former director of the MIT Computer Science and AI Lab, founder of iRobot Corp. and Rethink Robotics Inc. — deputy chairman
  • Marc Benioff, CEO of Salesforce.com
  • Richard Danzig, former secretary of the U.S. Navy
  • Bran Ferren, former president of research and development at Walt Disney Imagineering and chief creative officer of Applied Minds LLC
  • Noboru Kikuchi, former University of Michigan professor, head of the Toyota Central Research and Development Lab and the Toyota Research Institute
  • Fei-Fei Li, director of the Stanford AI Laboratory
  • Daniela Rus, director of MIT’s Computer Science and AI Laboratory

Google may be pursuing AI and autonomous vehicles on multiple fronts simultaneously, but this roster of luminaries indicates that Toyota has no intention of being left behind.

HERB and humanoid robots such as Honda Motor Co.’s ASIMO are closer to the design expected for household robots. “Footstep planning” is necessary for bipedal robots to efficiently traverse uneven terrain or pass through a revolving door, Kuffner explained.

He showed a 2007 video of an augmented reality interface used for route testing — CMU’s ASIMO was able to recalculate the optimal foot placement every 800 milliseconds to follow a changing arc.
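
As a rough sketch of that style of periodic replanning, the loop below recomputes a short footstep sequence toward a moving goal at a fixed cadence and commits only to the next step or two. The robot and goal-source objects are hypothetical, and the straight-line planner stands in for a real search over discrete foot placements.

    import time

    REPLAN_PERIOD_S = 0.8  # the ASIMO demo replanned roughly every 800 ms

    def plan_footsteps(current_pose, goal_pose, step_length=0.25, max_steps=8):
        """Toy planner: evenly spaced footsteps along a straight line toward
        the goal. A real footstep planner searches over discrete placements
        while checking balance, reachability, and terrain constraints."""
        dx = goal_pose[0] - current_pose[0]
        dy = goal_pose[1] - current_pose[1]
        dist = (dx * dx + dy * dy) ** 0.5
        n = min(max_steps, max(1, int(dist / step_length)))
        return [(current_pose[0] + dx * (i + 1) / n,
                 current_pose[1] + dy * (i + 1) / n) for i in range(n)]

    def follow_moving_goal(robot, goal_source):
        """Replan at a fixed cadence instead of supervising each footstep."""
        while not goal_source.reached():
            footsteps = plan_footsteps(robot.pose(), goal_source.current_goal())
            robot.execute(footsteps[:2])  # commit only to the next step or two
            time.sleep(REPLAN_PERIOD_S)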

“It’s like riding a horse,” Kuffner said. “You point it in a direction, but you don’t supervise each footstep.”

This is also analogous to autonomous vehicles, in which a human would entrust the adjustments for acceleration/deceleration, lane changes, and overall trajectory to the machine.

Do robots manifest handedness? “Our robots are designed to be completely symmetrical,” Kuffner replied to one query. “But a robot should be able to adapt to different ways of walking,” he added, noting that people pass one another differently in Japan than in the U.S., following road conventions.

“It would be great if a robot just knew how to do that,” he said.

Although Kuffner has focused on AI, he also acknowledged the importance of sensors.

“You can have the best planning in the world, but if you can’t sense the world or the state of the robot properly, the robot is useless,” Kuffner stated.

“I’ve worked on indoor robots, where sonar didn’t work well, but then we had LIDAR,” he said. “Outdoors, robots got lost, then we had GPS. Dense, high frame-rate 3-D point clouds are the next thing. … We need both hardware and software.”

Don’t expect personal assistants too soon

“When are we going to have an intelligent robot personal assistant?” Kuffner asked. “Hollywood is to blame for high expectations, as are researchers who show the only instance of something working.”

According to Kuffner, three things are holding back personal robotics. The first is capability — safety and reliability.

The second is cost, but he said this will change as hardware eventually commoditizes.

The third barrier is the testing burden, Kuffner said. Robots require a complex stack of code, and sensing and actuation need to be tested in real time.

Fortunately, he said, planning algorithms are improving, and the cycle times needed to search over candidate footsteps are decreasing.

“Home robots may eventually become even more personally prized in the future than cars have been in the past,” predicted Gill Pratt, who left the Defense Advanced Research Projects Agency last year to lead Toyota’s robotics and AI research.

Pratt previously led the DARPA Robotics Challenge and spoke elsewhere at CES.

Brains in the cloud

“Cloud robotics enables cheaper, lighter, smarter robots,” said Kuffner.

“Now we have reliable and scalable computer power and storage that doesn’t have to be stored onboard,” he said. In “thin-client supercomputing,” some of a robot’s functions can be remote, similar to teleoperation of current drones.

“With a physical separation of hardware and software, the data center acts as the brain and controls connected devices,” Kuffner said. One enabling factor is that wireless networking is fast and reliable enough to stream video to smartphones.

The pendulum swings between planning and learning, and between embedded and server-based apps, said Kuffner. Big data and deep learning are “approaching critical mass” in terms of memory and compute resources, he said, enabling advances in speech recognition, translation, and object recognition.

He listed the following benefits of cloud robotics:

  • A shared knowledge base such as Google Now can organize and unify information about the world in a robot-usable format.
  • By offloading compute-heavy tasks, one can make robots lighter, increase battery life, and worry less about software updates. CPUs can be upgraded invisibly back at the data center (see the sketch after this list).
  • Every robot can be more capable because of a shared library of skills or behaviors.
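
Here is a minimal sketch of the offloading idea from the second point above: a hypothetical cloud planning endpoint with a trivial on-board fallback for when the network is unavailable.

    import json
    import urllib.request

    # Hypothetical cloud endpoint; not a real service.
    CLOUD_PLANNER_URL = "https://example.com/api/plan"

    def plan_in_cloud(world_model, timeout_s=0.5):
        """Offload the compute-heavy planning query to the data center."""
        request = urllib.request.Request(
            CLOUD_PLANNER_URL,
            data=json.dumps(world_model).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(request, timeout=timeout_s) as response:
            return json.loads(response.read())

    def plan_locally(world_model):
        """Small on-board fallback: hold position safely when disconnected,
        much as a smartphone keeps basic functions working offline."""
        return {"action": "hold_position"}

    def plan(world_model):
        try:
            return plan_in_cloud(world_model)
        except OSError:  # network down, request timed out, or endpoint unreachable
            return plan_locally(world_model)

The fallback branch reflects Kuffner’s later caveat that safety-critical or latency-sensitive behavior should never depend on the cloud.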

“There’s no reason why every robot can’t speak every language,” Kuffner said, adding that the Google Translate app already offers real-time, image-based translation.

Robots that learn by watching

Children spill beverages all the time, but imagine if a robot could learn how not to do so and share that experience with all other robots, he said. With big data, there could be a hybrid between machine learning about the world and a model-based methodology.

“Just as watching someone play tennis can actually help a human player, we’d like robots to learn from demonstration,” Kuffner said. “With observational learning, we’re just getting started now.”

“Imagine an app store for robots,” Kuffner said. “One could download how to make a souffle, monetize abilities, or even ask like Trinity in The Matrix, ‘I need a helicopter pilot program.’”

[Photo: Toyota plans to build maps in the cloud using LIDAR-equipped autonomous vehicles.]

Toyota last month invested 1 billion yen ($8.4 million) in Preferred Networks Inc., which is working to apply real-time machine learning to new applications for the Internet of Things, including collision avoidance shown in a concept car at CES.

Human crowdsourcing can scale semantic and quality-control programs such as Wikipedia, but automated “robotsourcing” would involve large-scale deployment of data-sharing robots.

Toyota is working on robot-sourced maps, in which its vehicles would collaborate and update maps. General Motors Co. and others are working on similar technology.
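
One way to picture robot-sourced mapping is each vehicle contributing observations that are merged into a shared map. The sketch below simply averages per-cell confidence values from multiple contributors; it is purely illustrative and not Toyota’s actual method.

    from collections import defaultdict

    class SharedMap:
        """Toy shared map: each cell keeps a running average of the
        'drivable' confidence reported by contributing vehicles."""

        def __init__(self):
            self._sums = defaultdict(float)
            self._counts = defaultdict(int)

        def report(self, cell, drivable_confidence):
            """A vehicle reports what it observed at a grid cell."""
            self._sums[cell] += drivable_confidence
            self._counts[cell] += 1

        def confidence(self, cell):
            if self._counts[cell] == 0:
                return None  # no vehicle has observed this cell yet
            return self._sums[cell] / self._counts[cell]

    shared = SharedMap()
    shared.report((120, 45), 0.9)   # vehicle A sees a clear lane
    shared.report((120, 45), 0.4)   # vehicle B sees debris in the same cell
    print(shared.confidence((120, 45)))  # 0.65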

Toyota works to build confidence

A conference attendee asked how Toyota could “reach an assurance level to avoid pathological behavior.” Kuffner acknowledged that testing and validation apply to both model-based systems and deep learning. “A long, long tail of data is going to be important to be responsible.”

“There’s the ‘training-mile challenge’ — we need a certain number of miles for a data set to train,” he said. “We have good tools for validating deterministic algorithms, but a key challenge is how to cover input spaces.”

“Most of the ways that robots have developed historically is someone changing a low-level parameter and submitting it to a code repository, but it didn’t get tested,” Kuffner said. “Then, another developer wonders why a high-level behavior is broken and introduces a new bug. It’s a huge challenge: How can you have component unit-level testing at scale to have a good, robust system?”
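
A small example of the component-level testing Kuffner describes, written against the hypothetical plan_footsteps() helper sketched earlier: a low-level parameter change that silently breaks a high-level behavior should fail a test like this before it lands in the repository.

    import unittest

    # Hypothetical module holding the plan_footsteps() sketch from earlier.
    from footstep_planner import plan_footsteps

    class FootstepPlannerRegressionTest(unittest.TestCase):
        def test_final_step_reaches_goal(self):
            """High-level behavior: the plan must end at the commanded goal."""
            steps = plan_footsteps(current_pose=(0.0, 0.0), goal_pose=(2.0, 0.0))
            self.assertAlmostEqual(steps[-1][0], 2.0, places=6)
            self.assertAlmostEqual(steps[-1][1], 0.0, places=6)

        def test_step_length_stays_within_limit(self):
            """Low-level check: no single step exceeds what the hardware can
            execute, even after someone tweaks step_length."""
            steps = plan_footsteps((0.0, 0.0), (2.0, 0.0), step_length=0.25)
            previous = (0.0, 0.0)
            for step in steps:
                dx = step[0] - previous[0]
                dy = step[1] - previous[1]
                self.assertLessEqual((dx * dx + dy * dy) ** 0.5, 0.25 + 1e-6)
                previous = step

    if __name__ == "__main__":
        unittest.main()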


To have confidence, “we need access to data and corner cases from the real world, and we need the cloud to run regression testing,” he said. “Privacy and security are major issues. … We can use machine learning to anonymize or reduce the privacy risks of sharing data, but there are lots of research challenges.”

If a model is hosted in the cloud and a robot is trained in the cloud, would all results be downloaded from the cloud? asked an attendee. “The caveat is to never put safety-critical or latency-sensitive things in the cloud,” said Kuffner. “A robot should behave like a smartphone when it’s disconnected from the network.”

“Humans should program the highest-level behavior; that’s the safe and responsible way to control and debug systems.” Even with self-driving cars, Toyota has said that a human should remain behind the wheel, unlike some concepts that remove the steering wheel altogether.

“We’re actually closer than people think to having self-driving cars on the road,” he said. “There is a continuous spectrum between full manual control and full autonomous control, and there’s going to be phased deployments.”

 

The final questioner asked if Kuffner would explain the direction of Toyota’s robotics division: “Is it cars or humanoids?”

He laughed and said, “It’s my fourth day on the job; I can’t speak about overall strategy.” Kuffner added that he will be working with both autonomous cars and “partner robots” — Toyota’s term for robots to help the elderly and disabled — since there is cloud potential for both.

About The Author

Eugene Demaitre

Eugene Demaitre is editorial director of the robotics group at WTWH Media. He was senior editor of The Robot Report from 2019 to 2020 and editorial director of Robotics 24/7 from 2020 to 2023. Prior to working at WTWH Media, Demaitre was an editor at BNA (now part of Bloomberg), Computerworld, TechTarget, and Robotics Business Review.

Demaitre has participated in robotics webcasts, podcasts, and conferences worldwide. He has a master's from the George Washington University and lives in the Boston area.
