The Robot Report

Space exploration, robot partners require new algorithms for safety, says IEEE Fellow

By Eugene Demaitre | November 5, 2020

A Tigershark unmanned aerial vehicle. Prof. Panos Tsiotras says better controls will lead to more autonomy. Source: Georgia Tech

As on Earth, robots are increasingly seen as necessary aides to humans in space exploration. Not only are unmanned probes exploring the moon, other planets, and beyond, but autonomous systems are also expected to help people maintain equipment and return to the moon. To do these things safely, new algorithms must be developed, said Panagiotis Tsiotras, a professor at the Georgia Institute of Technology.

Tsiotras, who is an IEEE Fellow and the David and Andrew Lewis Chair of the Guggenheim School of Aerospace Engineering at Georgia Tech, is working on fail-safe integration of machine learning and related components with aerospace systems. He is working to improve perception so that space-based and other robots can have better situational awareness. In addition, Tsiotras is researching optimized controls so that robots can better identify paths.

“My research in the past few years has been in autonomous systems for ground, aerial, and space applications,” he told The Robot Report. “The problems are the same for all of them — trying to develop strong decision-making controls.”

Applying lessons from autonomous vehicles

“With autonomous vehicles, we’re using the techniques of others. I’m interested in how we can make vehicles behave more naturally in traffic,” said Tsiotras. “It’s easy to get in a vehicle that’s safe, but it does not feel natural. Self-driving cars may go slowly or keep a lot of distance from other vehicles. People have different styles and desires.”

Prof. Panos Tsiotras. Source: Georgia Tech

“We want to make vehicles behave more naturally or closer to that of human drivers through driver cloning,” he said. “But one challenge is that everyone thinks they drive well.”

“Determining intent is important not just for detecting behavior and planning, but also for higher-level thinking,” Tsiotras explained. “Some self-driving cars might be more aggressive — without breaking any laws. Sometimes detecting driver intent is easy, if people indicate it with turn signals. Sometimes they don’t, and at an intersection with a flashing light, people may make eye contact, nod, or wave hands to indicate who should merge. Autonomous vehicles should also be able to see that and figure it out.”

Making vehicles and robots behave more like humans is important in mixed environments, he said. “In the future, if all vehicles are autonomous and have a way of ‘handshaking’ by network, then the problem would be solved,” said Tsiotras. “To figure out how a person drives in his or her own vehicle, systems can observe and adjust their own behavior accordingly. If a system learns that someone is timid or aggressive, it can determine how much to intervene and compensate as needed.”
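The observe-and-classify idea Tsiotras describes can be sketched in a few lines. The following is a hypothetical illustration, not his group's method: the signals (time headway, acceleration) and the thresholds are invented for the example.

```python
# Illustrative sketch of classifying a driver's style from observed behavior.
# Signals and thresholds are assumptions, not a published model.
from dataclasses import dataclass
from statistics import mean

@dataclass
class DrivingSample:
    headway_s: float   # time gap to the lead vehicle, in seconds
    accel_mps2: float  # longitudinal acceleration, in m/s^2

def classify_style(samples: list[DrivingSample]) -> str:
    """Label a driver as timid, typical, or aggressive."""
    avg_headway = mean(s.headway_s for s in samples)
    peak_accel = max(abs(s.accel_mps2) for s in samples)
    if avg_headway > 2.5 and peak_accel < 2.0:
        return "timid"      # large gaps, gentle inputs
    if avg_headway < 1.0 or peak_accel > 3.5:
        return "aggressive" # tailgating or hard accel/braking
    return "typical"

samples = [DrivingSample(0.8, 3.9), DrivingSample(0.9, 2.5), DrivingSample(0.7, 3.0)]
print(classify_style(samples))  # -> aggressive
```

A real system would learn such boundaries from data rather than hand-pick them, but the structure — summarize observed behavior, then adapt the vehicle's response to the inferred style — is the same.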

AI could lead to customized self-driving cars, says Tsiotras

The next step in developing and applying artificial intelligence to vehicles is to account for manufacturers’ requirements and consumer perceptions, Tsiotras said.

“Automotive manufacturers sometimes have features that are hidden from drivers, such as drive by wire and traction control,” he noted. “If they work too well, people won’t even know they’re there. Transparency is good but can be difficult for the business case.”

“We’ve started investigating this. It’s not clear whether people would do the same thing as passengers in intelligent vehicles as they would as drivers who know what they’re doing,” added Tsiotras. “That’s where driver cloning can help.”

“The idea is that the vehicle is always observing how someone is driving so it has the information and is able to detect whether the driver is not behaving properly in a particular situation. Perhaps they’re tired or having a bad day,” he said. “This would be a more proactive version of driver assist for an extra level of safety. Toyota uses the term ‘guardian angel under the hood.'”
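The “guardian angel” idea — flag when a driver drifts from their own learned baseline — can be sketched with a simple outlier test. This is my construction, not Toyota's design; the signal (lane-keeping error) and the z-score threshold are illustrative assumptions.

```python
# Hedged sketch of proactive driver monitoring: compare the driver's current
# behavior against their own historical distribution and flag large deviations
# (e.g., fatigue or distraction). Threshold and signal are assumptions.
from statistics import mean, stdev

def is_anomalous(history: list[float], current: float, z_threshold: float = 3.0) -> bool:
    """True if the current lane-keeping error is far outside the
    driver's historical distribution."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current != mu
    return abs(current - mu) / sigma > z_threshold

history = [0.1, 0.12, 0.09, 0.11, 0.1, 0.13, 0.08]  # typical lane error, meters
print(is_anomalous(history, current=0.45))  # large deviation -> True
```

The key design point matches the quote: the baseline is per-driver, built from continuous observation, so the system intervenes relative to how this person normally drives rather than against a fleet-wide average.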

But how close are today’s vehicles to such autonomous and advanced driver-assist systems (ADAS)? “We’re not even close to complete autonomy,” Tsiotras replied. “Such systems are slowly getting into vehicles. Trucking on highways is the easier problem, but we want the same level of autonomy in a downtown business district, where there are pedestrians, construction, and bicycles.”


Robots in space exploration

Another aerospace project that Tsiotras is working on for NASA involves detecting asteroids. “There is a lot of interest in small bodies for mining, and we’re sending spacecraft without astronauts to observe,” he said. “They have to be autonomous robots because teleoperation is not an option at such distances.”

Unmanned probes have been able to perceive asteroids and identify places to land. They could autonomously use cameras and other sensors to determine the asteroids’ properties, such as shape and mass, Tsiotras said.

“We won’t see completely autonomous controls — we’re adding increasing levels for different phases of a mission,” he said. “Take, for example, sending a robot to Mars to collect some rocks. Typically, a human operator on Earth would get pictures of a rock on the horizon and tell the robot where to go. After the signal delay, the robot would go find it and collect it. It could autonomously navigate the terrain and decide how to achieve its goal. At another level, the operator could just say, ‘Go find interesting geological formations.’”

“Another example is you could have a level of deciding when and where to land. Pinpoint landing on Mars must compensate for uncertainties in atmospheric entry, which is important for future missions in which we’ll send supplies ahead of humans,” said Tsiotras. “We could choose a general area in the desert and maybe have the lander’s cameras evaluate landing zones and autonomously do a lateral diversion to avoid rocks, as opposed to landing with balloons. Proximity is important for human landing zones, and a geologist might want to land a robotic rover near a certain formation.”
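The lateral-diversion step Tsiotras describes — the lander's cameras evaluating landing zones and shifting to avoid rocks — can be sketched as a search over a hazard map. This is an invented illustration of the concept, not NASA's algorithm; the grid values and diversion radius are made up.

```python
# Illustrative sketch of autonomous landing-site selection: given a hazard
# map built from the lander's cameras, pick the safest cell reachable within
# its lateral-diversion range. All numbers here are invented for the example.

def pick_landing_cell(hazard, target, max_diversion):
    """Return the (row, col) of the lowest-hazard cell reachable from target.

    hazard: 2D list of rock/slope hazard scores (lower is safer).
    target: (row, col) of the nominal landing point.
    max_diversion: max Chebyshev distance the lander can divert laterally.
    """
    best, best_score = target, hazard[target[0]][target[1]]
    for r, row in enumerate(hazard):
        for c, score in enumerate(row):
            # Only cells within lateral-diversion range are candidates.
            if max(abs(r - target[0]), abs(c - target[1])) <= max_diversion:
                if score < best_score:
                    best, best_score = (r, c), score
    return best

hazard = [
    [9, 7, 8],
    [6, 5, 1],
    [8, 9, 7],
]
print(pick_landing_cell(hazard, target=(1, 1), max_diversion=1))  # -> (1, 2)
```

This captures why pinpoint landing matters for pre-positioned supplies: the tighter the diversion budget, the more the final site is constrained by where entry uncertainty puts the vehicle.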


Tsiotras considers flying cars and robotaxis

Tsiotras’ research into decision making for autonomous systems also applies to aerial taxis, which are being developed worldwide.

“The main issue with these applications for connecting within metropolitan areas is that they need to be reliable. Because they’re operating in environments with humans rather than on Mars, the stakes are much higher,” he said. “It’s not just precision but also robustness and reliability, so you need to have redundant sensors and actuators, and integration is a challenge.”

“Like autonomous ground vehicles, these systems have to operate under many weather conditions,” observed Tsiotras. “Robots and drones are increasingly different from previous generations on factory floors. Systems have to adapt and learn on their own because nobody can pre-program them for every eventuality.”


Better-than-human performance expected

“The question is, at what point will the public accept loss of human life from machine error?” Tsiotras said. “Should a robot or autonomous vehicle be as good as a human, or 10 times better? We have 30,000 automotive deaths per year in the U.S. today, but if an autonomous vehicle kills one human, it makes the news. It’s a tall order, trying to make machines superhuman for safety, but developers must be very careful, or the public will turn against this technology.”

“Fortunately, machines are pretty good at pattern recognition,” he said. “With the right sensors and machine learning algorithms, autonomous systems can recognize a bicycle, a soccer ball, or a child running after that ball. It’s really about context and judgment, which aren’t so easy. This is related to what I was saying about detecting human intent.”

“Humans take years of experience to walk or drive well; we may have to let machines mature and observe the world for some time,” said Tsiotras. “Most automakers and technology companies are working on Level 3 autonomy right now. In my opinion, they need to go in some more structured steps, such as long-haul transportation or HOV-type [high-occupancy vehicle] lanes.”

“Just make sure self-driving cars can operate reliably in large numbers,” he added. “A more prudent approach to different environments and conditions, such as nighttime or a blizzard, would be useful. People like to drive but get bored on highways.”

“It’s a very exciting time — I envy my students,” Tsiotras concluded. “They’re always complaining that things seem difficult, but these are cool problems. Enabling technologies and talent are coming together to attack problems that seemed unsolvable 15 years ago. With the coalescence of control and signal theory, processing, AI, and modern robotics, it’s a good time to be a student.”

About The Author

Eugene Demaitre

Eugene Demaitre is editorial director of the robotics group at WTWH Media. He was senior editor of The Robot Report from 2019 to 2020 and editorial director of Robotics 24/7 from 2020 to 2023. Prior to working at WTWH Media, Demaitre was an editor at BNA (now part of Bloomberg), Computerworld, TechTarget, and Robotics Business Review.

Demaitre has participated in robotics webcasts, podcasts, and conferences worldwide. He has a master's from the George Washington University and lives in the Boston area.

Copyright © 2025 WTWH Media LLC. All Rights Reserved. The material on this site may not be reproduced, distributed, transmitted, cached or otherwise used, except with the prior written permission of WTWH Media