What would you get if you combined Apple’s Siri and Amazon’s Alexa with Boston Dynamics’ quadruped robots? You’d get “Astro,” the four-legged seeing and hearing intelligent robot dog.
Using deep learning and artificial intelligence (AI), scientists from Florida Atlantic University’s Machine Perception and Cognitive Robotics Laboratory (MPCR) in the Center for Complex Systems and Brain Sciences in FAU’s Charles E. Schmidt College of Science are bringing to life one of only a handful of these quadruped robots in the world. Astro is unique because he is the only one of these robots with a head, 3D printed to resemble a Doberman pinscher, that contains a computerized brain.
Astro not only looks like a dog; he learns like one too. That’s because he doesn’t operate based on pre-programmed robotic automation. Instead, Astro is being trained using inputs to a deep neural network, a computerized simulation of a brain, so that he can learn from experience to perform human-like tasks, or in his case, “doggie-like” tasks, that benefit humanity.
Equipped with sensors, high-tech radar imaging, cameras and a directional microphone, this 100-pound robot dog is still a “puppy-in-training.” Just like a regular dog, he responds to commands such as “sit,” “stand” and “lie down.” Eventually, he will be able to understand and respond to hand signals, detect different colors, comprehend many languages, coordinate his efforts with drones, distinguish human faces, and even recognize other dogs.
What the robot dog will be used for
As an information scout, Astro’s key missions will include detecting guns, explosives and gun residue to assist police, the military, and security personnel. This robot dog’s talents won’t end there: he can also be programmed to assist as a service dog for the visually impaired or to provide medical diagnostic monitoring. The MPCR team also is training Astro to serve as a first responder for search and rescue missions such as hurricane reconnaissance, as well as military maneuvers.
Martin Woodall, founder of DroneData and a visionary in the field of GPU-accelerated servers, is sponsoring the robotic project at FAU. The project is a collaboration between DroneData’s AstroRobotics division and FAU’s MPCR laboratory. AstroRobotics has provided the Ghost Robotics open-source Four Leg All-Terrain Edge Compute Platform for the FAU-designed software.
Designed to engage and react to the world around him in real time, this robot dog will be able to navigate rough terrain and respond to dangerous situations to keep humans and animals out of harm’s way. Astro will be outfitted with more than a dozen sensors that will consume environmental input across multiple modalities, including optical, sound, gas and even radar.
NVIDIA Jetson TX2 GPUs help process data
To process the sensory inputs and make autonomous behavioral decisions, a set of NVIDIA Jetson TX2 GPUs is onboard the robot dog, providing a combined four teraflops of computing power, which amounts to about four trillion computations per second. This robot dog will be able to rapidly see and search thousands of faces in a database, smell the air to detect foreign substances, and hear and respond to distress calls that fall outside a human’s audible hearing range. FAU’s MPCR team will program Astro to have an extensive database of experiences that he can draw upon to help him make immediate decisions on the go.
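The pipeline described above, in which fused sensor readings flow through a trained neural network to produce a behavioral decision, can be sketched in miniature. This is an illustrative toy only: the layer sizes, random weights, and command labels below are invented for the example and are not FAU’s actual software, which would load trained parameters and run on the onboard GPUs.

```python
import math
import random

# Illustrative sketch: a tiny feedforward network mapping a fused
# sensor-feature vector to one of a few behavioral commands.
# All sizes, weights, and labels here are invented for the example.
COMMANDS = ["sit", "stand", "lie_down"]
random.seed(0)

def init_layer(n_in, n_out):
    # Small random weights; a deployed system would load trained parameters.
    w = [[random.gauss(0, 0.1) for _ in range(n_out)] for _ in range(n_in)]
    b = [0.0] * n_out
    return w, b

W1, B1 = init_layer(16, 8)             # 16 fused sensor features -> 8 hidden units
W2, B2 = init_layer(8, len(COMMANDS))  # hidden units -> command scores

def forward(x, w, b, relu=True):
    # One dense layer: weighted sum plus bias, optionally ReLU-activated.
    out = []
    for j in range(len(b)):
        s = b[j] + sum(x[i] * w[i][j] for i in range(len(x)))
        out.append(max(0.0, s) if relu else s)
    return out

def decide(sensor_features):
    """Return the most likely command and the softmax probabilities."""
    hidden = forward(sensor_features, W1, B1)
    logits = forward(hidden, W2, B2, relu=False)
    m = max(logits)                      # subtract max for numerical stability
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    return COMMANDS[probs.index(max(probs))], probs

command, probs = decide([random.gauss(0, 1) for _ in range(16)])
print(command)
```

In a real system the forward pass would run continuously on the onboard Jetson hardware, with each modality (camera, microphone, gas sensor, radar) contributing features to the input vector.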
The human brains behind Astro are a team of neuroscientists, IT experts, artists, biologists, psychologists, high school students and undergraduate and graduate students at FAU. At the helm of this project are Elan Barenholtz, Ph.D., an associate professor in FAU’s Department of Psychology, co-director of FAU’s MPCR laboratory, and a member of FAU’s Brain Institute (I-BRAIN), one of the university’s four research pillars; William Hahn, Ph.D., an assistant professor in FAU’s Department of Mathematical Sciences and co-director of FAU’s MPCR laboratory; and Pedram Nimreezi, director of intelligent software in FAU’s MPCR laboratory, chief technology officer for RedGage and a martial arts expert.
“Our Machine Perception and Cognitive Robotics laboratory team was sought out by Drone Data’s Astro Robotics group because of their extensive expertise in cognitive neuroscience, which includes behavioral, neurophysiological and embedded computational approaches to studying the brain,” said Ata Sarajedini, Ph.D., dean of FAU’s Charles E. Schmidt College of Science. “Astro is inspired by the human brain and he has come to life through machine learning and artificial intelligence, which is proving to be an invaluable resource in helping to solve some of the world’s most complex problems.”
Editor’s Note: This article was republished from Florida Atlantic University’s Machine Perception and Cognitive Robotics Laboratory.