
WPI researchers were inspired by birds and bats, which can navigate in low-visibility terrains filled with obstacles. | Source: Worcester Polytechnic Institute
A researcher at Worcester Polytechnic Institute is taking inspiration from bats to develop tiny flying robots for search and rescue. Nitin Sanket, an assistant professor of robotics engineering at WPI, is leading the project.
Imagine: a person is missing. The weather is poor. Maybe there’s fog, smoke, or dust. Maybe nightfall is closing in. The need to find the person doesn’t stop, but the conditions could prevent some aircraft from searching. Sanket doesn’t see this as an insurmountable problem.
“Helicopter-based search and rescue can cost as much as $100,000 per mission,” he told The Robot Report. “Lidar is good but power-hungry, and you can’t wait for smoke to clear in emergency conditions.”
Inspired by bats’ ability to navigate damp, dusty caves, Sanket’s team is designing aerial drones that use echolocation. Navigating by ultrasound could expand the area that can be searched at night, during wildfires, or in fog.
“I have always been fascinated by how nature’s expert flyers like insects and birds are able to effortlessly weave through tough obstacle courses while hunting prey,” Sanket said. “Our robots, though very complex, are no match for these biological flyers. This led me to ponder how we can draw inspiration from nature to build better autonomous aerial robots.”
Sanket and his team at WPI are creating both the hardware and software to allow aerial robots to fly autonomously. They use artificial intelligence to teach the robot how to filter and interpret sound signals and to learn how to navigate and avoid obstacles.
They have also designed hardware to minimize noise interference and to improve the reliability of the robots’ performance.
Sanket earns NSF grant to advance echolocation
Sanket received a $704,908 grant from the National Science Foundation to develop these aerial robots. He’s working with undergraduate and graduate students at WPI on the project in his laboratory on campus. The lab is equipped with a flying area where the team can test the robots they’ve designed and programmed.
The project focuses on enabling aerial robots, smaller than 100 mm (3.9 in.) and weighing less than 100 g (3.5 oz.), to navigate without relying on vision. Instead, Sanket’s team will develop a sound-based sensing system. However, ultrasound is a tricky form of sensing.
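The underlying principle, at least, is straightforward: emit a pulse, time the echo, and halve the round trip. Here is a minimal sketch of that calculation, with an assumed speed of sound and illustrative timing rather than anything from WPI’s hardware:

```python
# Minimal sketch of the principle behind ultrasound ranging: distance is the
# speed of sound times half the echo's round-trip time. The values here are
# illustrative, not taken from WPI's hardware.
SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def range_from_echo(round_trip_s: float) -> float:
    """Distance to a reflector, given the round-trip time of an ultrasonic ping."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

print(f"{range_from_echo(0.006):.2f} m")  # a 6 ms round trip is about 1.03 m of range
```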
The whir of robot propellers produces significant noise, and ultrasound typically struggles to distinguish small features. Sanket’s team is tackling these challenges on multiple fronts. On the hardware side, it uses metamaterials to reduce noise interference.
“When you have a normal material, it has general properties, but when you change the geometry, it starts behaving differently,” he said. “Smart design lets it modulate the sound. Think of flat plastic versus a squiggly design, which reflects less — think of the foam used in sound baffling.”
“We’re doing something similar to humans cupping their ears or bats changing the shape of their ears to collect sound,” explained Sanket. “We’re working with sensor manufacturers to emit low-power sound. Bats can scream at 140 decibels — this is hundreds of times less.”
The WPI researchers are also exploring different modes of propulsion such as flapping wings, he said.
WPI develops AI to help parse signals, coordinate drones
When it comes to software, the team applies physics-informed deep learning to filter and interpret the ultrasonic signals. The team uses a hierarchical reinforcement learning navigation stack that teaches robots how to move toward goals while avoiding obstacles.
“We’re carefully designing the neural network, which the mechanical design helps make smaller,” said Sanket. “I first tell students, it has to work on the robot — there is no cloud, no infrastructure. We’re figuring it out as we go.”
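One classical, physics-based way to pull a known ping out of noisy microphone data is a matched filter: cross-correlate the received signal with the chirp that was emitted, and the correlation peak marks the echo’s arrival. The sketch below is only a simplified stand-in for that idea, with an assumed sample rate, chirp, and noise level; it is not the team’s learned model.

```python
# Hypothetical matched-filter sketch: detect a known ultrasonic chirp buried in
# broadband noise by cross-correlating the received signal with the emitted ping.
# Sample rate, chirp parameters, and noise level are assumptions for illustration.
import numpy as np

FS = 200_000  # sample rate, Hz (assumed)
C = 343.0     # speed of sound in air, m/s

def chirp(f0, f1, dur):
    """Linear frequency sweep used as the emitted ping."""
    t = np.arange(0, dur, 1 / FS)
    return np.sin(2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * dur)))

rng = np.random.default_rng(0)
ping = chirp(40_000, 60_000, 2e-3)               # 2 ms sweep from 40 to 60 kHz
received = rng.normal(0, 1.0, 4096)              # broadband, propeller-like noise
start = int(0.006 * FS)                          # true echo arrives 6 ms after emission
received[start:start + ping.size] += 0.5 * ping  # faint, attenuated echo

# Matched filter: the cross-correlation peak marks the echo's time of arrival.
score = np.correlate(received, ping, mode="valid")
t_hit = int(np.argmax(np.abs(score))) / FS
print(f"estimated range: {C * t_hit / 2:.2f} m")  # ~1.03 m for a 6 ms round trip
```

In practice, the learned filter Sanket describes has to recover that same arrival time amid real rotor noise and overlapping echoes, within the robot’s onboard compute budget.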
Through this combination of robot perception, bio-inspired AI, and robot learning, Sanket aims to build low-cost, power-efficient drone swarms that can succeed where vision-based systems can’t.
He said he expects the drones to use sensor fusion to complement other sensor modalities with echolocation. They could eventually use ultrasound to help detect survivors’ heartbeats.
“Ultrasound’s resolution is poor compared with cameras, but we can go back to the biological models and work with IMUs [inertial measurement units] and other sensors,” said Sanket. “Bats co-evolved both things.”
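As a toy illustration of that kind of fusion, a complementary filter integrates the IMU at a high rate and nudges the estimate toward the ultrasound range whenever a slower ping arrives. Everything below, from the one-dimensional setup to the gains and update rates, is an assumption for illustration rather than WPI’s estimator.

```python
def fuse_step(est_range, est_vel, accel, dt, sonar_range=None, alpha=0.1):
    """One 1-D fusion step: predict with the IMU, correct with sonar when it arrives."""
    est_vel += accel * dt          # integrate IMU acceleration (high rate)
    est_range += est_vel * dt      # propagate the range estimate
    if sonar_range is not None:    # slower ultrasound update, when available
        est_range += alpha * (sonar_range - est_range)
    return est_range, est_vel

# Run 100 IMU ticks at 1 kHz with an ultrasound reading every 25th tick (40 Hz).
rng_est, vel_est = 1.2, 0.0
for k in range(100):
    sonar = 0.9 if k % 25 == 0 else None
    rng_est, vel_est = fuse_step(rng_est, vel_est, accel=0.0, dt=0.001, sonar_range=sonar)
print(f"fused range after 0.1 s: {rng_est:.2f} m")
```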
“We already do obstacle avoidance fairly well, but we want to do this faster than 2 m/sec. [4.5 mph], which is slow for search and rescue,” he noted. “At freeway speeds in a forest, sounds get compressed, which we have to consider in the models. We’re getting ready for deployment in the real world in three to five years.”
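The compression he describes is the Doppler effect on the echo: a sensor closing on an obstacle at speed v hears its ping return scaled by roughly (c + v) / (c - v). A quick back-of-the-envelope comparison, with speeds chosen only for illustration:

```python
# Two-way Doppler factor for a sensor flying straight at a stationary reflector.
# Speeds are illustrative; the speed of sound is assumed constant.
C = 343.0  # speed of sound in air, m/s

def two_way_doppler(v):
    """Factor by which an echo's frequency rises for closing speed v (m/s)."""
    return (C + v) / (C - v)

for v in (2.0, 30.0):  # ~4.5 mph vs. roughly freeway speed
    print(f"{v:4.0f} m/s -> echo frequency scaled by {two_way_doppler(v):.3f}")
# At 2 m/s a ping shifts by just over 1%; at 30 m/s it shifts by nearly 20%,
# the kind of effect the models have to account for.
```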
Other applications beyond search and rescue could include monitoring in disaster zones and hazardous environments, he said.