Bats inspire new search and rescue drones through echolocation
A team of researchers at Worcester Polytechnic Institute (WPI) are developing tiny aerial drones, inspired by bats’ use of echolocation, for search and rescue purposes.
Led by Nitin Sanket, an assistant professor of robotics engineering at WPI, the team aims to create a cost-efficient alternative to helicopter-based search and rescue. Drones that navigate by echolocation could search areas during emergencies in conditions such as night-time darkness, wildfires, or fog.
Sanket states that he has “always been fascinated by how nature’s expert flyers like insects and birds are able to effortlessly weave through tough obstacle courses while hunting prey”.
This curiosity caused Sanket to draw inspiration from nature, particularly from the biological workings of bats.
Contrary to popular belief, bats are not blind, but depend primarily on their ears and voice to navigate and find their prey. As they fly, bats release shouting sounds, and the echoes from these return information to their ears, creating an acoustic map of their surroundings. Such information includes the size of a nearby insect, for example, allowing the bat to judge whether it wants to catch it for food. This process of navigation through echoes, ‘echolocation’, is the model process for WPI’s drones.
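The core of echolocation is simple geometry: a pulse travels out, bounces off an object, and returns, so the round-trip time of the echo reveals the distance. A minimal sketch of that ranging calculation (the function name and timing value here are illustrative, not from the WPI project):

```python
# Minimal sketch of echolocation ranging: distance from echo round-trip time.
# The constant below is the speed of sound in air at roughly 20 degrees Celsius.

SPEED_OF_SOUND = 343.0  # metres per second

def echo_distance(round_trip_seconds: float) -> float:
    """Distance to an object, given the time for a pulse to travel out and echo back.

    The pulse covers the distance twice (out and back), hence the division by two.
    """
    return SPEED_OF_SOUND * round_trip_seconds / 2.0

# An echo heard 10 milliseconds after the pulse puts the obstacle
# about 1.7 metres away.
print(echo_distance(0.010))  # → 1.715
```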
The drones, smaller than 100 millimetres (3.9 inches) and weighing less than 100 grams (3.5 ounces), are tested in the university lab’s flying area. Here, the team, consisting of Prof. Sanket and undergraduate and graduate-level students, monitors how well the robots can navigate the area without relying on vision, instead using a sound-based sensing system.
One challenge is noise interference, especially from the robots’ propellers, which makes it difficult for the ultrasound system to pick out subtler sounds in the area.
To reduce this interference, the team uses metamaterials in the robots’ hardware design, which “change the geometry” of normal materials and allow “smart design” to “modulate the sound”, according to Sanket’s explanation.
Similar to “bats changing the shape of their ears to collect sound”, Sanket states that the team is “working with sensor manufacturers to emit low-power sound”, thus reducing the noise produced by the robot itself.
The robots’ software combines robot perception, bio-inspired AI, and robot learning, which together allow the drones to interpret ultrasound signals and navigate around obstacles.
Such ultrasound signals are expected to eventually be able to pick up survivors’ heartbeats, training the robots to perceive small but crucial signs of life in emergency conditions.
In the development of the drones, Sanket anticipates the use of sensor fusion, which will boost the efficiency of the sound-based system beyond the limits of vision-based systems. Sensor fusion combines data from multiple sensors to create an accurate input of environmental information.
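One common way to combine readings from multiple sensors is to weight each by how noisy it is, trusting the more reliable sensor more. The sketch below is illustrative only; the sensor names and noise figures are hypothetical, not details of the WPI system:

```python
# Illustrative sensor-fusion sketch: combine two noisy estimates of the same
# quantity (e.g. range to an obstacle) by inverse-variance weighting, so the
# less noisy sensor contributes more to the fused result.

def fuse(est_a: float, var_a: float, est_b: float, var_b: float) -> float:
    """Fuse two estimates, weighting each by the inverse of its variance."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    return (w_a * est_a + w_b * est_b) / (w_a + w_b)

# Suppose an ultrasound sensor reads 2.0 m (low noise, variance 0.01) while a
# drifting inertial prediction says 2.4 m (high noise, variance 0.09).
# The fused estimate lands much closer to the more reliable ultrasound reading.
print(fuse(2.0, 0.01, 2.4, 0.09))  # → 2.04
```

This inverse-variance rule is the one-dimensional core of filters widely used in robotics, such as the Kalman filter.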
Before deploying the robots in the real world, Sanket aims for the team to push their obstacle-avoidance speed beyond two metres per second (4.4 mph).
Expecting real-world application in three to five years, Sanket hopes that the robots can also be used to navigate unsafe areas, such as disaster zones and hazardous environments. Even beyond such initiatives, developments in sound-based navigation could benefit various other fields in science and technology, such as self-driving cars.