Bat-Like Sonar Can Help A Robot Navigate

Bats can fly dexterously in the dark in large part because echolocation—the biological version of the sonar that submarines rely on—guides them even when the night blinds them. Now researchers have created what they say is the first robot to truly use echolocation like a bat to help it explore the world autonomously. This research could lead to unmanned systems that can navigate even when they cannot ‘see’ with visual sensors, to the benefit of both flying drones and driverless cars.

MECHANICAL BAT

To echolocate, bats emit chirps and listen to the returning echoes, gleaning information about their surroundings from the reflected sound waves. Bats’ echolocation ability routinely helps them simultaneously map and navigate through new areas—“engineering problems that are currently very difficult to solve,” said Itamar Eliakim, a graduate student in the mechanical engineering department at Tel Aviv University in Israel.
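
The core physics is simple: the delay between emitting a chirp and hearing its echo reveals how far away the reflecting object is. The short Python sketch below illustrates that time-of-flight calculation; the speed of sound and the example delay are illustrative values, not figures from the Robat experiments.

SPEED_OF_SOUND = 343.0  # meters per second in air at about 20 degrees Celsius

def distance_from_echo(round_trip_delay_s: float) -> float:
    # The chirp travels out and back, so halve the round-trip distance.
    return SPEED_OF_SOUND * round_trip_delay_s / 2.0

# An echo arriving 35 milliseconds after the chirp implies a reflector about
# six meters away, roughly the scanning range described later in this article.
print(distance_from_echo(0.035))  # ~6.0 meters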

There have been many prior attempts to use echolocation to help robots map and navigate on land. However, those robots did not move autonomously; instead, human pilots used the sonar data to drive them. Moreover, whereas bats emit chirps from their throats and depend on just two ears to hear echoes, previous research mimicking bats usually relied on arrays of multiple speakers and multiple microphones.

Now, Eliakim and his colleagues have developed “Robat,” a robot that uses sonar like a bat to help it navigate autonomously. “Getting inspiration from animals can lead to new solutions,” said Yossi Yovel, an assistant professor of zoology at Tel Aviv University.

PROWLING THE GROUND

The prototype Robat does not fly, but rolls across the ground using the Komodo platform from Israel-based RoboTiCan. The robot was equipped with an ultrasonic speaker imitating a bat’s mouth and two ultrasonic microphones, spaced seven centimeters apart, mimicking bat ears, all mounted on a DJI Ronin gimbal.

Whereas previous work involving airborne sonar for robots depended on speakers that each broadcast a narrow range of sound frequencies, Robat emitted a wide range of ultrasonic signals just like bats do. The echoes of those signals convey rich amounts of information about the objects and surfaces they bounce off. This helped Robat navigate with just one emitter instead of several.
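
As a rough illustration of what such a wideband call might look like in software, the sketch below uses NumPy and SciPy to generate a 10-millisecond frequency sweep. The particular frequency band is an assumption made for illustration, not the band the Robat team used.

import numpy as np
from scipy.signal import chirp

SAMPLE_RATE = 500_000   # samples per second, high enough to capture ultrasound
DURATION = 0.010        # a 10-millisecond call, like the chirps described below

t = np.linspace(0.0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)
# Sweep downward from 100 kHz to 20 kHz, covering a wide ultrasonic band
# (the exact frequencies here are assumptions, not Robat's specifications).
signal = chirp(t, f0=100_000, t1=DURATION, f1=20_000, method="linear")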

In experiments, Robat moved at roughly a meter per minute, stopping every 30 seconds or so to give three chirps, each 10 milliseconds long, while aiming its speaker at three different angles. This let the robot scan out to a range of about six meters.
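
A simplified sketch of that stop-and-scan routine appears below. The helpers aim_speaker, emit_chirp and record_echo are hypothetical placeholders for the robot’s hardware interface, and the three aiming angles are illustrative rather than the team’s actual values.

SPEED_OF_SOUND = 343.0                          # meters per second
MAX_RANGE = 6.0                                 # meters, the scanning range described above
LISTEN_WINDOW = 2 * MAX_RANGE / SPEED_OF_SOUND  # about 35 ms for the farthest echo to return

def scan_once(aim_speaker, emit_chirp, record_echo):
    # At each stop, chirp at three different angles and collect the echoes.
    echoes = []
    for angle_degrees in (-60, 0, 60):          # illustrative aiming angles
        aim_speaker(angle_degrees)
        emit_chirp(duration_s=0.010)            # 10-millisecond chirp
        echoes.append(record_echo(LISTEN_WINDOW))
    return echoes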

Robat then used a kind of artificial intelligence system known as an artificial neural network to analyze the echoes. In such a system, components dubbed neurons are fed data and cooperate to solve a problem, such as recognizing images. The neural net repeatedly adjusts the behavior of its neurons and sees if these new patterns of behavior are better at solving the problem. Over time, the network discovers which patterns are best at computing solutions, and then adopts these as defaults, mimicking the process of learning in the human brain. This is the first ground-based robot to use a neural net to help it analyze sonar data, Yovel said.
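
The toy example below shows, in outline, what training such a network on echo data could look like, using the scikit-learn library. The features, labels and network size are all invented for illustration; this is not the Robat team’s actual model or data.

import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Pretend each recorded echo has been reduced to a 64-bin spectrum,
# and that 200 of them have been labeled by hand.
echo_spectra = rng.random((200, 64))
is_plant = rng.integers(0, 2, size=200)        # 1 = plant, 0 = something else

# A small feed-forward network that repeatedly adjusts its neurons' weights
# over many passes through the data, as described above.
net = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500)
net.fit(echo_spectra, is_plant)
print(net.predict(echo_spectra[:1]))           # classify one echo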

The robot could recognize whether objects were plants or not, and it changed its heading when necessary to avoid obstacles, all fully autonomously. In experiments, Robat successfully navigated two greenhouses without colliding with anything.

“We are able to map a new environment accurately and we are able to use a machine-learning algorithm to learn to classify objects,” Yovel said. “We can solve the problem of autonomous navigation using sound like bats do.”

SOUND AND VISION

This echolocation research could benefit “any robot that needs to navigate—that is, most robots,” Eliakim said. This includes, he added, vacuum cleaner robots that need to maneuver around living rooms, agricultural robots navigating greenhouses and rescue robots threading their way through a collapsed building after an earthquake.

Unmanned systems such as driverless cars often navigate with the aid of LiDAR, which bounces light pulses off obstacles just as echolocation bounces ultrasonic signals. Eliakim noted a Robat-like approach might have a number of advantages over LiDAR.

To begin with, fog, smoke and glare can prove troublesome for LiDAR just as they do for eyes or visual sensors. Sonar can scan right through such challenges, Eliakim said.

In addition, LiDAR emits far more pulses than ultrasound. “This means LiDAR has a lot more data to churn, and requires a lot more computing power, which is a real disadvantage,” said Rolf Mueller, a professor of mechanical engineering at Virginia Tech. Mueller has also investigated bat echolocation and its possible uses for drones, although he did not work with Eliakim and his colleagues on this project.

LiDAR also can be expensive, Mueller said. In comparison, “we can develop our sensing unit for under $15,” Eliakim said.

Furthermore, many LiDAR systems that cost less than $100, such as those used in robotic vacuum cleaners, rely on 740-nanometer near-red wavelengths of light, Eliakim said. This is a big problem when working outdoors where reflected sunlight can lead to many inaccurate readings, making such systems “almost unusable in the daytime,” he explained.

One of the biggest problems LiDAR faces is reflections off mirrors and glass. LiDAR can mistake reflections for real objects, or it can fail to recognize mirrors and windows as solid surfaces, instead treating them as open space. A Robat-like system would not have this problem, Eliakim said.

Still, ultrasound can have problems with reflected sound. For example, if a drone equipped with ultrasound were to approach a flat wall at a very shallow angle, “the ultrasound would get reflected off that wall in such a way that the drone would not see it, and it could crash into the wall,” Mueller said.

Moreover, LiDAR, time-of-flight cameras and other light-based commercial systems are more accurate at measuring distances than acoustic techniques, Eliakim said. Good optical systems are accurate to within 1 to 5 millimeters, whereas “in our acoustic solution, the accuracy is around 5 centimeters,” he said. (Bat sonar can achieve a resolution of roughly 1 centimeter, he noted.)
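
As a back-of-the-envelope check on those figures, an acoustic system’s range error is set by how precisely it can time echo arrivals, so a 5-centimeter accuracy corresponds to resolving round-trip delays to within roughly a third of a millisecond. The calculation below is an illustration, not from the researchers.

SPEED_OF_SOUND = 343.0                        # meters per second
range_accuracy = 0.05                         # 5 centimeters, as quoted above
timing_precision = 2 * range_accuracy / SPEED_OF_SOUND
print(timing_precision * 1000)                # ~0.29 milliseconds of round-trip timing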

Ultrasound sensors are currently used in driverless cars for slow-movement applications such as autonomous parking or autonomous braking in traffic jams, Eliakim said. A Robat-like system could improve the resolution of these systems—for instance, while a driverless car’s ultrasound sensors can currently say there is an object on a sidewalk, a Robat-like system could point out the exact location of that object on the sidewalk, he explained. “However, cameras on cars might be able to solve the same problem,” Mueller said.

There are many possible improvements the researchers could add to Robat, “such as, for example, using bat-like ears placed around our microphones,” which would help channel sound into the sensors, Eliakim said. They also could train the neural network with many more echoes so it can learn to recognize more kinds of objects, he said.

And, of course, “in the future, we would also like to mount our sensing unit on a flying robot,” Eliakim said. “We are aiming to create a 3-D version of the system, and put it on a drone so we will be able to map an entire environment in 3-D.”