The ability of bats to understand and use sound has always been essential to their survival. Though these winged creatures have eyes, they rely heavily on echolocation: they use the echoes of sounds they produce themselves to navigate their environment and to locate themselves, other objects, and other creatures.
Using that fast-flying nocturnal animal's ability as a model, Sangwook Park, a postdoctoral fellow in the Whiting School of Engineering's Department of Electrical and Computer Engineering, is creating a computer model called an artificial midbrain. The model not only improves our understanding of how creatures that use echolocation communicate and navigate, but could also inform the design of new systems and technologies, including enhanced sonar that helps submarine crews navigate more accurately thousands of feet underwater.
"We believe that we're able to invent the next generation of sonar system, which aims to identify a target, as well as to localize the target, based on the bat's fascinating ability in auditory perception," Park said.
A member of electrical and computer engineering Professor Mounya Elhilali's Laboratory for Computational Audio Perception, Park collaborated with Angeles Salles and Kathryne Allen, postdoctoral fellows in the lab of Cynthia Moss, a professor in the Krieger School of Arts and Sciences' Department of Psychological and Brain Sciences. Park's part of the project began with a hypothesis based on a previous study of Mexican free-tailed bats, which suggested a possible connection between individual neurons found in the bats' midbrain region and calls made by members of their own species, known as "conspecific sounds."
Moss's group observed how the brains of big brown bats (also called "house bats," a species distinct from Mexican free-tailed bats) responded to sounds, and what role the midbrain played in that process. The group recorded the animals' neural responses and then compared those findings to what went on in the big brown bats' midbrain during vocalization.
The results confirmed the same relationship between midbrain neurons and conspecific sounds in the big brown bat, even though its calls differ from those of the Mexican free-tailed bat. In essence, the big brown bat's midbrain processes the calls it makes itself in much the same way it processes other sounds.
"This discovery allowed us to build a data-driven model: basically, an artificial midbrain of the bat that can understand and process sounds in much the same way as the bat does so successfully," Park said.
Park's artificial midbrain is a computational network that generates sounds with the same characteristics as natural ones and then emulates how a big brown bat responds to those sounds.
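The article does not include any code, but the general idea of a data-driven midbrain model can be illustrated with a toy sketch. The example below is purely hypothetical and is not Park's actual model: it synthesizes a downward frequency-modulated chirp as a stand-in for a bat call, converts it to a spectrogram, and passes it through a small bank of random spectrotemporal filters whose outputs stand in for simulated midbrain neuron responses. All function names and parameters are assumptions made for illustration.

```python
# Hypothetical sketch of a data-driven "artificial midbrain":
# synthesize a bat-call-like chirp, compute its spectrogram, and
# pass it through a toy bank of spectrotemporal filters whose
# outputs stand in for simulated midbrain neuron responses.
# Illustration only; not the model described in the article.
import numpy as np
from scipy.signal import chirp, spectrogram

FS = 250_000  # sampling rate (Hz); bat calls are ultrasonic

def synthesize_call(duration=0.003, f_start=60_000, f_end=25_000):
    """Downward FM sweep, loosely shaped like a big brown bat call."""
    t = np.linspace(0, duration, int(FS * duration), endpoint=False)
    call = chirp(t, f0=f_start, t1=duration, f1=f_end, method="linear")
    return call * np.hanning(len(t))  # smooth onset and offset

def to_spectrogram(signal):
    """Time-frequency representation used as the model's input."""
    f, t, s = spectrogram(signal, fs=FS, nperseg=128, noverlap=96)
    return np.log1p(s)

def simulated_midbrain_responses(spec, n_neurons=16, seed=0):
    """Toy 'neurons': random spectrotemporal filters applied to the
    spectrogram, each yielding one nonnegative response value."""
    rng = np.random.default_rng(seed)
    filters = rng.normal(size=(n_neurons, spec.size))
    return np.maximum(filters @ spec.ravel(), 0.0)

if __name__ == "__main__":
    call = synthesize_call()
    spec = to_spectrogram(call)
    responses = simulated_midbrain_responses(spec)
    print("simulated responses of", len(responses), "model neurons:")
    print(np.round(responses, 2))
```

In a genuinely data-driven model like the one described here, the filters would be fit to recorded neural responses rather than drawn at random; the sketch only shows the overall flow from synthetic call to simulated response.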
"We think that our work has a number of practical applications, including helping submarine crews understand their surroundings better," Park said. "Presently, the technology submarines use only tells the crew if something is nearby, rather than what it is exactly. When a sonar system receives multiple echoes from unknown objects, our model could help the crew recognize which one is a threat to damage the submarine—such as the bottom of the sea floor—and which is not, like a big fish floating by."
Park pursued this research because of his fascination with the brain. He joined Elhilali's lab for his postdoctoral fellowship so he could combine that interest with his engineering background. In addition to the university's top-of-the-line facilities and equipment, Park credits the culture at Hopkins with playing an integral role in this work.
"I have really liked that Hopkins gives you more room to grow as a researcher. It isn't all about hitting deadlines like it can be at other universities," Park said. "I felt fully free to just concentrate on my work; it's a great environment to conduct research."