As animals go, humans have relatively limited senses. We can’t smell as well as dogs, see as many colors as mantis shrimp, or find our way home using the Earth’s magnetic field as sea turtles do. But there’s one animal sense we can learn: bat-like echolocation.

Researchers in Japan demonstrated this feat in a paper published in the journal PLOS ONE, showing that humans can use echolocation, the ability to locate objects through reflected sound, to identify the shape and rotation of various objects without light.

As bats swoop around objects, they send out high-pitched sound waves that then bounce back to them at different time intervals. This helps the tiny mammals learn more about the geometry, texture, or movement of an object.
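
To make that timing cue concrete, here is a minimal sketch (an illustration of the standard echo-ranging arithmetic, not anything from the study): sound travels out and back, so an echo delay of Δt implies a target at distance c·Δt/2, where c is the speed of sound.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature

def echo_distance(delay_s: float) -> float:
    """Distance to a reflector, given the round-trip echo delay in seconds."""
    # The pulse travels to the target and back, so halve the total path.
    return SPEED_OF_SOUND * delay_s / 2

# An echo arriving 10 milliseconds after the call implies a target ~1.7 m away.
print(f"{echo_distance(0.010):.2f} m")  # -> 1.72 m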

If humans can similarly recognize these three-dimensional acoustic patterns, it could expand how we perceive the world, says study author Miwa Sumiya, Ph.D., a researcher at the Center for Information and Neural Networks in Osaka, Japan.

“Examining how humans acquire new sensing abilities to recognize environments using sounds, or echolocation, may lead to the understanding of the flexibility of human brains,” says Sumiya. “We may also be able to gain insights into sensing strategies of other species by comparing with knowledge gained in studies on human echolocation.”

[Illustration: a dolphin using echolocation to locate fish. Dolphins also use echolocation to identify and hunt down fish. Credit: Dorling Kindersley/Getty Images]

This study is not the first to demonstrate echolocation in humans; previous work has shown that people who are blind can use mouth-clicking sounds to “see” two-dimensional shapes. But Sumiya says this study is the first to explore a particular kind of echolocation called time-varying echolocation. Beyond simply locating an object, time-varying echolocation lets users perceive its shape and movement as well.

To test participants’ ability to echolocate, Sumiya’s team gave them headphones and two tablets: one to generate the synthetic echolocation signal, and the other to play back the recorded echoes. In a second room, not visible to the participants, two oddly shaped cylinders would either rotate or stand still. In cross-section, the cylinders resembled a bike wheel with either four or eight spokes.

When prompted, the 15 participants triggered their echolocation signals through the tablet. The sound waves were released in pulses, traveling into the second room and bouncing off the cylinders.

It took a bit of creativity to transform the sound waves back into something the human participants could recognize. “The synthetic echolocation signal used in this study included high-frequency signals up to 41 kHz that humans cannot listen to,” Sumiya explains. For comparison, bat echolocation signals in the wild range from about 9 kHz all the way to 200 kHz, mostly well above our hearing range of roughly 20 Hz to 20 kHz.

[Photo: the test setup, with a dummy head and tablets. When participants tap on the Android tablets, a synthetic echolocation signal is emitted from a loudspeaker. The recorded binaural sounds, pitch-shifted to 1/8 of the original by lowering the sampling frequency, are played to the participants through headphones. Image courtesy of Miwa Sumiya]

The researchers employed a one-seventh scale dummy head with a microphone in each ear to record the sounds in the second room before transmitting them back to the human participants.

The microphones rendered the echoes binaural, like the surround sound you might experience at a movie theater or in an autonomous sensory meridian response (ASMR) video recorded with a binaural mic. The recorded signals were also pitch-shifted down to one-eighth of their original frequency so the human participants could hear them “with the sensation of listening to real spatial sounds in a 3D space,” says Sumiya. Because binaural cues depend on the size of the head relative to the wavelength of the sound, recording with a miniature head and then shifting the pitch down by a comparable factor roughly preserves those spatial cues at audible frequencies.
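
As a rough illustration of that trick (a sketch under assumed details, not the study’s actual pipeline): playing samples back at one-eighth of the rate they were recorded at stretches the waveform in time and divides every frequency by eight, which brings a 41 kHz component down to 41/8 = 5.125 kHz, comfortably inside human hearing. With Python’s standard wave module, that amounts to rewriting the file header with a reduced sample rate; the file name and capture rate here are hypothetical.

import wave

PITCH_DIVISOR = 8  # the study lowered pitch to one-eighth of the original

# "ultrasonic_echo.wav" is a hypothetical recording captured at a sample rate
# high enough to contain 41 kHz content. Rewriting the header with a reduced
# frame rate makes players render the same samples eight times slower,
# dividing every frequency by eight: 41 kHz lands at 5.125 kHz.
with wave.open("ultrasonic_echo.wav", "rb") as src:
    params = src.getparams()
    frames = src.readframes(params.nframes)

with wave.open("audible_echo.wav", "wb") as dst:
    dst.setparams(params._replace(framerate=params.framerate // PITCH_DIVISOR))
    dst.writeframes(frames)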

Finally, the researchers asked participants to determine whether the echoes they heard came from a rotating or a stationary object. In the end, participants could reliably identify the two cylinders when they were rotating, by listening to the pitch of the time-varying echoes.
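
One way to build intuition for why rotation shows up as pitch (a physical sketch, not the paper’s analysis): a surface element swinging toward the listener reflects sound at a slightly higher frequency than one swinging away, per the ordinary two-way Doppler relation, so a spinning object smears its echo into a fluctuating band of pitches while a stationary one returns a steady tone.

SPEED_OF_SOUND = 343.0  # m/s in air

def echo_frequency(f_emit_hz: float, v_mps: float) -> float:
    """Two-way Doppler-shifted echo frequency for a reflector moving at
    speed v (positive v = moving toward the listener)."""
    return f_emit_hz * (SPEED_OF_SOUND + v_mps) / (SPEED_OF_SOUND - v_mps)

# A surface element approaching at 1 m/s shifts a 41 kHz pulse up ~240 Hz;
# a receding element shifts it down ~238 Hz. A rotating object contains both.
print(f"{echo_frequency(41_000, +1.0) - 41_000:+.0f} Hz")  # -> +240 Hz
print(f"{echo_frequency(41_000, -1.0) - 41_000:+.0f} Hz")  # -> -238 Hz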

They were less adept at identifying the shapes from the stationary cylinders. Nevertheless, the researchers say that this is evidence that humans are capable of interpreting time-varying echolocation.

Sumiya hopes this work could one day help humans perceive their spatial surroundings in new ways, for example, by helping visually impaired users better sense the shape and features of objects around them.

The next step for this research is to give participants freedom to move around when they’re interpreting these echolocation signals, Sumiya says. That will more closely mimic the action bats might take when using echolocation “because echolocation is ‘active’ sensing.”


How Visually Impaired People Develop a New Sense


Losing one sense can heighten the others, a phenomenon known as neural reuse or neural repurposing, in which the brain adapts to strengthen the remaining senses. It has helped some people who are blind develop two-dimensional echolocation by making clicking sounds with their mouths.

Research shows that the primary visual cortex, a region in the occipital lobe involved with visual processing, can restructure itself to treat the echoes from the clicks as visual stimuli. In essence, the brain can “see” the echoes as they bounce back and use the sound to help a person reconstruct the space and objects around them. This has given some echolocators the ability to draw a room and its contents merely by walking around it while making clicking sounds and listening for the echoes.

—Daisy Hernandez


Sarah Wells

Sarah is a science and technology journalist based in Boston interested in how innovation and research intersect with our daily lives. She has written for a number of national publications and covers innovation news at Inverse.