Mysteries of Ocean Music: Deciphering whale sounds with the help of AI to better understand underwater ecosystems

Southern Resident killer whales swimming in Active Pass off coastal British Columbia. Credit: H. Yurk.

As human beings, we tend to focus our attention on what we can see. But across the vast darkness of the ocean, what we can hear may be much more important. Blue whales, to give just one example, have been known to communicate across distances of more than 200 km underwater. And in Canada, home to the longest coastline in the world, scientists have their work cut out for them when it comes to learning more about the mysteries of the deep ocean.

Perhaps not surprisingly, most interest and research tends to focus on coastal marine areas, where human activities such as fishing, tourism and recreational boating are concentrated, even though large commercial ships travel significant distances across the open ocean. Coastal waters may seem to be where human activities are most likely to affect marine life; however, vessel collisions with marine mammals such as whales can occur anywhere and remain an ongoing issue for mariners.

An acoustic mooring being recovered after a 9 month deployment; photo courtesy of Caitlin O’Neill.

So what happens when those whales stray beyond what we can see? The reality is that we don't have nearly enough information available to us to fully comprehend what happens under the surface in most of the ocean, and we know relatively little about whale-vessel interactions in outer shore areas.

Dr. Harald Yurk is an evolutionary ecologist with Fisheries and Oceans Canada whose work focuses on animal bioacoustics: using sounds made by ocean animals to detect and track their movements and to learn more about their behaviours. His team uses what's known as passive acoustic monitoring, or PAM, to detect whale calls and understand their movements. This approach allows them to monitor well beyond the roughly 12 nautical miles from shore within which most human activities tend to occur.

In practice, animal and other ocean sounds are recorded via moorings, essentially digital audio recording devices anchored to the ocean floor, which are deployed to form a recording network that can help build a picture of a given underwater soundscape. The recorded sounds are then used to create spectrograms, or images of sound, which show how frequency and amplitude change over time.
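For readers curious about what that looks like in practice, here is a minimal sketch of turning a short hydrophone recording into a spectrogram using Python. The file name, FFT window settings and decibel scaling below are illustrative assumptions, not details of the team's actual processing chain.

```python
# Minimal sketch: compute and display a spectrogram from a hydrophone clip.
# "hydrophone_clip.wav" and the FFT parameters are illustrative assumptions.
import numpy as np
import matplotlib.pyplot as plt
from scipy.io import wavfile
from scipy.signal import spectrogram

sample_rate, audio = wavfile.read("hydrophone_clip.wav")
if audio.ndim > 1:            # keep a single channel if the file is stereo
    audio = audio[:, 0]
audio = audio.astype(np.float64)

# Short-time Fourier transform: how much energy is present at each
# frequency during each short slice of time.
freqs, times, power = spectrogram(audio, fs=sample_rate,
                                  nperseg=2048, noverlap=1024)

# Plot on a decibel scale so quieter calls remain visible.
plt.pcolormesh(times, freqs, 10 * np.log10(power + 1e-12), shading="auto")
plt.xlabel("Time (s)")
plt.ylabel("Frequency (Hz)")
plt.title("Spectrogram of a hydrophone recording")
plt.show()
```

Each column of the resulting image summarizes the sound energy across frequencies during one short moment in time, and it is these images that analysts, and machine learning models, scan for whale calls.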

Naturally, the ocean can be a busy place, filled with a range of sounds that researchers must filter through to find and identify whale calls and songs. With the help of AI-based machine learning, whale sounds can be isolated from the mix and categorized accordingly.
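To make that step concrete, the sketch below shows one common supervised-learning pattern: label short spectrogram segments and train a classifier to separate whale calls from other ocean noise. It uses scikit-learn, randomly generated placeholder data and a random forest purely for illustration; the team's actual deep learning system is not described here, and nothing in this example should be read as their implementation.

```python
# Illustrative sketch only: train a classifier to label spectrogram segments
# as "whale call" vs. "other ocean noise". The data below are random
# placeholders; real rows would be features from labelled recordings.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64 * 32))   # 200 fake segments, 64x32 spectrogram bins
y = rng.integers(0, 2, size=200)      # 0 = other ocean noise, 1 = whale call

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# How well does the model separate the two classes on held-out segments?
print(classification_report(y_test, model.predict(X_test)))
```

In a real pipeline the rows of X would come from labelled spectrogram segments, and a model like this, or a deep neural network trained directly on the spectrogram images, would flag likely whale calls for analysts to review and categorize.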

But knowing where whales are isn't enough. We also need to understand how and why they're behaving as they do, so that we can better understand where they may go in the future.

Fin whale breaching. Credit: Lucy Quayle.

"We sometimes forget that most of the underwater world of our oceans is relatively unknown territory in terms of our understanding of how various ocean ecosystems function and interact between one another," Harald Yurk notes. "When it comes to whales and mitigating potential negative impacts from human interactions, as a starting point we need to be able to understand how they're using the spaces that they inhabit. For example, how long will a whale or group of whales stay in a given area, during a given season? And how, why and when do they move between areas?"

Just as music is generally distinguished from other sounds by its discrete harmonic structures, identifying whale sounds helps us bring knowledge and understanding to a cacophony of underwater noise. And tracking what whales do in places where we can't see them can tell us a great deal about their behaviours, which in turn can help us better understand their needs and co-exist with them respectfully.

Sound, Song, and Language

Dr. Yurk believes that music may have evolved before language, which could help explain why human beings tend to rely so heavily on non-verbal cues. Often, we can understand things that we can't quite explain in words.

Audio: Recording of whale vocalizations and other ocean noises

For example, we may hear a song that feels "warm" to us, without necessarily being able to explain why. Similarly, when Dr. Yurk listens to whale recordings, he can generally tell which calls were made by a killer whale (also known as an orca) that eats other mammals versus one that eats fish. These two ecotypes of killer whale do not generally interact with one another because they belong, so to speak, to different cultures, and the types of sounds they make are distinct. Interestingly, these differences in behaviour and food preferences, which allow killer whales to live in the same areas without competing for food, are a big part of why killer whales are the second most widely distributed species on planet Earth!

(In case you hadn't already guessed, we humans are the most widely distributed species for similar reasons: our ability to vary our vocal cultures, food preferences and foraging strategies allows us to live in many different places and types of environments.)

"Resident killer whales eat fish, and can be quite vocal and noisy," says computer scientist Holly LeBlond, who helps analyze whale sounds as part of Dr. Yurk's team. "Transient killer whales on the other hand eat other mammals, and tend to be less vocal, maybe for obvious reasons in terms of not giving too much advance warning to their prey."

In much the same way that a song might feel 'warm' to us for reasons that remain obscure, whale sounds can be fed into machine learning systems and identified with reasonably high accuracy even in the absence of clearly defined distinguishing markers. This gap between what the systems can do and what they can explain is part of what makes recent research on artificial intelligence so fascinating.

"We can input certain whale sounds into our deep learning AI systems, and they can tell us what type of whale it is, without being able to explain how it's able to distinguish. The mysteries that we're trying to unravel remain theoretically elusive, even if we now have tools that can help us in very practical ways in certain contexts."

This is the crux of the work being carried out by Dr. Yurk and his team: only once they understand the 'why' can they discover what is really important to the whales, including how and to what extent human activities affect them. In other words, AI can help us distinguish differences, but until we understand the deeper meanings we'll remain in the dark about what's really going on in a whale's mind.

Photographing whales offshore. Credit: H. Yurk.

"Our starting point is always: I don't know why animals do what they do. From there, we develop a hypothesis based on what we have learnt from other species and observations. Then we collect data, analyze, and try to better understand why this animal behaves the way it does at the time and place where it is found," Dr. Yurk sums up.

"Not knowing is not a bad thing, because that's where discovery begins," he underscores. "Learning more about what we don't understand can help us to live together better."
