Every day, millions of us try to have conversations with our pets. We ask our dogs if they're good boys, and we tell our cats about our day, knowing they're judging us silently. At some point, you've probably looked at your pet and genuinely wondered: what are you actually thinking right now?

Thanks to AI, we might be closer to finding out.

Over the past decade, researchers have been using artificial intelligence to decode animal communication. The goal is something like Google Translate for the animal kingdom. Scientists want to understand what animals are saying to each other, rather than teaching them to use our language.

Recent research reveals that animal communication is far more sophisticated than we assumed. Dolphins have signature whistles that function like names, and they recognise each other's whistles even after decades of separation. In 2024, a study in Nature Ecology & Evolution found that elephants call out to specific individuals using unique vocal labels, suggesting "the ability for abstract thought."

Prairie dogs have one of the most sophisticated animal languages we've decoded. Dr. Con Slobodchikoff at Northern Arizona University found they have specific calls that encode the kind of predator, its size, its speed, and even its colour. One call might essentially mean "tall human in blue shirt approaching quickly." That's functionally grammar.

Prairie dogs: Sophisticated.

In April 2025, a study in Science found that bonobos string together sequences of calls that form phrases with meanings beyond their individual parts. Dr. Mélissa Berthet concluded that "human language is not as unique as we thought." If the building blocks of language are shared across species, it changes how we think about our place in the natural world.

Machine learning algorithms excel at finding patterns in large datasets, which is exactly what decades of animal recordings represent.
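To make that claim a little more concrete, here is a minimal, hypothetical sketch of what "finding patterns in recordings" can look like in practice. It is not any of the projects' actual pipelines: it assumes a folder of short, pre-cut call clips (the file names are invented), summarises each clip as spectral features, and clusters them so a researcher can check whether the groups line up with behaviour.

```python
# Hypothetical sketch of pattern-finding in animal recordings, not any
# project's actual pipeline. Assumes librosa and scikit-learn are installed
# and that "call_*.wav" are short, pre-cut call clips (invented file names).
import glob

import numpy as np
import librosa
from sklearn.cluster import KMeans


def clip_features(path, sr=22050, n_mfcc=20):
    """Summarise one clip as a fixed-length vector of MFCC statistics."""
    audio, _ = librosa.load(path, sr=sr)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=n_mfcc)
    # Mean and spread of each coefficient over time, so clips of different
    # lengths still produce vectors of the same size.
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])


paths = sorted(glob.glob("call_*.wav"))
features = np.stack([clip_features(p) for p in paths])

# Group the clips into candidate call "types"; a researcher would then check
# whether each cluster corresponds to a context (food, threat, greeting...).
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(features)
for path, label in zip(paths, labels):
    print(label, path)
```

Real systems work with far larger datasets and deep neural networks rather than a simple k-means clustering, but the shape of the problem is the same: turn sound into numbers, then look for structure. The projects below all do some version of this.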

Five animal-interpretation projects

  1. Professor Yossi Yovel's team at Tel Aviv University collected 15,000 bat calls over two and a half months. When they ran this data through a machine learning algorithm, the AI learned to distinguish different types of arguments, predict whether bats were fighting over food or sleeping spots, and identify which specific bat was speaking by detecting acoustic features too subtle for human researchers to notice.

  2. Project CETI has been using deep learning to analyse sperm whale clicks. Human researchers had identified 21 different patterns in one Caribbean whale clan. When the AI got involved, it found 156 distinct patterns. Dr. Shane Gero says this reveals a much richer "phonetic alphabet" than previously realised.

  3. Dr. Denise Herzing's Wild Dolphin Project developed an underwater computer called CHAT, programmed with invented whistle signals for objects dolphins like. Then one day, a dolphin spontaneously produced the "seaweed" whistle on its own. This was the first time an animal in the wild had seemingly used an invented "word" with a human.

  4. In Europe, scientists built tiny robot bees that can infiltrate hives and perform the waggle dance. After years of refinement, real bees started following the robot's instructions to new feeding sites. We've learned to speak bee well enough that actual bees will listen.

  5. Dr. Con Slobodchikoff has founded a company called Zoolingua, which is developing AI that interprets dog barks, whines, tail wags and ear positions. Many dogs end up in shelters due to "behavioural issues" that might actually be pain or anxiety that owners aren't recognising. An AI that could identify when a dog is in pain could lead to earlier veterinary care.

Why would animals want to talk to us?

This is genuinely difficult work, and we shouldn't expect to have philosophical debates with our pets about consciousness anytime soon.

Most animal communication evolved for them to talk to each other, not to us. Dr. Herzing emphasises that we have to be careful not to project our own desires onto animals.

There's a concept in biology called Umwelt, the idea that every species lives in its own perceptual world. A bee's waggle dance doesn't just convey "nectar 100 metres east." It includes the scent of flowers, the angle of the sun, the quality of the food source. Even if we decode what an animal is saying, we might be missing crucial context.

Deep neural networks are often black boxes. An AI might infer that a certain chimp call means "I found food," but verifying that requires playback experiments that are time-consuming and not always feasible.

There are also ethical implications. A global survey by the Earth Species Project found that people's biggest fear about this technology is that it will be used to manipulate animals rather than help them. Some ethicists are arguing that animals should have rights over recordings of their vocalisations.

I doubt conversations with my cat would be that interesting. "Feed me now" would be her general command. Most of the time she’s asleep or outside.

What strikes me about this research is that every discovery makes it harder to see animals as undifferentiated masses. When you learn that elephants have names for each other, that dolphins have regional dialects, that bonobos create new meanings by combining calls, it becomes difficult not to see them as individuals with cultures and societies of their own.

In the 1970s, when Roger Payne first played recordings of humpback whale songs to the public, people were so moved that it helped launch the Save the Whales movement. Now imagine if a translator could convey a whale saying, "The ocean is too noisy. We can't find each other anymore."

Within the next decade or two, we might have functional translators for certain species. If AI gives animals a voice, it won't just change how we see them. It'll change how we see ourselves. We'll no longer be the only minds on this planet that matter.

Watch the video: AI is Learning to Talk To Animals (And They’re Talking Back)
