Google, in collaboration with the Georgia Institute of Technology and the Wild Dolphin Project – a nonprofit researching dolphins in the Bahamas – has announced significant progress in developing an artificial intelligence designed to decode dolphin communication and potentially even "speak" it.
Amid growing discussions about AI's impact on human life, this unique new model is extending the boundaries of language into the ocean.
The new model, DolphinGemma, was trained on tens of thousands of hours of acoustic recordings of Atlantic spotted dolphins (Stenella frontalis), which communicate through clicks, whistles, and rapid sequences of sounds called burst pulses. Dolphins use these complex sound sequences during play, courtship, and sometimes confrontations.
"When I first heard the sounds generated by the model, I danced around the room," said Dr. Thad Starner, an AI researcher with Google and Georgia Tech. "For years, I tried to produce burst pulses with regular software and couldn't do it. But the model created them on its own – no lines of code – just from what it learned from the data."
Unlike earlier attempts to translate dolphin sounds into human words, the new model doesn't try to "translate" at all. Instead, it identifies and constructs sound patterns that mimic dolphin vocalizations. The aim is for the system to learn to associate specific sound patterns with specific contexts – playing with a companion, encountering a new object, or facing off in a conflict.
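To make that idea concrete, here is a minimal, illustrative sketch of linking short sequences of discretized sound units to the contexts in which they were recorded. The token names, context labels, and simple counting approach are invented for illustration; they are not DolphinGemma's actual method, which is a large audio sequence model rather than anything hand-counted.

```python
# Illustrative sketch only: a toy frequency model that links short "sound unit"
# sequences to the behavioral context in which they were recorded. All token
# and context names below are invented placeholders.
from collections import Counter, defaultdict

# Hypothetical labeled clips: (sequence of discretized sound units, observed context)
clips = [
    (("whistle_A", "click_train", "whistle_A"), "play"),
    (("burst_pulse", "burst_pulse", "click_train"), "conflict"),
    (("whistle_A", "whistle_B"), "play"),
    (("click_train", "whistle_C"), "novel_object"),
]

# Count how often each pair of consecutive sound units appears in each context.
context_bigrams = defaultdict(Counter)
for tokens, context in clips:
    for a, b in zip(tokens, tokens[1:]):
        context_bigrams[context][(a, b)] += 1

def most_likely_context(tokens):
    """Score a new sound sequence against each context by bigram overlap."""
    scores = {
        ctx: sum(counts[(a, b)] for a, b in zip(tokens, tokens[1:]))
        for ctx, counts in context_bigrams.items()
    }
    return max(scores, key=scores.get)

print(most_likely_context(("whistle_A", "click_train")))  # -> "play"
```

In a real system the association would be learned by a neural model over raw audio rather than counted by hand, but the underlying idea – tying recurring sound patterns to observed behavior – is the same.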
The team is also developing technology called CHAT: a wearable audio system for divers that generates real-time AI-produced "dolphin" sounds. Researchers play a new sound while pointing to an object – such as a toy or sea plant – and observe whether dolphins repeat or consistently react to the sound.
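A rough sketch of the interaction loop around such a device might look like the following. The function names and the object-to-sound mapping are placeholders, not the real CHAT system's interface; in practice, detection would run on live hydrophone audio rather than stub functions.

```python
# Illustrative sketch only: the control loop one might build around a
# CHAT-style wearable. play_sound, listen, and sounds_match are placeholders.
import time

# Hypothetical mapping from a pointed-at object to its assigned synthetic sound.
SOUND_FOR_OBJECT = {"toy": "synthetic_whistle_1", "sea_plant": "synthetic_whistle_2"}

def play_sound(sound_id):
    print(f"[speaker] playing {sound_id}")

def listen(seconds):
    # Placeholder: would return audio captured by the hydrophone.
    time.sleep(0)  # no real waiting in this sketch
    return None

def sounds_match(recorded, sound_id):
    # Placeholder for a matcher that checks whether the dolphins
    # repeated (mimicked) the synthetic whistle.
    return False

def run_trial(object_name, window_s=10):
    """Point at an object, play its assigned synthetic sound, log any mimicry."""
    sound_id = SOUND_FOR_OBJECT[object_name]
    play_sound(sound_id)
    recorded = listen(window_s)
    mimicked = sounds_match(recorded, sound_id)
    print(f"trial: object={object_name} sound={sound_id} mimicked={mimicked}")
    return mimicked

run_trial("toy")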
"The goal is to see if dolphins can learn or recognize a new word," explained Dr. Denise Herzing, founder of the project. "Instead of spending 150 years manually analyzing all the recordings, the model allows us to detect patterns in days."
Do dolphins even speak in words like humans?
However, some experts caution that dolphins may not use actual "words" at all. Human language, they argue, relies on grammar and an open-ended capacity to combine elements into new meanings. Dolphins may exchange signals, but that doesn't necessarily constitute language as we understand it.
Still, even if the model doesn't translate or hold a "conversation," the fact that dolphins may recognize patterns or anticipate sound-based responses opens a valuable window into their cognitive abilities. Researchers believe that consistent responses to new sounds could demonstrate memory, learning, attention, and even basic communication.
Other global projects also aim to communicate with animals using AI, including the Earth Species Project, which focuses on crows, and CETI, which is working to decode sperm whale communication. As for whether we'll ever have a true "conversation" with a dolphin, the researchers answer cautiously: probably not. But the mere attempt to understand could transform the way we view our place in the animal kingdom.