If we could decode what whales are saying, what would we learn about their culture and the world around us?
The Cetacean Translation Initiative (CETI) is an interdisciplinary endeavor applying robotics, marine biology, linguistics, and machine learning to listen to and translate the communication of sperm whales. These majestic deep-diving creatures have matriarchal multicultural societies and highly structured Morse code-like vocalizations. By studying these intricate encodings and behaviors, we hope to learn more about the world around us and about applying machine learning to non-human languages.
To do this, CETI is developing a network of deployable sensors, including tags that gently attach to whales using suction cups and record audio, depth, motion, temperature, and more. Using these devices, alongside others ranging from drones to buoys, we are curating an ever-growing multimodal dataset. This dataset lays the foundation for the many machine learning pipelines we are currently applying to uncover insights about the whales’ language and interactions.
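As a small illustration of what working with such a multimodal dataset can involve, here is a hedged Python sketch of resampling two tag streams, an audio envelope and depth, onto a common timebase; the field names, rates, and toy data are made up for illustration and are not our actual pipeline.

```python
import numpy as np
import pandas as pd

def align_streams(audio_t, audio_env, depth_t, depth_m, rate_hz=10.0):
    """Resample two irregularly sampled streams onto a shared timebase.

    audio_t / depth_t are timestamps in seconds; audio_env / depth_m are values.
    Returns a DataFrame indexed by the common clock.
    """
    t0 = max(audio_t[0], depth_t[0])          # use only the overlapping window
    t1 = min(audio_t[-1], depth_t[-1])
    t = np.arange(t0, t1, 1.0 / rate_hz)      # common timebase
    return pd.DataFrame(
        {
            "audio_envelope": np.interp(t, audio_t, audio_env),  # linear interpolation
            "depth_m": np.interp(t, depth_t, depth_m),
        },
        index=pd.Index(t, name="time_s"),
    )

# Toy usage with synthetic data (not real tag recordings).
audio_t = np.linspace(0, 60, 60 * 1000)       # ~1 kHz audio envelope
depth_t = np.linspace(0, 60, 60 * 5)          # 5 Hz depth sensor
aligned = align_streams(
    audio_t, np.abs(np.sin(3 * audio_t)),
    depth_t, 200 + 50 * np.sin(depth_t / 10),
)
print(aligned.head())
```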
Within this project, I have been focusing on physiological sensors, robotics, machine learning, and multimodal data visualization. My contributions have included biosignal acquisition, sensor design, embedded programming, electrical engineering, device synchronization, visualization techniques, drone control, and machine learning.
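To give a concrete flavor of the device-synchronization piece, here is a minimal, hypothetical Python sketch of one common approach: estimating the clock offset between two recorders by cross-correlating a shared acoustic event that both captured. The signals below are toy stand-ins, not our actual data or method.

```python
import numpy as np

def estimate_offset_s(sig_a, sig_b, fs_hz):
    """Return the time (s) by which sig_a lags sig_b (positive = sig_a is later)."""
    corr = np.correlate(sig_a - sig_a.mean(), sig_b - sig_b.mean(), mode="full")
    lag_samples = np.argmax(corr) - (len(sig_b) - 1)
    return lag_samples / fs_hz

# Toy example: the "same" click train seen by two devices, one delayed by 0.25 s.
fs = 1000
t = np.arange(0, 5, 1 / fs)
clicks = np.sin(2 * np.pi * 40 * t) * (np.sin(2 * np.pi * t) > 0.99)
delayed = np.roll(clicks, int(0.25 * fs))
print(f"estimated offset: {estimate_offset_s(delayed, clicks, fs):.3f} s")
```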
I’ve also had the opportunity to take multiple trips to Dominica in the Caribbean, where I met our whale colleagues and helped deploy our newest sensor tags. We even got to see (and record) a sperm whale birth, which was uniquely exhilarating.
Photos from drones: Project CETI | Other photos: Joseph DelPreto / Project CETI