Primates give alarm calls that differ according to predator, dolphins address one another with signature whistles, and some songbirds can rearrange elements of their calls to communicate different messages. Animal vocalization has long been a subject of human fascination.
Until recently, decoding it relied on painstaking observation. But interest has flourished in applying machine learning to the vast amounts of data that modern animal-borne sensors can now collect.
Earth Species Project (ESP), a California-based non-profit organization, has a bold ambition: to decode non-human communication using AI and machine learning, and to make all the know-how publicly available, thereby deepening our connection with other living species. The organization also sees the effort as a way to protect various species.
When a dolphin handler gives a signal, two trained dolphins disappear underwater, exchange sounds, and then emerge. A trained circus elephant can sit on a chair at its instructor's command. Neither feat proves that language is involved. Yet in a conversation with the Guardian, Aza Raskin, a co-founder of the Earth Species Project, said that animals have a "rich, symbolic way of communicating".
Projects and research
A group of researchers at MIT and Google applied machine learning to two ancient scripts: Linear B and Ugaritic. First, word-to-word relations for a known language are mapped: the system searches texts to see how often each word appears next to every other word. These patterns of co-occurrence place each word in a multidimensional space; according to the researchers, any language can be well described by a space of about 600 independent dimensions. This has great significance for understanding animal communication. For example, if whale song communicates in a word-like structure, then the relationships among whales' "ideas" could be mapped in a similar multidimensional way, as with human languages.
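The co-occurrence idea above can be sketched in a few lines. The toy corpus, window size, and word choices below are illustrative assumptions, not data from the research; the point is only that words used in similar contexts end up with similar vectors in a shared space.

```python
# Minimal sketch: count how often each word appears near every other
# word, then treat each word's co-occurrence counts as a vector.
from collections import Counter, defaultdict

import numpy as np

# Toy corpus (illustrative assumption, not from the actual study).
corpus = [
    "the whale sings to the calf",
    "the calf answers the whale",
    "the whale and the calf swim",
]

# Count co-occurrences within a +/-2 word window.
window = 2
cooc = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for i, w in enumerate(words):
        for j in range(max(0, i - window), min(len(words), i + window + 1)):
            if i != j:
                cooc[w][words[j]] += 1

vocab = sorted(cooc)
index = {w: k for k, w in enumerate(vocab)}

# Each row is one word's co-occurrence vector in the shared space.
vectors = np.zeros((len(vocab), len(vocab)))
for w, counts in cooc.items():
    for ctx, n in counts.items():
        vectors[index[w], index[ctx]] = n

def cosine(a, b):
    """Cosine similarity between two word vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# "whale" and "calf" appear in similar contexts, so their vectors align.
sim = cosine(vectors[index["whale"]], vectors[index["calf"]])
print(f"similarity(whale, calf) = {sim:.2f}")
```

Real systems replace raw counts with learned embeddings of hundreds of dimensions, and align the spaces of two languages rather than working within one.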
Elodie Briefer, an associate professor at the University of Copenhagen who studies vocal communication in mammals and birds, has noted that although researchers have started to use AI, we still know little about what it can achieve.
Briefer co-developed an algorithm that analyses pig grunts and detects whether the animal is experiencing a positive or negative emotion. Another algorithm, DeepSqueak, judges whether a rodent is in a stressed state. An initiative called Project CETI plans to use ML to translate the communication of sperm whales.
A further project aims to develop an algorithm that ascertains how many call types a species has at its command, using self-supervised machine learning. In an early test, it will mine audio recordings to produce an inventory of the vocal repertoire of the Hawaiian crow, a species that makes and uses tools for foraging and is believed to have a more complex set of vocalizations than other species.
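The repertoire-inventory step can be illustrated with a toy clustering example. Everything below is an assumption for illustration: the calls are simulated as 2-D feature vectors (think pitch and duration), and plain k-means stands in for the project's actual self-supervised approach, which works on raw audio.

```python
# Toy sketch: cluster call recordings (here, made-up acoustic feature
# vectors) and count how many distinct call types emerge.
import numpy as np

rng = np.random.default_rng(0)

# Simulate three call types as well-separated clusters of features.
calls = np.vstack([
    rng.normal(loc=[1.0, 1.0], scale=0.1, size=(30, 2)),  # call type A
    rng.normal(loc=[4.0, 1.0], scale=0.1, size=(30, 2)),  # call type B
    rng.normal(loc=[2.5, 4.0], scale=0.1, size=(30, 2)),  # call type C
])

def kmeans(points, k, iters=50):
    """Basic k-means: assign points to nearest center, recompute centers."""
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None] - centers[None], axis=2)
        labels = np.argmin(dists, axis=1)
        # Keep a center unchanged if no points were assigned to it.
        centers = np.array([
            points[labels == c].mean(axis=0) if np.any(labels == c) else centers[c]
            for c in range(k)
        ])
    return labels, centers

labels, _ = kmeans(calls, k=3)
print("distinct call types found:", len(set(labels.tolist())))
```

A real pipeline would first learn call representations from unlabeled audio (the self-supervised part) and also have to choose the number of clusters from the data, rather than assuming it in advance.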
A University of California professor seeks to automatically understand the functional meanings of vocalizations. The professor's lab studies how wild marine mammals, which are difficult to observe directly, behave underwater: small electronic 'biologging' devices attached to the animals capture their location, type of motion, and even what they see.
Power of AI
Not everyone is gung-ho about the power of AI. Robert Seyfarth at the University of Pennsylvania, for instance, believes ML can be useful for problems such as identifying an animal's vocal repertoire, but is skeptical that it will add much in other areas, such as discovering the meaning and function of vocalizations.
The role of AI in decoding animal vocalization is still being explored. Raskin acknowledges that AI alone may not be able to unlock communication with other species, which may communicate in ways more complex than humans have ever imagined. Data collection and analysis remain a significant bottleneck for researchers.
Source: indiaai.gov.in