
    Scientists have begun using the power of machine learning to decipher messages throughout the animal kingdom

    Companies around the world are adopting machine learning to work with the vast amounts of data that modern animal sensors can now collect. The goal is to decode and learn the “language” of animals, The Guardian writes.

    With the help of modern technology, scientists have already developed an algorithm that analyzes pigs’ grunts to determine whether an animal is experiencing positive or negative emotions. A system called DeepSqueak uses ultrasonic signals to determine whether rodents are stressed, and Project CETI plans to use machine learning to translate the messages of sperm whales.

    The Earth Species Project (ESP) team hopes to decipher non-human communication using machine learning – and make this know-how publicly available. The ESP approach is unique in that it focuses not on deciphering one kind of communication, but all at once. Although experts recognize that there will be a higher likelihood of rich symbolic communication between social animals (such as primates, whales, and dolphins), the goal is to develop tools that could be applied to the entire animal kingdom.

    The tools we develop can work across all biology, from worms to whales.

    Aza Raskin, co-founder of ESP

    According to Raskin, these algorithms have already been shown to translate between different, sometimes distantly related, human languages without the need for any prior knowledge.

    This process begins with an algorithm that represents words as points in physical space. In this multidimensional geometric representation, the distance and direction between points (words) describe how they relate to each other in meaning (their semantic relationships). For example, “king” relates to “man” at the same distance and in the same direction as “queen” relates to “woman”.
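    The geometry of such analogies can be illustrated with a toy example. The vectors below are hand-made for illustration (real embeddings are learned from text and have hundreds of dimensions): one axis loosely encodes “royalty,” the other “gender.”

```python
import numpy as np

# Toy 2-D "embedding": axis 0 ~ royalty, axis 1 ~ gender.
# These hand-made vectors only illustrate the geometry of the analogy;
# real word vectors are learned from large text corpora.
vecs = {
    "king":  np.array([1.0,  1.0]),
    "queen": np.array([1.0, -1.0]),
    "man":   np.array([0.0,  1.0]),
    "woman": np.array([0.0, -1.0]),
}

# "king" - "man" + "woman" should land closest to "queen".
target = vecs["king"] - vecs["man"] + vecs["woman"]
nearest = min(vecs, key=lambda w: np.linalg.norm(vecs[w] - target))
print(nearest)  # queen
```

    Because the offset king − man equals queen − woman, the analogy holds as simple vector arithmetic.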

    It was later noticed that these “shapes” are similar across different languages. Then, in 2017, two groups of researchers working independently found a technique for achieving translation by aligning the shapes. To go from English to Urdu, you align their shapes and find the Urdu point closest to the point of the English word. In this way, most words can be translated quite accurately.
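    In the supervised variant of this idea, with a small seed dictionary of word pairs, the alignment step reduces to the classical orthogonal Procrustes problem, which has a closed-form SVD solution. A minimal sketch, using synthetic point clouds rather than real embeddings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "embedding spaces": X is the source language, Y the target.
# Here Y is simply X rotated by an unknown orthogonal matrix --
# a synthetic stand-in for two languages with matching shapes.
X = rng.normal(size=(50, 5))
Q, _ = np.linalg.qr(rng.normal(size=(5, 5)))  # random orthogonal "rotation"
Y = X @ Q

# Orthogonal Procrustes: find orthogonal W minimizing ||X W - Y||.
U, _, Vt = np.linalg.svd(X.T @ Y)
W = U @ Vt

# After alignment, translation is nearest-neighbor lookup of X @ W in Y.
print(np.allclose(X @ W, Y))
```

    The 2017 results went further, learning the alignment without any seed dictionary, but the aligned-shapes intuition is the same.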

    ESP plans to create these kinds of representations of animal communication, working with single species or with many species at once, and then explore questions such as whether they overlap with the universal shape of human communication.

    We don’t know how animals perceive the world, but there are emotions, such as grief and joy, that some seem to share with us and may well communicate with others of their species.

    Aza Raskin, co-founder of ESP

    At the same time, animals do not communicate by voice alone. Bees, for example, communicate the location of a flower to others through a “waggle dance.” Such forms of communication will also have to be translated.

    ESP has also tried to solve the so-called “cocktail party problem” in animal communication, where it is difficult to distinguish which individual in a group of animals of the same species is vocalizing in a noisy social environment. The AI-based model developed by ESP has already been tested on the signature whistles of dolphins, the calls of macaques, and the vocalizations of bats.
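    ESP’s separator is a learned neural model; a classical baseline for the same cocktail-party setup is independent component analysis, which recovers individual signals from linear mixtures. A sketch with two synthetic “voices” and two “microphones” (the signals and mixing matrix are invented for illustration):

```python
import numpy as np
from sklearn.decomposition import FastICA

# Two synthetic "voices" (sources) heard by two "microphones" as
# linear mixtures -- the classical cocktail-party setup.
t = np.linspace(0, 1, 2000)
s1 = np.sin(2 * np.pi * 5 * t)           # animal A
s2 = np.sign(np.sin(2 * np.pi * 3 * t))  # animal B
S = np.c_[s1, s2]

A = np.array([[1.0, 0.5], [0.4, 1.0]])   # unknown mixing matrix
X = S @ A.T                              # what the microphones record

# Blind source separation: recover the individual signals
# (up to sign and ordering, which ICA cannot determine).
recovered = FastICA(n_components=2, random_state=0).fit_transform(X)
```

    Real bioacoustic recordings involve reverberation and overlapping broadband calls, which is why ESP uses learned models rather than a linear method like this.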


    Another ESP project involves using AI to generate novel animal calls, with humpback whales as a test species. The new calls, made by splitting vocalizations into micro-phonemes (distinct sound units lasting hundredths of a second) and using a language model to “say” something whale-like, can then be played back to the animals to see how they react. If the AI can identify what distinguishes a random change from a semantically meaningful one, it would bring us closer to meaningful communication with animals.
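    The generate-from-units idea can be sketched in miniature: treat a song as a sequence of discrete sound units and fit the simplest possible language model, a bigram chain, then sample a novel call. The unit labels are invented for illustration, and ESP’s actual models are far richer than this.

```python
import random
from collections import defaultdict

random.seed(0)

# Hypothetical sequence of discrete sound units from one whale song.
song = ["up", "click", "moan", "click", "up", "click", "moan", "up"]

# Fit a bigram model: for each unit, collect what follows it.
transitions = defaultdict(list)
for a, b in zip(song, song[1:]):
    transitions[a].append(b)

# Sample a novel 6-unit call starting from the song's first unit.
call = [song[0]]
for _ in range(5):
    call.append(random.choice(transitions[call[-1]]))
print(call)
```

    Every transition in the sampled call is one actually observed in the source song, so the output is novel in order but plausible unit-by-unit.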

    There are several other similar projects, but they all face one significant limitation: although many animals can have complex societies, their repertoire of sounds is much smaller than that of humans. As a result, the same sound can be used to mean different things in different situations, and its meaning can only be resolved by studying the context. This means that existing AI methods are not enough on their own.

    There are also doubts about the idea that the shape of animal communication overlaps with human communication in a meaningful way. Applying computer analysis to human language, with which we are so familiar, is one thing; doing the same for other species may work quite differently.

    As seen on PlayGround
