The Earth Species Project: Artificial Intelligence to Decode Animal Languages



The California-based Earth Species Project wants to use artificial intelligence to decipher the languages of animals.

The Earth Species Project (ESP) is an open-source non-profit organization founded in 2017, funded in part by donations from LinkedIn co-founder Reid Hoffman.

The organization’s main concern is deciphering non-human language. The 10-person team believes that understanding non-human languages will deepen our connection to other species, increase our ability to protect them, and thus positively change our ecological footprint.

The ESP team wants to achieve this goal within our lifetimes. Along the way, it also wants to develop technologies that directly support biological research and nature conservation.

The Earth Species Project builds on large language models

If ESP has its way, artificial intelligence will make it possible to understand non-human language. Using machine learning to analyze communication and other forms of behavior in the animal kingdom is not new: a research group led by the University of Copenhagen has demonstrated an artificial intelligence system that analyzes the grunts of pigs, the CETI project wants to translate the calls of sperm whales, and DeepSqueak helps interpret the ultrasonic calls of mice and rats.

However, the ESP team has set its goals much higher: it wants to decode the communication of every kind of species. “We are species-agnostic,” said Aza Raskin, co-founder of ESP. “The tools we are developing can be used across all of biology, from worms to whales.”

Raskin and his co-founders have drawn inspiration from developments in natural language processing over the past few years, in particular work showing that machine learning can translate between many languages without prior knowledge. Raskin calls this ESP’s motivating intuition.

Communication as vectors in multidimensional space

These successes were based on algorithms that represent words or word fragments as points in a multidimensional geometric space. Distance and direction relative to other words in this space encode the semantic relationships between them.
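To make this concrete, here is a toy sketch with hand-made vectors (illustrative only, not real learned embeddings) showing how such relationships can be read off with simple vector arithmetic:

```python
# A toy illustration (not ESP's code): word embeddings as points in a
# vector space, where distance and direction encode semantic relations.
import numpy as np

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, near 0 means unrelated."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hand-made 3-d vectors standing in for learned embeddings.
emb = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "man":   np.array([0.1, 0.9, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9]),
}

# The classic analogy: king - man + woman lands near queen.
target = emb["king"] - emb["man"] + emb["woman"]
best = max(emb, key=lambda w: cosine(emb[w], target))
print(best)  # -> "queen" with these toy vectors
```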

In 2017, several publications showed that translations can be produced by overlaying the geometric representations of two languages. In this shared space, for example, “Hund” and “dog” lie close to each other and can be recognized by an algorithm as translations of each other.
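A minimal sketch of this overlay idea, using toy data and the classical Procrustes solution. Note that the 2017 systems could learn the mapping without paired examples; this sketch uses paired toy vectors for brevity:

```python
# Toy sketch: learn a rotation that overlays two embedding spaces, then
# read translations off nearest neighbours. Data and vocabularies are
# invented for illustration.
import numpy as np
from scipy.linalg import orthogonal_procrustes

rng = np.random.default_rng(0)

# Pretend German and English embeddings; the English space is a rotated
# copy of the German one plus noise, mimicking structurally similar languages.
de_words = ["Hund", "Katze", "Haus", "Wasser"]
en_words = ["dog", "cat", "house", "water"]
X_de = rng.normal(size=(4, 5))
R_true, _ = np.linalg.qr(rng.normal(size=(5, 5)))   # hidden rotation
X_en = X_de @ R_true + 0.01 * rng.normal(size=(4, 5))

# Solve for the rotation W minimising ||X_de @ W - X_en|| (Procrustes).
W, _ = orthogonal_procrustes(X_de, X_en)
aligned = X_de @ W

# Row 0 is "Hund"; after alignment it should sit closest to "dog".
dists = np.linalg.norm(X_en - aligned[0], axis=1)
print(en_words[int(np.argmin(dists))])  # -> "dog"
```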

Further progress came from work at Facebook’s AI lab starting in 2018. The team combined self-supervised training with back-translation and achieved translation quality that was high for the time, again without prior knowledge. Today, huge language models such as Meta’s NLLB-200 translate between up to 200 languages.
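The core of back-translation fits in a few lines. The following is a schematic sketch with placeholder models, not Facebook’s actual pipeline: a reverse model produces synthetic source sentences, and the forward model is then trained on the resulting synthetic pairs:

```python
# Schematic back-translation round with stand-in models.
from typing import Callable, List, Tuple

def back_translation_round(
    target_sentences: List[str],
    translate_tgt_to_src: Callable[[str], str],
    train_src_to_tgt: Callable[[List[Tuple[str, str]]], None],
) -> None:
    """One round of back-translation: build (synthetic source, real target)
    pairs with the current reverse model, then train the forward model."""
    synthetic_pairs = [(translate_tgt_to_src(t), t) for t in target_sentences]
    # The forward model sees noisy synthetic sources but clean, genuine
    # targets; that asymmetry is what makes the trick work.
    train_src_to_tgt(synthetic_pairs)

# Placeholder "model" and "trainer", only to show the data flow.
back_translation_round(
    ["the dog barks"],
    translate_tgt_to_src=lambda s: s[::-1],
    train_src_to_tgt=lambda pairs: print(pairs),
)
```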

The ESP team wants to build such representations for animal communication, both for individual species and for many species at once. According to Raskin, this should also capture nonverbal forms of communication such as bee dances. With such large models, one could then check whether the geometric representations of humans and other living beings overlap.
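What such an overlap check would look like in practice is still open. One simple, hypothetical measure (not from ESP) is the fraction of mutual nearest neighbors between two aligned representation spaces:

```python
# Hypothetical overlap score between two aligned spaces: how many points
# are each other's nearest neighbours in both directions?
import numpy as np

def mutual_nn_overlap(A: np.ndarray, B: np.ndarray) -> float:
    """Fraction of points in A whose nearest neighbour in B points back
    at them (both matrices: n_points x dims)."""
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    a_to_b = d.argmin(axis=1)   # nearest B-point for each A-point
    b_to_a = d.argmin(axis=0)   # nearest A-point for each B-point
    mutual = sum(b_to_a[a_to_b[i]] == i for i in range(len(A)))
    return mutual / len(A)

rng = np.random.default_rng(1)
space_a = rng.normal(size=(100, 8))
space_b = space_a + 0.1 * rng.normal(size=(100, 8))  # strongly overlapping
print(mutual_nn_overlap(space_a, space_b))           # close to 1.0
```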

“I don’t know what would be more amazing — the parts where shapes intersect and we can communicate or translate directly, or the parts where we can’t talk,” Raskin says.

AI can help us take off our human glasses

Raskin compares the development of a translation model between animals and humans to a trip to the moon: the journey will be long and difficult. But there are many other problems to solve along the way, and ESP has ideas for tackling them.

In a recently published paper, the team tackles the “cocktail party problem”: picking out individual voices in a noisy social setting. Google and Amazon, for example, use AI solutions to this problem to better recognize voice input for their digital assistants, and Meta is working on a kind of superior hearing aid that can selectively pick out individual sounds.

The team says the cocktail party problem also arises when searching for non-human communication, so it is developing an AI algorithm that can isolate individual animal sounds from natural background noise.
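ESP’s algorithm itself is not reproduced here; a much-simplified illustration of the general idea, mask-based separation on a spectrogram using entirely synthetic signals, might look like this:

```python
# Simplified illustration of mask-based source separation (not ESP's
# algorithm): estimate the background from a noise-only recording, then
# keep only time-frequency bins where the mixture rises above it.
import numpy as np
from scipy.signal import stft, istft

fs = 16_000
t = np.arange(fs) / fs
call = np.sin(2 * np.pi * 2000 * t) * (t < 0.3)   # stand-in "animal call"
rng = np.random.default_rng(0)
noise = 0.3 * rng.normal(size=fs)                 # stand-in habitat noise
mixture = call + noise

# Spectrograms of the mixture and of a noise-only reference.
f, frames, Z_mix = stft(mixture, fs=fs, nperseg=512)
_, _, Z_noise = stft(noise, fs=fs, nperseg=512)

# Binary mask: keep bins where the mixture clearly exceeds the noise floor.
noise_floor = np.abs(Z_noise).mean(axis=1, keepdims=True)
mask = np.abs(Z_mix) > 3 * noise_floor
_, separated = istft(Z_mix * mask, fs=fs, nperseg=512)
# `separated` now contains mostly the call; real systems learn the mask.
```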

In another project, an AI system generates randomly varied humpback whale calls and analyzes how the whales respond to them. The goal is a system that learns to distinguish random from meaningful variations. Raskin believes this will bring us one step closer to understanding the calls of the humpback whale.

In another project, a self-supervised AI system is to learn the entire call repertoire of the Hawaiian crow. A further project aims to automatically generate so-called ethograms for a species: records of all its possible behavior patterns, their frequency, and the conditions under which they occur.

Is AI alone sufficient to enable communication with other species? Raskin believes that AI will at least bring us one step closer.

Many species communicate in more complex ways than previously thought. AI can help collect and analyze enough data at scale. Eventually, we may be able to take off our human glasses and understand entire communication systems that have been hidden from us until now, Raskin says.

