
Can artificial intelligence really help us talk to animals?

王林 (forwarded) · 2023-04-12

A dolphin trainer signals "together" with his hands, followed by "create." The two trained dolphins disappear underwater, exchange sounds and then surface, flipping onto their backs and raising their tails. They have devised a novel trick of their own and performed it in tandem, just as requested. "This does not prove that language exists," says Aza Raskin. "But if they could use a rich, symbolic form of communication, it would certainly make the task easier."

Raskin is co-founder and president of the Earth Species Project (ESP), a California nonprofit with an ambitious goal: to decode non-human communication using a form of artificial intelligence (AI) called machine learning, and to make all of the know-how publicly available, thereby deepening our relationship with other living species and helping to protect them. A 1970 album of whale song galvanized a movement that led to the ban on commercial whaling. What might a Google Translate for the animal kingdom produce?

The organization, founded in 2017 with the help of major donors including LinkedIn co-founder Reid Hoffman, published its first scientific paper last December. Its goal is to unlock communication with animals within our lifetimes. "What we're working towards is whether we can decode animal communication and discover the mysteries of non-human speech," Raskin said. "Along the way, and equally important, we are developing technology that supports biologists and conservation."

Understanding the vocalizations of animals has long been a subject of fascination and inquiry for humans. The alarm calls produced by various primates vary depending on the predator; dolphins use signature whistles to call in friends; and some songbirds can extract elements from their calls and rearrange them to convey different messages. But most experts don't call it a language because no animal communication meets all the criteria.

Until recently, decoding has relied mostly on painstaking observation. But interest has grown in applying machine learning to handle the vast amounts of data that can now be collected by modern animal-borne sensors. "People are starting to use it," says Elodie Briefer, an associate professor at the University of Copenhagen who studies vocal communication in mammals and birds. "But we don't yet understand how much we can do."

Briefer co-developed an algorithm that analyzes pigs' grunts to determine whether the animal is experiencing a positive or negative emotion. Another tool, called DeepSqueak, judges whether rodents are stressed based on their ultrasonic calls. A further initiative, Project CETI (the Cetacean Translation Initiative), plans to use machine learning to translate the communication of sperm whales.


Earlier this year, Elodie Briefer and colleagues published a study decoding pigs' emotions from their vocalizations, based on 7,414 sounds collected from 411 pigs in a variety of scenarios.
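The published model itself is not reproduced here, but the general recipe for this kind of valence classifier is easy to sketch: summarize each call as a vector of acoustic features, then train a standard classifier on context labels. The Python sketch below is a hypothetical illustration only; the feature choices, `wav_paths` and `labels` are assumptions, not Briefer's published pipeline.

```python
# Hypothetical sketch of an acoustic valence classifier, in the spirit of the
# pig study; it is NOT the published model. Assumes WAV clips of grunts with
# known positive/negative context labels.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def grunt_features(path):
    """Summarize one call as a fixed-length acoustic feature vector."""
    audio, sr = librosa.load(path, sr=22050)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13)  # spectral shape
    f0 = librosa.yin(audio, fmin=60, fmax=2000, sr=sr)      # pitch track
    return np.concatenate([
        mfcc.mean(axis=1), mfcc.std(axis=1),
        [np.nanmean(f0), np.nanstd(f0), len(audio) / sr],   # pitch stats, duration
    ])

# wav_paths and labels would come from recording metadata (hypothetical names).
X = np.array([grunt_features(p) for p in wav_paths])
y = np.array(labels)  # 1 = positive context, 0 = negative context

clf = RandomForestClassifier(n_estimators=300, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```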

Yet ESP says its approach is different, because it doesn't focus on decoding the communication of one species, but of all of them. While Raskin acknowledges that the potential for rich, symbolic communication is higher among social animals such as primates, whales and dolphins, the goal is to develop tools that can be applied across the entire animal kingdom. "We are species agnostic," Raskin said. "We develop tools... that can work across all of biology, from worms to whales." Raskin says the "intuition-pumping" work behind ESP was research showing that machine learning can translate between different, sometimes distant human languages, without any prior knowledge.

The process begins with developing an algorithm that represents words as points in a geometric space. In this multidimensional representation, the distance and direction between points (words) describe how they are meaningfully related to each other (their semantic relationships). For example, "king" relates to "man" with the same distance and direction that "queen" relates to "woman." (The mapping is built not by knowing what the words mean, but by observing how often they occur near each other.)
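This is the idea behind word embeddings, and it can be seen directly with off-the-shelf pretrained vectors. The snippet below uses the gensim library and GloVe vectors as one plausible choice among many; it is a generic demonstration, not anything specific to ESP.

```python
# Word-embedding demo: semantic relationships as directions in vector space.
# Uses pretrained GloVe vectors via gensim's downloader (one common choice).
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-50")  # 50-dimensional embeddings

# "king" is to "man" as ? is to "woman": add the (king - man) offset to "woman".
print(vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=3))
# Typically ranks "queen" first; the geometry was learned purely from co-occurrence.
```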

It was later noticed that these "shapes" are similar across different languages. Then, in 2017, two groups of researchers working independently found a technique that achieves translation by aligning the shapes. To go from English to Urdu, align their shapes and find the point in the Urdu shape closest to the word's point in the English shape. "That way you can translate most words reasonably well," Raskin said.
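The 2017 methods managed the alignment without any dictionary at all; a simpler, supervised cousin of the idea is easier to sketch. Given a small seed dictionary of word pairs, orthogonal Procrustes finds the rotation that best maps one point cloud onto the other, and translation becomes nearest-neighbor lookup. The toy below uses synthetic data in place of real English and Urdu embeddings.

```python
# Toy sketch of embedding-space alignment for translation. The 2017 papers did
# this unsupervised; here a small seed dictionary makes the idea concrete.
import numpy as np
from scipy.linalg import orthogonal_procrustes

rng = np.random.default_rng(0)
true_rotation = np.linalg.qr(rng.normal(size=(50, 50)))[0]
X = rng.normal(size=(1000, 50))                              # stand-in "English" embeddings
Y = X @ true_rotation + 0.01 * rng.normal(size=(1000, 50))   # "Urdu" = rotated English + noise

W, _ = orthogonal_procrustes(X[:200], Y[:200])  # fit rotation on 200 seed pairs

def translate(i, target_vocab):
    """Map source word i into the target space; return nearest target word index."""
    query = X[i] @ W
    sims = target_vocab @ query / (
        np.linalg.norm(target_vocab, axis=1) * np.linalg.norm(query))
    return int(np.argmax(sims))

# Evaluate on held-out pairs: nearest neighbor should be the matching word.
print(sum(translate(i, Y) == i for i in range(200, 300)), "/ 100 correct")
```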

ESP’s aspiration is to create such representations of animal communication, working both on single species and on many species at once, and then to explore questions such as whether there is overlap with the universal human communication "shape." We don't know how animals experience the world, Raskin said, but there are emotions, such as grief and joy, that some animals seem to share with us and may well communicate about with others in their species. "I don't know which will be more incredible: the parts where the shapes overlap and we can directly communicate or translate, or the parts where we can't."


Dolphins use clicks, whistles, and other sounds to communicate. But what are they talking about?

Animals communicate through more than just sound, he added. For example, bees use a "waggle dance" to let others know the location of a flower. Translation across different modes of communication is also required.

The goal is "like going to the moon," Raskin admits, and it will not be reached all at once. Instead, ESP’s roadmap involves solving a series of smaller problems on the way to the larger vision. This should see the development of general tools that can help researchers apply artificial intelligence to unlock the secrets of the species they study.

For example, ESP recently published a paper (and shared its code) on the so-called "cocktail party problem" in animal communication: in a noisy social setting, it is difficult to tell which individual in a group of similar-sounding animals is vocalizing.

"To our knowledge, no one has done this kind of end-to-end [animal voice] disentanglement before," Raskin said. The AI-based model developed by ESP, which was tested on dolphin signature whistles, macaque coos and bat vocalizations, worked best when the calls came from individuals the model was trained on; but with larger data sets, it was able to unravel Mixed calls from animals not in the training queue.

Another project involves using artificial intelligence to generate novel animal sounds, with humpback whales as a test species. The novel calls, constructed by breaking vocalizations down into micro-phonemes (distinct units of sound lasting a hundredth of a second) and using language models to "speak" something whale-like, can then be played back to the animals to see how they respond. If the AI can tell a random change from a semantically meaningful one, Raskin explained, it brings us closer to meaningful communication. "It is having the AI speak the language, even though we don't yet know what it means."
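One way to picture that pipeline: discretize the audio into a vocabulary of short sound units by clustering, fit a sequence model over the unit stream, and sample novel sequences. The sketch below is a toy stand-in, with k-means units and a bigram model and a hypothetical `humpback_clip.wav`; the actual work would use learned units and far richer language models.

```python
# Toy version of "units + language model" call generation. Real systems use
# learned units and neural language models; this sketch only shows the shape
# of the pipeline.
import numpy as np
import librosa
from sklearn.cluster import KMeans

audio, sr = librosa.load("humpback_clip.wav", sr=16000)  # hypothetical recording
# ~10 ms frames, matching the "hundredth of a second" unit length.
frames = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13, hop_length=160).T

# Discrete "micro-phoneme" stream: each frame assigned to one of 64 units.
units = KMeans(n_clusters=64, random_state=0).fit_predict(frames)

# Fit bigram transition probabilities over the unit sequence (add-one smoothed).
counts = np.ones((64, 64))
for a, b in zip(units[:-1], units[1:]):
    counts[a, b] += 1
probs = counts / counts.sum(axis=1, keepdims=True)

# Sample a novel unit sequence; a vocoder would be needed to render it as audio.
rng = np.random.default_rng(0)
state, generated = units[0], []
for _ in range(200):
    state = rng.choice(64, p=probs[state])
    generated.append(state)
```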


Hawaiian crows are well known for their use of tools, but are also thought to have particularly complex vocalizations.

Another project aims to develop an algorithm that determines how many call types a species has at its command, by applying self-supervised machine learning, which does not require human experts to label the data in order to learn patterns. In an early test case, it will mine audio recordings made by a team led by Christian Rutz, a professor of biology at the University of St Andrews, to produce an inventory of the vocal repertoire of the Hawaiian crow, a species that, Rutz discovered, can make and use tools for foraging, and which is believed to have a significantly more complex set of vocalizations than other crow species.
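The algorithm itself is not described in the article, but one generic recipe for estimating repertoire size is to embed each call as a feature vector, cluster the embeddings, and let an internal quality criterion such as the silhouette score choose the number of clusters. A hypothetical sketch, assuming `call_embeddings` has already been computed:

```python
# Generic sketch of estimating how many call types a repertoire contains:
# embed calls as feature vectors, then let a cluster-quality score choose k.
# This is a stand-in for ESP's self-supervised approach, which is not shown here.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def estimate_repertoire_size(call_embeddings, k_range=range(2, 30)):
    """Return the cluster count that best separates the call embeddings."""
    best_k, best_score = None, -1.0
    for k in k_range:
        labels = KMeans(n_clusters=k, random_state=0).fit_predict(call_embeddings)
        score = silhouette_score(call_embeddings, labels)
        if score > best_score:
            best_k, best_score = k, score
    return best_k

# call_embeddings: one feature vector per recorded crow call (hypothetical).
print("estimated number of call types:", estimate_repertoire_size(call_embeddings))
```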

Rutz is particularly excited about the project's conservation value. The Hawaiian crow is critically endangered and now survives only in captivity, where it is being bred for reintroduction into the wild. The hope is that, by comparing recordings made over time, it will be possible to track whether the species' call repertoire is being eroded in captivity (specific alarm calls may have been lost, for example), which could have consequences for its reintroduction; any such loss might then be addressed through intervention. "This could produce a step change in our ability to help these birds recover from crisis," Rutz said, adding that manually detecting and classifying the calls would be labor-intensive and error-prone.

Meanwhile, another project seeks to automatically understand the functional meaning of vocalizations. It is being pursued with the laboratory of Ari Friedlaender, a professor of marine sciences at UC Santa Cruz. The lab studies how wild marine mammals, which are difficult to observe directly, behave underwater, and runs one of the largest animal-tagging programs in the world. Small electronic "biologging" devices attached to the animals capture their location, their type of motion, and even what they see (the devices can include cameras). The lab also has data from strategically placed sound recorders in the ocean.

ESP aims first to apply self-supervised machine learning to the tag data to automatically gauge what an animal is doing (whether it is feeding, resting, traveling or socializing), and then to add in the audio data to see whether functional meaning can be attached to the calls associated with each behavior. (Any findings could then be validated using playback experiments, along with calls that have already been decoded.) The technique will initially be applied to humpback whale data; the lab has tagged several animals in the same group, so it is possible to see how signals are given and received. Friedlaender said he had been "hitting the ceiling" of what currently available tools can tease out of the data. "We hope that the work ESP can do will provide new insights," he said.
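A rough sketch of that two-step structure, with plain supervised pieces standing in for ESP's self-supervised ones; all names here (`tag_features`, `labeled`, `behavior_labels`, `calls`) are hypothetical.

```python
# Hypothetical sketch of the two-step idea: (1) infer behavioral state from
# tag data, (2) cross-tabulate call types against states to hunt for function.
# ESP's actual pipeline is self-supervised; a plain classifier is used here
# only to make the structure concrete.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

# tag_features: per-minute movement summaries (depth, acceleration, turning...);
# `labeled` marks the subset with expert behavior annotations.
clf = GradientBoostingClassifier().fit(tag_features[labeled], behavior_labels)
states = clf.predict(tag_features)  # behavior state for every minute of tag data

# calls: DataFrame with a detected call type and the minute it occurred in.
calls["state"] = states[calls["minute"]]
print(pd.crosstab(calls["call_type"], calls["state"], normalize="index"))
# Call types that concentrate in one behavioral state are candidates for
# functional meaning, to be checked with playback experiments.
```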

But not everyone is so enthusiastic about the power of artificial intelligence to achieve such ambitious goals. Robert Seyfarth, a professor emeritus of psychology at the University of Pennsylvania, has studied social behavior and vocal communication in primates in their natural habitat for more than 40 years. While he believes machine learning can be useful for some problems, such as identifying an animal's vocal repertoire, there are other areas, including discovering the meaning and function of vocalizations, where he is skeptical it will add much.

The problem, he explains, is that while many animals can have sophisticated, complex societies, their repertoire of sounds is much smaller than that of humans. The result is that the exact same sound can be used to mean different things in different contexts, and only by studying the context (who the calling individual is, how it relates to others, where it falls in the hierarchy, who it has interacted with) can its meaning hopefully be established. "I just don't think these AI approaches are enough," Seyfarth said. "You have to go out there and watch the animals."

There are also questions about the concept itself: that the "shapes" of animal communication will overlap in a meaningful way with those of human communication. It is one thing to apply computer-based analysis to human language, with which we are so intimately familiar, Seyfarth said. But doing the same for other species could be "completely different." "It's an exciting idea, but it's a big stretch," said Kevin Coffey, a neuroscientist at the University of Washington who co-created the DeepSqueak algorithm.

Raskin acknowledged that artificial intelligence alone may not be enough to unlock communication with other species. But he points to research showing that many species communicate in ways "more complex than humans thought." The stumbling blocks have been our inability to gather enough data and analyze it at scale, and our own limited perception. "These are the tools that let us take off our human glasses and understand the entire communication system of a species," he said.

