Sunday, November 24

Half a century ago, one of the hottest questions in science was whether humans could teach animals to talk. Scientists tried using sign language to converse with apes and trained parrots to deploy growing English vocabularies.

The work quickly attracted media attention — and controversy. The research lacked rigor, critics argued, and what seemed like animal communication could simply have been wishful thinking, with researchers unconsciously cuing their animals to respond in certain ways.

In the late 1970s and early 1980s, the research fell out of favor. “The whole field completely disintegrated,” said Irene Pepperberg, a comparative cognition researcher at Boston University, who became known for her work with an African gray parrot named Alex.

Today, advances in technology and a growing appreciation for the sophistication of animal minds have renewed interest in finding ways to bridge the species divide. Pet owners are teaching their dogs to press “talking buttons” and zoos are training their apes to use touch screens.

In a cautious new paper, a team of scientists outlines a framework for evaluating whether such tools might give animals new ways to express themselves. The research is designed “to rise above some of the things that have been controversial in the past,” said Jennifer Cunha, a visiting research associate at Indiana University.

The paper, which is being presented at a science conference on Tuesday, focuses on Ms. Cunha’s parrot, an 11-year-old Goffin’s cockatoo named Ellie. Since 2019, Ms. Cunha has been teaching Ellie to use an interactive “speech board,” a tablet-based app that contains more than 200 illustrated icons, corresponding to words and phrases including “sunflower seeds,” “happy” and “I feel hot.” When Ellie presses on an icon with her tongue, a computerized voice speaks the word or phrase aloud.

In the new study, Ms. Cunha and her colleagues did not set out to determine whether Ellie’s use of the speech board amounted to communication. Instead, they used quantitative, computational methods to analyze Ellie’s icon presses to learn more about whether the speech board had what they called “expressive and enrichment potential.”

“How can we analyze the expression to see if there might be a space for intention or communication?” Ms. Cunha said. “And then, secondly, the question is could her selections give us an idea about her values, the things that she finds meaningful?”

The scientists analyzed nearly 40 hours of video footage, collected over seven months, of Ellie’s using the speech board. Then, they compared her icon presses to several simulations of a hypothetical speech board user who was selecting icons at random.

“They were ultimately all significantly different at multiple points from the real data,” said Nikhil Singh, a doctoral student at M.I.T. who created the models. “This virtual user that we had wasn’t able to fully capture what the real Ellie did when using this tablet.”

In other words, whatever Ellie was doing, she did not seem to be simply mashing icons at random. The design of the speech board, including icon brightness and location, could not fully explain Ellie’s selections either, the researchers found.
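The study's data and models are not public, but the basic idea of testing icon presses against a random baseline can be sketched with a Monte Carlo comparison. Everything below is illustrative: the icon names and press counts are invented, and the statistic (a chi-square distance from a uniform baseline) is just one simple way a "random user" hypothesis could be checked.

```python
import random
from collections import Counter

# Invented press log for illustration; the study's real data is not public.
observed = (["play"] * 40 + ["visit bird"] * 30 + ["happy"] * 15
            + ["sunflower seeds"] * 10 + ["I feel hot"] * 5)
icons = sorted(set(observed))

def chi_square(presses, icons):
    """Chi-square statistic of press counts against a uniform baseline."""
    counts = Counter(presses)
    expected = len(presses) / len(icons)
    return sum((counts.get(i, 0) - expected) ** 2 / expected for i in icons)

stat_real = chi_square(observed, icons)

# Monte Carlo: how often does a purely random user look this non-uniform?
random.seed(0)
n_sims = 2000
sim_stats = [
    chi_square([random.choice(icons) for _ in observed], icons)
    for _ in range(n_sims)
]
p_value = sum(s >= stat_real for s in sim_stats) / n_sims
print(f"chi-square vs. uniform: {stat_real:.1f}, Monte Carlo p = {p_value:.4f}")
```

A low p-value here would mean the observed presses are unlikely under pure random selection, which is the study's starting point; it says nothing by itself about intent, which is why the researchers went on to test "corroboration."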

Determining whether or not Ellie’s selections were random “is a very good place to start,” said Federico Rossano, a comparative cognition researcher at the University of California, San Diego, who was not involved in the research. “The problem is that randomness is very unlikely.”

Just because Ellie was not hitting icons randomly does not mean that she was actively and deliberately trying to communicate her true wants or feelings, Dr. Rossano said. She may simply have been repeating sequences she learned during training. “It’s like a vending machine,” he said. “You can learn to push a sequence of numbers and get a certain type of reward. It doesn’t mean that you’re thinking about what you’re doing.”

To further probe the possibilities, the research team then looked for signs of what it called “corroboration.” If Ellie selected the apple icon, did she eat the apple that she was given? If she selected a reading-related icon, did she engage with the book for at least a minute?

“You can hand something to a bird, and they’ll throw it or they’ll touch it,” Ms. Cunha said. “But for us it was about, Did she engage with it?”

Not all of Ellie’s selections could be evaluated in this way; it was impossible for the researchers to determine, for instance, whether she was truly feeling happy or hot in any given moment. But of the nearly 500 icon presses that could be assessed, 92 percent were corroborated by Ellie’s subsequent behavior.

“It’s clear that they have a good correlation there,” said Dr. Pepperberg, who was not involved in the research.

But demonstrating that Ellie truly understands what the icons mean will require additional testing, she said, suggesting that the researchers try deliberately bringing Ellie the wrong object to see how she responds. “It’s just another control to make sure that the animal really has this understanding of what the label represents,” Dr. Pepperberg said.

Finally, the researchers tried to assess whether the speech board was serving as a form of enrichment for Ellie by analyzing the types of icons she selected most frequently.

“If it’s a means to an end, what is the end?” said Rébecca Kleinberger, an author of the paper and a researcher at Northeastern University, where she studies how animals interact with technology. “It does seem like there was a bias toward social activity or activity that means remaining in interaction with the caretaker.”

Roughly 14 percent of the time, Ellie selected icons for food, drinks or treats, the researchers found. On the other hand, about 73 percent of her selections corresponded to activities that provided social or cognitive enrichment, such as playing a game, visiting another bird or simply communicating with Ms. Cunha. Ellie also initiated the use of the speech board 85 percent of the time.

“Ellie the cockatoo interacted consistently with her device, suggesting that it remained engaging and reinforcing for her to do so over several months,” said Amalia Bastos, a comparative cognition researcher at Johns Hopkins University, who was not an author of the paper.

The study has limitations. There’s a limit to what scientists can extrapolate from a single animal, and it’s difficult to rule out the possibility that Ms. Cunha might have been unconsciously cuing Ellie to respond in certain ways, outside experts said. But scientists also praised the researchers’ systematic approach and modest claims.

“They are not saying, ‘Can the parrot talk?’” Dr. Rossano said. “They are saying, ‘Can this be used for enrichment?’”

Dr. Bastos agreed. “This work is a crucial first step,” she said. It’s also an example of how the field has changed, for the better, since the 1970s.

“Researchers currently working in the area are not bringing the same assumptions to the table,” Dr. Bastos said. “We don’t expect animals to understand or use language in the way that humans do.” Instead, she added, scientists are interested in using communication tools to “improve the welfare of captive animals and their relationships to their caretakers.”
