This approach revealed subtle differences in the cadence of each coda, where the time between clicks slowed or quickened. Borrowing a term from classical music, the researchers dubbed these variations rubatos. The system also revealed instances where whales added an extra click to the end of a coda. This “ornamentation,” as the researchers called it, seemed to carry meaning.
Subtle variations in rhythm, tempo, ornamentation, and rubatos reminded Sharma and her colleagues of phonemes, the fragments of sound that humans combine and recombine into words. It’s possible these codas are the basis of a complex language. Most of these nuances had not been distinguishable until now.
Before joining CETI, Sharma had considered earning a Ph.D. in robotics. She had never studied animals. She had never even seen a whale. “One of the cool things about Project CETI,” says Jacob Andreas, a natural language processing expert at MIT and one of the project’s researchers, “was getting a bunch of people who really think of themselves as computer scientists actually involved in this project of understanding animal communication.”
This kind of work isn’t limited to whales. Across the natural world, scientists and researchers are increasingly turning to artificial intelligence for help understanding the interior lives of animals, as well as the habitats that sustain them—oceans, forests, even commercial farms. Still mysterious in many ways, AI is already enabling a very human connection with other living things—and, perhaps, a new way of thinking about the planet’s future.
Since 2020, Project CETI has brought together experts from varying disciplines and institutions, like these researchers at Harvard’s Science and Engineering Complex, to analyze groups of whale clicks called codas.
Photograph by Spencer Lowell
Suresh Neethirajan works at the cutting edge of another kind of computer-enabled animal interaction. A professor of computer science and agriculture at Dalhousie University in Nova Scotia, Canada, he studies how farmers can use real-time monitoring to interpret what different behaviors really mean.
Neethirajan grew up with livestock on a dairy farm in south India. His parents considered their cows partners in the endeavor—the humans were in charge, but they didn’t have a livelihood without the creatures and their milk. And so, when the cows stopped producing, they weren’t sent to slaughterhouses. They lived out their days on the farm as a thank-you for their service. It’s a “social-economic belief system,” Neethirajan says.
Neethirajan, who doesn’t eat meat, began studying the inner lives of farm animals about a decade ago—chickens, cows, horses, sheep, and pigs. As a “classically trained agricultural engineer and partially trained animal scientist,” he says, he wondered how he might use technology to improve their quality of life.
First, he had to collect data. He monitored body temperature, cortisol levels, hormones, and respiration and heart rates with biosensors as well as blood, stool, and hair samples. Then he paired that with audio and video footage, and added context, like an animal receiving food (positive) or hearing a startling noise (negative). The goal: to understand what it looks like when an animal is comfortable or unwell.
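The workflow described above amounts to building a labeled, multimodal dataset: sensor readings paired with a context judged positive or negative, from which a “comfortable” baseline can be estimated. A minimal sketch of that idea, using hypothetical field names and toy numbers rather than Neethirajan’s actual schema:

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical record pairing biosensor readings with a labeled context,
# loosely modeled on the setup described above (not the real data format).
@dataclass
class Observation:
    animal_id: str
    heart_rate_bpm: float
    body_temp_c: float
    context: str   # e.g. "fed" (positive) or "startled" (negative)
    valence: int   # +1 for a positive context, -1 for a negative one

def comfort_baseline(observations, animal_id):
    """Average vitals across an animal's positive-context observations."""
    positives = [o for o in observations
                 if o.animal_id == animal_id and o.valence > 0]
    if not positives:
        return None
    return (mean(o.heart_rate_bpm for o in positives),
            mean(o.body_temp_c for o in positives))

obs = [
    Observation("cow-7", 62.0, 38.6, "fed", +1),
    Observation("cow-7", 80.0, 39.2, "startled", -1),
    Observation("cow-7", 60.0, 38.4, "fed", +1),
]
print(comfort_baseline(obs, "cow-7"))  # ≈ (61.0, 38.5)
```

Readings that drift far from an animal’s own baseline are what flag it as possibly unwell.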
Seven years ago, Neethirajan began processing his data with AI, including a deep learning model that performs facial recognition and gait analysis in livestock. Like Sharma and the MIT team, he uses natural language processing tools to understand animal vocalization. His analysis can pinpoint the specific squawk that chickens make before they leave a room. Now, he explains, video footage of a barn full of 5,000 chickens can be put into the model, and within a few minutes it can identify the five birds most likely to be sick.
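The final step of that pipeline—ranking a barn of birds by how likely each is to be sick—reduces to a top-k selection over per-bird scores. A sketch under the assumption that a video classifier (out of scope here) has already produced a sickness-likelihood score per bird; the IDs and scores are invented:

```python
import heapq

def flag_likely_sick(scores: dict, k: int = 5) -> list:
    """Return the k bird IDs with the highest sickness scores."""
    return heapq.nlargest(k, scores, key=scores.get)

# Toy stand-in for classifier output: bird-10 scores highest.
scores = {f"bird-{i}": 0.01 * i for i in range(1, 11)}
print(flag_likely_sick(scores, k=3))  # ['bird-10', 'bird-9', 'bird-8']
```

`heapq.nlargest` keeps the selection efficient even across thousands of birds, since it never fully sorts the score table.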
Neethirajan’s work is an argument for why animal welfare is valuable even in an industry that doesn’t always prioritize the well-being of its living product. Identifying disease early prevents both suffering and financial losses. And there is research that shows a happier animal is a more productive animal; cows that live in a positive environment give more and better milk. Farms can act on this information. “They are thinking beings,” Neethirajan says of animals. “They have their own likes and dislikes.” He found chickens make fewer noises of distress when their habitats are cleaned with greater frequency, because they can breathe more easily—so why not increase the cleaning schedule?
“As human beings,” he says, “we exist with the plant kingdom, animal kingdom, and [other] humans, and our population is growing enormously … How do we peacefully coexist? How do we create harmony?” For this new band of AI-assisted researchers, these are no longer just existential questions, and they have led to an even grander experiment.
Jörg Müller heads conservation at the Bavarian Forest National Park, Germany’s oldest national park, and teaches forest ecology at the University of Würzburg. But his research takes him halfway around the world, to South America, where he’s developing a new kind of AI-enhanced “stethoscope” for monitoring tropical ecosystems that were cleared for agriculture. It’s relatively easy to survey the regrowth of forest canopy with satellites and remote sensing. It’s much harder to know how long it takes to recover native biodiversity—the flourishing creatures and plants living below the canopy. Müller works with statisticians, entomologists, ornithologists, and local communities in Ecuador to identify which signals he can trace as evidence that revitalization efforts are working.
Source: https://www.nationalgeographic.com/science/article/ai-animal-language-whale-calls
Publish date: 2024-10-15