Cats Can Hide Their Pain—But Not from AI

Machine-learning software gets behind the inscrutable feline face and may improve pet care

[Image: A black and white cat with one yellow eye and one blue eye]

Household cats are a secretive species. Unlike dogs, they are masters at masking their feelings and intentions—possibly because of their evolutionary history as solitary hunters. This built-in stoicism makes it hard for cat owners and veterinarians to read signs of pain in a cat’s facial expressions and behaviors, but new artificial intelligence programs may be able to finally peer behind the mask.

A team of AI researchers and veterinarians has created and tested two machine-learning algorithms that judged whether cats being treated in a veterinary hospital were experiencing pain based on the animals’ facial expressions. These automated systems, described in a recent Scientific Reports paper, were up to 77 percent accurate, suggesting the potential for powerful new veterinary tools. The study was co-led by Marcelo Feighelstein, then at Israel’s University of Haifa, and Lea Henze, then at the University of Veterinary Medicine Hannover in Germany.

The investigators plan to develop a mobile app that will let both veterinarians and cat owners snap a photograph to automatically detect pain, says Anna Zamansky, a computer scientist at the University of Haifa, who, along with Holger Volk of University of Veterinary Medicine Hannover, was a co-senior author of the paper. Although other AI developers have tried to unravel the secrets of feline emotions (an app called Tably, launched in 2021, also claims to do so), Zamansky says this study is the first to publish peer-reviewed scientific research about it.


Veterinarians currently measure feline pain using complex tests such as the Glasgow Composite Measure Pain Scale, which requires painstakingly examining an animal’s facial expressions and behaviors. Although scientifically validated, these scales rely on a veterinarian’s subjective assessment and are highly time-consuming. This discourages the use of such tests, says Stephane Bleuer, a veterinary behaviorist in Tel Aviv, who was not involved in the paper.

“Our belief is that the machine will do a better job,” Zamansky says of her team’s project. “The machine can see more than the naked human eye because it’s sensitive to subtle details of visual information.”

To develop the new model, the researchers needed data to train and test it. Photographs of 84 cats of various breeds and ages with varying medical histories were taken at the University of Veterinary Medicine Hannover’s animal hospital in Germany as part of standard care. The cats in these images had been scored based on the Glasgow scale and on the expected level of pain from their known clinical conditions—such as bone fractures or urinary tract problems. These measurements were used to train the team’s AI models and to evaluate their performance. The study authors say that none of their research inflicted any suffering on the cats.

The researchers created two machine-learning algorithms that could detect pain based on the cat photographs alone. One algorithm looked at the amount of facial muscle contraction (a common pain indicator) by using 48 “landmarks” involving the ears, eyes and mouth. The other algorithm used deep-learning methods for unstructured data to analyze the whole face for muscle contractions and other patterns.

The landmark-based AI approach was 77 percent accurate at identifying whether a cat was in pain, whereas the deep-learning approach came in at only 65 percent. The researchers say this difference could stem from deep-learning systems being “data-hungry”; only a relatively small data set of images was available for this study.

The researchers also found that the cat’s mouth, rather than the ears or eyes, was the most important facial feature for accurate pain recognition, says study co-author Sebastian Meller, a veterinarian at the University of Veterinary Medicine Hannover. “We didn’t expect that, and that is also the beauty about AI, maybe,” Meller says. “It finds something in the forest of data that suddenly makes a difference that no one was thinking about before.”

It is important to distinguish between facial expressions and emotions, however, says Dennis Küster, a German psychologist with a background in emotion science, who was not involved in the study. Tests with humans have shown that AI tends to recognize facial patterns and not necessarily the meanings behind them, he explains. Moreover, a facial expression may not always be associated with a particular emotion. “The best example is the social smile. So I might be smiling now, but maybe I just want to be friendly and indicate…, ‘Yeah, okay, let’s continue with this interview,’” Küster says. “We express certain things automatically, and they don’t necessarily mean that we are flowing over with happiness.”

Nevertheless, there are some contexts where emotion-recognition AI can excel, he adds. Cats and other nonhuman species cannot vocalize what they are thinking or feeling, making it important for researchers to develop systems that can cross those communication barriers, says Brittany Florkiewicz, an assistant professor of psychology at Lyon College, who was not involved in the study. AI is only as good as the data it is fed, she notes. So ensuring the data set is large, diverse and human-supervised—and that it contains contextual and nuanced information—will help make the machine more accurate, Florkiewicz says.

Florkiewicz recently found that cats can produce 276 facial expressions. She plans to collaborate with Zamansky’s team to gain deeper insights into felines’ emotional lives that will go beyond assessing whether or not they are in pain. Zamansky also plans to expand her research to include other species, including dogs, and to see whether automated systems can judge feline pain based on full-body videos.

Once a cat shows obvious signs of pain, it has probably been suffering for a long time; a convenient and practical pain app might allow for quicker detection of problems and could significantly advance cat care, Bleuer says. “When you improve the welfare of pets, you improve the welfare of people,” he says. “It’s like a family.”

This study focused on crossing interspecies communication barriers, and Zamansky points out that the researchers first had to overcome human ones: The international team members speak different languages, live in different countries and work in different disciplines. They are AI researchers, veterinarians, engineers and biologists. And their efforts ultimately aim to help a broad group of creatures encompassing cats, vets and pet owners. That effort led at least one researcher to cross a barrier of her own.

“Before we started this work, I was [completely a] dog person, but now I want to have a cat,” Zamansky says. “I think I fell in love with cats a bit.”

Editor’s Note (12/11/23): This article was edited after posting to clarify that Marcelo Feighelstein and Lea Henze co-led the study and that Holger Volk was a co-senior author.

Leila Okahata is a science writer based in Los Angeles. She previously worked in communications at the Allen Institute, a bioscience research nonprofit, and as a reporter for the Daily Bruin, the independent student newspaper of the University of California, Los Angeles. Follow her on LinkedIn and on X (formerly Twitter) @leilaokahata.
