Cartoonists have captivated generations by humanising mice, from the enigmatic Mickey Mouse and charming Stuart Little to the smooth-talking Speedy Gonzales and wily Jerry, who continually outsmarts Tom, the dumb housecat.
Turns out, they might have been onto something – at least when it comes to the little critters having emotions – according to research published in the journal Science.
Back in 1872, Charles Darwin proposed that emotions in animals and humans are universal and innate, and that they are best understood through facial expressions.
Humans clearly use the same expressions to convey emotion. For instance, disgust makes us wrinkle our nose, narrow our eyes and curl our upper lip; if we’re happy we smile, and if something makes us sad the corners of our mouths droop.
Even newborn babies show sadness, happiness and disgust with recognisable facial expressions.
But studying the facial expressions of animals such as non-human primates, dogs, horses and sheep to understand their emotions has been laborious, prone to bias in human scoring and hard to reproduce.
By applying machine learning to the facial expressions of mice, researchers at the Max Planck Institute of Neurobiology, Germany, were able to identify distinct emotional states of pleasure, disgust, nausea, pain and fear, and even their relative strength.
Importantly, the study showed these weren’t merely a reaction to a sensory stimulus, but reflected the inner, underlying emotion.
“Mice that licked a sugar solution when they were thirsty showed a much more joyful facial expression than satiated mice,” explains senior author Nadine Gogolla.
Similarly, a slightly salty solution produced a “satisfied” expression, while too much salt led to a “disgusted” face.
The challenge of developing a reliable test of emotions from facial expressions has so far thwarted efforts to study their neurological origins.
With their new method in hand, the neurobiologists were able to investigate the underlying brain activity. They could even elicit different facial expressions by using light to activate specific brain areas linked to emotional processing, a technique known as optogenetics.
This opens new doors for understanding the neurobiology of emotions.
“With our automated face recognition system,” says lead author Nejc Dolensek, “we can now measure the intensity and nature of an emotion on a timescale of milliseconds and compare it to the neuronal activity in relevant brain areas.”
The discovery that mice have emotions is unlikely to stop researchers from using them as, well, guinea pigs. Gogolla certainly wants to use them to further understand emotions and what goes wrong in disorders like anxiety and depression.
“By recording facial expressions,” she says, “we can now investigate the fundamental neuronal mechanisms behind emotions in the mouse animal model.”
In a related commentary, Benoit Girard and Camilla Bellone, from the University of Geneva, Switzerland, acknowledge the new opportunities offered by this research.
They even take it a step further, musing whether the approach “will allow sufficient understanding of emotions to build robots that can read and react to human emotion to better interact in our society”.