What happens if you take a person's happy face, put it on an angry-looking body and place that in front of a disgusting scene? What do people see? What emotion do people perceive when you mix and match these different cues?
These are some of the questions Department of Psychological Sciences Professor Eric Walle and Brigham Young University (BYU) Professor Peter Reschke (UC Merced '18) explored in their study titled "The Unique and Interactive Effects of Faces, Postures and Scenes on Emotion Categorization." The paper was recently published in the journal Affective Science.
Most research on this subject over the past 30 to 40 years has focused on facial expressions. That may come as no surprise, given that the expression on someone's face can be the strongest cue for perceiving emotion. The researchers in this study went further, however, looking not only at people's faces but also at their postures and the scenes surrounding them.
A total of 80 undergraduate students — recruited from UC Merced and BYU — participated in the study. They were asked to look at images of people displaying six different facial expressions: anger, disgust, fear, joy, sadness and neutral. The images also depicted a total of 24 postures, set against scenes ranging from a cemetery to a birthday party. Each facial expression, posture and scene was evaluated independently, then the cues were mixed and matched to create various combinations. The students were asked to categorize the person's emotion in each image.
The research showed that the face remains a very strong cue for sensing a person's emotions: a little more than 60% of the emotion categorizations matched the emotion of the face. In the remaining nearly 40% of responses, however, participants labeled an emotion that was not in the face.
What this study found is that the posture and scene can change how people perceive facial expressions, even when the participants are directed to the face and asked about the emotion portrayed.
"When these cues are combined, it's something greater than just the sum of its parts," Walle said. "Emotions aren't just facial expressions, but they're also not just postures and not just scenes. The idea is that all of these features interact."
Moreover, for nearly 10% of the images, students looked at the composite scenes and reported an emotion that was not present in any of the individual cues — they saw an emotion that wasn't there. Walle explained that emotion perception is not an additive phenomenon but an emergent one, similar to the way hydrogen and oxygen behave when they come together: combined, they create something qualitatively distinct from either element on its own.
This study was a major undertaking, and the researchers are already working on a follow-up investigation. Walle said participants were asked to look at confusable images and explain what they saw. Interestingly, respondents described what they perceived, such as a backstory for the person in the image and what that person might be thinking.
"These narratives we make up when we see these emotional contexts, that's what we're trying to unpack now," Walle said. "There's more to it than just 'Is this person angry or disgusted?'"