Inscrutable
Neuroscientist Lisa Feldman Barrett comments in Scientific American about the supposed correlation between facial expressions and emotions.
In short, she argues that these correlations are unreliable. “In reality, people express emotion with tremendous variability. In anger, for example, people in urban cultures scowl only about 35 percent of the time…. Scowls are also not specific to anger because people scowl for other reasons, such as when they are concentrating or when they have gas.”
Researchers often use AI to assess the emotional impact of video advertising via facial expressions. Barrett argues that such techniques “do not detect emotions. They detect physical signals, such as facial movements, not the psychological meaning of those signals.”
Part of the issue is the variability that Barrett describes. Also, these tools cannot reveal what is causing a given facial expression. Imagine a political ad in which Candidate X makes a scurrilous accusation about Candidate Y. A viewer’s face might show disgust. Even if you trust that reading, is the viewer disgusted by the scandalous behavior of Candidate Y, or disgusted that Candidate X would dare make such an accusation?
The feelings associated with emotions are also complex. (OZ’s Kathy Shaw recommends Brené Brown’s book, Atlas of the Heart, which discusses the taxonomy of emotional experiences.) You can experience shock, for example, very positively or very negatively. Anxiety and excitement feel the same in our bodies; we just assign them different labels. Nostalgia is often a mix of happiness and sadness. Boredom and regret can manifest themselves similarly on someone’s face.
Also, research suggests that facial recognition software assigns more negative emotions to Black faces than to white faces.
If facial coding is used to understand advertising, it should be one part of a much larger suite of tools that can reveal more about what those expressions really mean.