Facial microexpressions: what they are and what they tell us


Is the study of microexpressions a useful tool in listening to people?

Facial microexpressions are very short-lived facial expressions that, when combined with other physiological responses, can provide information about basic emotions. This quantitative measurement falls into the realm of what we listen to, thanks to applied neuroscience, on a psychophysiological level. Together with sweating, brain waves, eye movements and other signals, it lets us investigate the unconscious and our emotions in relation to different types of experience, whether related to a product, a service, or a moment. We also combine this type of listening with more traditional listening: gathering valuable information from what people perceive and have to say about an experience.

Starting from these assumptions, it is important to ask ourselves how to include the study of these psychophysiological responses in our listening method. Today we ask ourselves: what value do facial microexpressions have? What information do they give us?

But let’s start with a broader overview in order to answer these questions.

What are facial microexpressions?

Facial expressions are an integral part of body language, the main component of non-verbal expression which, according to some linguists, accounts for more than 90% of our actual communication. If they simply provided information redundant with verbal language, they would not be studied or used in research in scientific fields such as neuroscience, psychology and the behavioral sciences [1].

Today we know which brain areas can send electrical impulses to the face to contract or relax the facial muscles, and we distinguish, in particular, two types of expressions:

  • Some are conscious, such as those produced by impulses sent from the motor cortex and thus resulting from voluntary efforts to reproduce a particular expression;
  • Others are involuntary, out of the control of the person: the result of impulses sent from lower areas of the brain, following an emotional stimulation.

Since many facial expressions are difficult to modulate voluntarily, they provide us with valuable information about the emotional sphere of our interlocutor.
The scientific study of the expression-emotion relationship began with Charles Darwin’s The Expression of the Emotions in Man and Animals [2], which documented a series of emotions capable of arousing universal expressions in humans and other animals.

Among the first to demonstrate how facial expressions could be used to obtain information beyond language and related to the emotional sphere were Paul Ekman and Wallace V. Friesen. In particular, they defined facial microexpressions as manifestations characterized by the complete muscular movements typical of macroexpressions, but much more rapid [3] and consequently more difficult to control voluntarily: the duration of a microexpression in fact ranges between 1/25 and 1/15 of a second. Paul Ekman identified six primary emotional macrocategories, that is, emotions that are innate, universal, and manifested through the face, postural reactions and voice parameters:

  • happiness
  • fear
  • anger
  • disgust
  • sadness
  • surprise

The six basic emotions relating to facial microexpressions

Can facial microexpressions be read?
The Facial Action Coding System

The first to classify facial expressions into a series of patterns of muscle contractions was the Swedish anatomist Carl-Herman Hjortsjö [4]. They were later recoded by Paul Ekman and Wallace V. Friesen, who tried to provide a more solid basis for what the different facial actions could mean, creating the Facial Action Coding System (FACS). FACS is still the most comprehensive, psychometrically rigorous, and widely used system for describing facial activity in terms of action units (AUs), the basic actions of individual muscles or muscle groups [5]. AUs are identified by a number, they include the anatomical basis of each action, and they are rated on a 5-point intensity scale:

  • A = trace
  • B = slight
  • C = marked or pronounced
  • D = severe or extreme
  • E = maximum

Although the Facial Action Coding System does not include specific descriptors for emotions, it is commonly used to interpret the link between nonverbal communicative signals, given by facial expressions, and emotions or other human states [6]; there are, however, specific related resources, such as EMFACS (emotional FACS), the FACS Investigators’ Guide [7] and the FACSAID [8] interpretive database, used to make inferences about expressed emotions by considering single AUs or combinations thereof. A clear limitation of FACS, however, is that its assessment of facial movements fails to account for other facial changes or phenomena such as:

  • changes in muscle tone
  • sweating
  • change in skin coloration
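The EMFACS-style inference mentioned above can be sketched as a simple lookup from prototype AU combinations to basic-emotion labels. The mappings below are commonly cited prototypes from the literature, used here only as an illustration, not the official EMFACS database.

```python
# Illustrative EMFACS-style lookup: prototype AU combinations commonly
# associated with the six basic emotions (example mappings, not the
# official EMFACS database).
PROTOTYPES = {
    frozenset({6, 12}): "happiness",
    frozenset({1, 4, 15}): "sadness",
    frozenset({1, 2, 5, 26}): "surprise",
    frozenset({9, 15, 16}): "disgust",
    frozenset({4, 5, 7, 23}): "anger",
    frozenset({1, 2, 4, 5, 7, 20, 26}): "fear",
}

def infer_emotion(observed_aus):
    """Return the emotion whose AU prototype exactly matches the observed set."""
    return PROTOTYPES.get(frozenset(observed_aus), "no prototype match")

print(infer_emotion([6, 12]))        # happiness
print(infer_emotion([4, 5, 7, 23]))  # anger
```

An exact-match lookup also makes the ambiguity problem discussed below tangible: any AU set not in the table, or shared across patterns, yields no unique emotion.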

The study of facial microexpressions in response to a commercial stimulus

Experiences generate emotions that shape people’s attitudes towards brands and products [9], just as they determine the effectiveness of advertisements [10]. It is therefore extremely important to acquire and measure information about the emotional reactions of those who see television commercials, videos on social networks or any other form of product or brand communication. From this point of view, “neuromarketing”, or more correctly, neuroscience applied to marketing, seeks to answer a question: are there methods that provide quantitative answers and objective indicators in this regard?

Among the physiological indicators acquired to investigate the emotional state of a subject, we can consider heart rate (HR), the electrodermal response (galvanic skin response, GSR) and regional brain responses (electroencephalography, EEG). This is where the Facial Action Coding System comes in, having become increasingly popular in recent years, also in marketing contexts.

In this regard, however, several issues have emerged to date related to the exclusive use of FACS to measure a person’s experience of purchasing, using or relating to a brand or product. The critical issues intrinsically linked to this technique are:

  • its discrete character in time: what we analyze with FACS are the facial expressions characterizing the person’s face in single instants. In this way, the dynamics of the reaction being observed cannot be captured;
  • the ambiguity of some mimic patterns: the classified patterns of muscular contractions are ambiguous because they are not all univocally linked to a particular emotion [11].

Further undermining its objectivity and validity are other factors:

  • the participants’ awareness that their facial expressions will be recorded;
  • the training of the coder and/or of the coding system for such expressions;
  • the validation of the coder or of the automatic coding system [12].

In fact, whether FACS is applied manually or automatically, the facial behavior recorded on video must be analyzed frame by frame. In conclusion, we can affirm that to analyze an advertisement or a product whose capacity to stimulate an emotional response is still undefined or unknown, the use of FACS alone is not sufficient: it must be complemented by further neurophysiological indices [12].
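The frame-by-frame constraint can be made concrete with a back-of-the-envelope calculation: given the 1/25 to 1/15 second duration cited earlier, a microexpression spans only a frame or two at standard video frame rates. The frame rates below are illustrative choices, not from the article.

```python
# Back-of-the-envelope check: how many video frames does a
# microexpression span at common frame rates? Durations follow the
# 1/25 to 1/15 second interval cited in the text.
DURATION_MIN = 1 / 25   # shortest microexpression, in seconds (0.04 s)
DURATION_MAX = 1 / 15   # longest microexpression, in seconds (~0.067 s)

for fps in (24, 30, 60, 120):
    frames_min = DURATION_MIN * fps
    frames_max = DURATION_MAX * fps
    print(f"{fps:>3} fps: {frames_min:.1f} to {frames_max:.1f} frames")
```

At 24 or 30 fps a microexpression covers at most about two frames, which suggests why higher frame rates make frame-by-frame coding, whether manual or automatic, more reliable.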

User test with FACS application for reading facial microexpressions

Books on facial microexpressions and curiosities

To learn more and read about facial microexpressions directly from the person who defined them, we suggest some of the popular books written by Paul Ekman, such as Telling Lies, Emotion in the Human Face and Emotions Revealed, as well as consulting the Paul Ekman Group website.
Ekman’s contribution to the Disney-Pixar film Inside Out is also precise and engaging. His collaboration with the director Pete Docter gave rise to the film’s main characters, the primary emotions of the child protagonist, whose expressions are faithfully based on those codified in FACS and accompany the spectator in the discovery of their own emotional world.


Bibliographic references

[1] Cohn, J. F. et al., 2007, Observer-based measurement of facial expression with the facial action coding system.

[2] Darwin, C. 1998. The Expression of the Emotions in Man and Animals, 3rd edit. Introduction, afterwords, and commentaries by Paul Ekman. Harper Collins. London (US edit.: Oxford University Press. New York).

[3] Ekman, P., Friesen, W. V., 1969. Nonverbal Leakage and Clues to Deception.

[4] Hjortsjö, Carl-Herman. 1969. Man’s face and mimic language. Sweden: Studentlitteratur Lund.

[5] Clark, E. A. et al., 2020, The Facial Action Coding System for Characterization of Human Affective Response to Consumer Product-Based Stimuli: A Systematic Review.

[6] Valstar, M. F., and Pantic, M., 2006, Biologically vs. logic inspired encoding of facial actions and emotions in video.

[7] Ekman P. et al., 2002, The Facial Action Coding System: A Technique for the Measurement of Facial Movement.

[8] Ekman, P. et al., 1998, Facial Action Coding System Interpretive Database (FACSAID).

[9] Bagozzi, R. P. et al., 1998, Goal-Directed Emotions.

[10] Percy, L., Rossiter, J. R., 2001, Strategic Advertising Management.

[11] Greppel-Klein, A. et al., 2010, Measurement of Emotions Elicited by Advertising.

[12] Clark, E. A. et al., 2020, The Facial Action Coding System for Characterization of Human Affective Response to Consumer Product-Based Stimuli: A Systematic Review.

29 November 2021 Alessandra Spina
