Observer-Generated Maps of Diagnostic Facial Features Allow Categorization of Expressions of Emotion
Wed-P12-Poster III-202
Presented by: Martin Wegrzyn
Objective: Facial expressions of emotion can be reliably categorized into happiness, disgust, anger, sadness, fear and surprise. For each expression, a different set of Action Units (AUs) is assumed to be engaged. We set out to create maps of diagnostic facial features by analyzing which parts of the face observers highlight as useful.
Methods: 202 participants were asked to categorize facial expressions and to highlight diagnostic areas by clicking on them. The faces were shown either fully or masked above/below the nostrils. Patterns of highlighted face parts in full faces and the number of correct answers in masked faces were analyzed.
Results: When highlighting face parts, observers showed an overall tendency to focus on the eye and mouth regions. However, the highlighted patterns for each expression deviated from this global pattern in a unique way, revealing maps of diagnostic regions. These observer-generated maps of facial features allowed accurate prediction of which expression a participant had seen. When the maps were parcellated by AUs, the AUs considered most important for expressing an emotion received the highest number of clicks.
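The abstract does not specify how the expression was predicted from the click maps. As an illustration only, the idea can be sketched as a nearest-centroid classifier over per-trial click heatmaps, where each trial is assigned to the expression whose average map it correlates with best; the grid size, hotspot layout, and noise model below are all hypothetical.

```python
import numpy as np

# Hypothetical sketch: classify simulated click heatmaps by correlating
# each trial with per-expression average maps (nearest centroid).
rng = np.random.default_rng(0)
EXPRESSIONS = ["happiness", "disgust", "anger", "sadness", "fear", "surprise"]
H, W = 8, 6  # coarse face grid (assumption, not from the study)

def make_trials(n_per_expr=20):
    """Simulate click maps: each expression gets its own diagnostic hotspot."""
    trials, labels = [], []
    for k, _ in enumerate(EXPRESSIONS):
        base = np.zeros((H, W))
        base[k % H, (2 * k) % W] = 5.0  # expression-specific diagnostic region
        for _ in range(n_per_expr):
            trials.append(base + rng.poisson(0.5, (H, W)))  # click noise
            labels.append(k)
    return np.array(trials), np.array(labels)

def predict(trial, centroids):
    """Pick the expression whose mean map correlates best with this trial."""
    corrs = [np.corrcoef(trial.ravel(), c.ravel())[0, 1] for c in centroids]
    return int(np.argmax(corrs))

trials, labels = make_trials()
centroids = [trials[labels == k].mean(axis=0) for k in range(len(EXPRESSIONS))]
accuracy = np.mean([predict(t, centroids) == y
                    for t, y in zip(trials, labels)])
print(f"accuracy: {accuracy:.2f} (chance = {1/6:.2f})")
```

With distinct hotspots per expression, such a classifier performs well above the one-in-six chance level, which mirrors the above-chance prediction reported in the abstract without implying this was the authors' method.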
Conclusions: The study confirms the critical role of specific parts of the face for recognizing each emotion. The patterns of diagnostic features which the observers generated by clicking on the faces allowed consistent above-chance prediction of the actual expression of the face and included key AUs. These distinct observer-generated patterns can help generate hypotheses about how the task of categorizing facial expressions is performed.
Keywords: Faces, Facial Expressions, Face processing, Face recognition, Categorization