Preprint: Temporal processing of facial expressions of mental states

I had the pleasure of working with a small team of researchers on understanding human emotion recognition, and specifically the different types of cognition happening in the visual system. You can read our preprint on bioRxiv.

Authors: Gunnar Schmidtmann, Maiya Jordan, Joshua T Loong, Andrew J Logan, Claus-Christian Carbon, Ian Gold

Abstract

Faces provide not only cues to an individual’s identity, age, gender and ethnicity, but also insight into their mental states. The ability to identify the mental states of others is known as Theory of Mind. Here we present results from a study aimed at extending our understanding of differences in the temporal dynamics of the recognition of expressions beyond the basic emotions at short presentation times ranging from 12.5 to 100 ms. We measured the effect of variations in presentation time on identification accuracy for 36 different facial expressions of mental states based on the Reading the Mind in the Eyes test (Baron-Cohen et al., 2001) and compared these results to those for corresponding stimuli from the McGill Face database, a new set of images depicting mental states portrayed by professional actors. Our results show that subjects are able to identify facial expressions of complex mental states at very brief presentation times. The kind of cognition involved in the correct identification of facial expressions of complex mental states at very short presentation times suggests a fast, automatic Type-1 cognition.
