What happens when we judge a face? Yu et al. (2023) review how ambiguity, emotion, and social traits shape our interpretation of faces, across neural levels and recording modalities.
How do we decode a face? In a comprehensive review published in the Annals of the New York Academy of Sciences, Yu et al. (2023) synthesize multimodal research on how humans process emotional expressions and judge social traits in faces. The review integrates data from EEG, fMRI, single-neuron recordings, and personality-based assessments to capture the complexity of facial perception.
Two focal areas structure the work: emotion ambiguity and social trait evaluation. For ambiguous emotional expressions, the authors identify a distinct event-related potential (ERP) signature and show that neurotypical and autistic individuals exhibit different neural responses to ambiguity, pointing to the cognitive mechanisms behind social interpretation. For social traits, imaging findings and trait-dimension analyses show how people infer characteristics such as trustworthiness, dominance, or warmth, often within milliseconds.
What sets this review apart is its integration of findings across neural levels — from population-wide brain imaging to single-neuron resolution — and its comparative lens on neurotypical versus autistic perception. The paper also advocates for computational modeling to bridge behavioral observations with neural correlates, proposing new best practices for future research on human social cognition.
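To make the computational-modeling point concrete, here is a minimal, hypothetical sketch of one common approach in this literature: fitting a psychometric model to categorization of morphed, ambiguous expressions, whose fitted parameters can then be related to neural measures across groups. This is not taken from the review; the paradigm, numbers, and parameter names below are invented for illustration.

```python
# Hypothetical illustration (not from Yu et al., 2023): fit a logistic
# psychometric function to choices on morphed (ambiguous) facial expressions.
# Fitted parameters such as the category boundary and slope could then be
# compared with neural measures (e.g., ERP amplitudes) across participants.
import numpy as np
from scipy.optimize import curve_fit

def logistic(morph_level, boundary, slope):
    """Probability of judging a morph as 'happy' given its fear-happy level."""
    return 1.0 / (1.0 + np.exp(-slope * (morph_level - boundary)))

# Simulated behavioral data: morph levels from 0 (fear) to 1 (happy) and the
# proportion of 'happy' responses at each level (made-up numbers).
morph_levels = np.linspace(0.0, 1.0, 9)
p_happy = np.array([0.02, 0.05, 0.12, 0.30, 0.55, 0.74, 0.90, 0.96, 0.99])

# Fit the two free parameters; p0 gives rough starting guesses.
(boundary, slope), _ = curve_fit(logistic, morph_levels, p_happy, p0=[0.5, 5.0])

print(f"category boundary ~ {boundary:.2f}, slope ~ {slope:.1f}")
```

Per-participant parameters fitted this way could then be regressed against neural correlates, which is the kind of model-based bridge between behavior and brain data that the review advocates.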
As research on AI, social robotics, and dog-human communication grows more nuanced, this review may hold relevance well beyond human faces, particularly for designing emotionally aware systems and decoding cross-species emotional cues.
Multimodal investigations of emotional face processing and social trait judgment of faces.
Published in the Annals of the New York Academy of Sciences, November 2023