PlumX Metrics

Affective prosody guides facial emotion processing

Current Psychology, ISSN: 1936-4733, Vol: 42, Issue: 27, Page: 23891-23902
2023
  • Citations: 1
  • Usage: 0
  • Captures: 7
  • Mentions: 0
  • Social Media: 0


Article Description

Previous studies have reported an “emotional congruency effect (ECE)” in cross-modal emotion processing, claiming that congruent multimodal emotional signals enhance emotion processing, yet few studies have shown how this effect unfolds over time or whether it operates in the same way across language and cultural backgrounds. We used eye tracking to investigate whether and how auditory emotional signals influence the visual processing of emotional faces, as predicted by the ECE. Thirty-two native Mandarin speakers scanned a visual array of four types of emotional faces while listening to affective prosody matching one of the four emotions. To eliminate potential confounds from lexico-semantic information, the affective prosody was pronounced in meaningless di-syllable clusters. The results indicate that (1) participants paid more attention to happy faces at first glance, and their attention shifted to angry and sad faces over time; and (2) consistent with findings in English-speaking settings, the ECE appeared in Mandarin-speaking settings, but it took effect earlier for happy faces and persisted across all emotions as the signal unfolded. Based on these results, we conclude that processing time differs across emotion types, and the ECE therefore takes effect at different time points depending on the emotion type. Finally, we suggest that language and cultural experience may shape the processing time of different emotions.
