🤖 AI Summary
This study investigates the mechanisms underlying emotional expression and perception in musical performance, examining how repertoire type, tonal practice, and improvisation—along with expressive proficiency—affect performers’ emotional transmission and listeners’ responses. Employing a multimodal approach, we integrated acoustic analysis of audio recordings, subjective emotion labeling, and neurophysiological measurements (EEG and heart rate variability, HRV), while developing a cross-subject emotional consistency evaluation framework. Results reveal, for the first time, that improvisational performance exhibits a distinct acoustic signature associated with significantly enhanced emotional resonance in listeners and robust neurophysiological relaxation effects—including increased alpha-band power and elevated HRV—effects also observed in highly expressive (non-improvised) performances. These findings establish expressivity as a critical determinant of music’s emotional communicative efficacy and provide novel empirical evidence for the neural underpinnings of emotional processing in improvisational music.
📝 Abstract
This study investigates emotional expression and perception in music performance using computational and neurophysiological methods. The influence of different performance settings, such as repertoire, diatonic modal etudes, and improvisation, as well as levels of expressiveness, on performers' emotional communication and listeners' reactions is explored. Professional musicians performed various tasks, and emotional annotations were provided by both performers and the audience. Audio analysis revealed that expressive and improvisational performances exhibited unique acoustic features, while emotion analysis showed stronger emotional responses. Neurophysiological measurements indicated greater relaxation in improvisational performances. This multimodal study highlights the significance of expressivity in enhancing emotional communication and audience engagement.