Research Area:  Machine Learning
Emotion recognition in conversations is crucial for building empathetic machines. Existing work in this domain does not explicitly model the inter-personal influences that shape the emotional dynamics of dialogues. To this end, we propose the Interactive COnversational memory Network (ICON), a multimodal emotion-detection framework that extracts multimodal features from conversational videos and hierarchically models self- and inter-speaker emotional influences into global memories. These memories generate contextual summaries that aid in predicting the emotional orientation of utterance-videos. Our model outperforms state-of-the-art networks on multiple classification and regression tasks on two benchmark datasets.
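The abstract's pipeline (model the context utterances into a global memory, attend over that memory with the current utterance to form a contextual summary, then classify emotion) can be illustrated with a minimal sketch. All names, dimensions, and the simplified recurrent cell below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

DIM = 8        # fused multimodal feature size (assumed)
N_CLASSES = 4  # number of emotion classes (assumed)
CONTEXT = 5    # preceding utterances kept in memory (assumed)

def rnn_step(h, x, W, U, b):
    """One simplified recurrent update (a tanh RNN cell standing in for a GRU)."""
    return np.tanh(W @ x + U @ h + b)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def icon_sketch(context_feats, query_feat, params):
    """Encode context utterances into a global memory, attend with the
    query utterance, and classify its emotion (hypothetical sketch)."""
    W, U, b, W_out = params
    h = np.zeros(DIM)
    memory = []
    for x in context_feats:               # hierarchical modelling, flattened here
        h = rnn_step(h, x, W, U, b)
        memory.append(h)
    memory = np.stack(memory)             # (CONTEXT, DIM) global memory
    attn = softmax(memory @ query_feat)   # match query against memories
    summary = attn @ memory               # contextual summary of the dialogue
    logits = W_out @ (summary + query_feat)  # combine summary with the utterance
    return softmax(logits)                # distribution over emotion classes

params = (rng.standard_normal((DIM, DIM)) * 0.1,
          rng.standard_normal((DIM, DIM)) * 0.1,
          np.zeros(DIM),
          rng.standard_normal((N_CLASSES, DIM)) * 0.1)

context = rng.standard_normal((CONTEXT, DIM))  # stand-in multimodal features
query = rng.standard_normal(DIM)
probs = icon_sketch(context, query, params)
print(probs.shape)
```

In the actual model, separate memories track self- and inter-speaker influences before being merged globally; the single recurrence above collapses that hierarchy for brevity.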
Paper Title:  ICON: Interactive Conversational Memory Network for Multimodal Emotion Detection
Author(s) Name:  Devamanyu Hazarika, Soujanya Poria, Rada Mihalcea, Erik Cambria, Roger Zimmermann
Venue:  Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing (EMNLP 2018)
Publisher name:  Association for Computational Linguistics (ACL)
Paper Link:   https://aclanthology.org/D18-1280/