EmoSen: Generating Sentiment and Emotion Controlled Responses in a Multimodal Dialogue System - 2020

Research Area:  Machine Learning

Abstract:

An essential skill for effective communication is the ability to express specific sentiment and emotion in a conversation. Any robust dialogue system should handle the combined effect of both sentiment and emotion while generating responses; this is expected to provide a better experience and concurrently increase users' satisfaction. Previous research on either emotion- or sentiment-controlled dialogue generation has shown great promise for developing next-generation conversational agents, but the simultaneous effect of both remains unexplored. Existing dialogue systems are mainly based on unimodal sources, predominantly text, and therefore cannot exploit the information present in other sources such as video, audio, and images. In this article, we first present a large-scale benchmark Sentiment Emotion aware Multimodal Dialogue (SEMD) dataset for the task of sentiment- and emotion-controlled dialogue generation. The SEMD dataset consists of 55k conversations from 10 TV shows with text, audio, and video information. To utilize multimodal information, we propose a multimodal attention-based conditional variational autoencoder (M-CVAE) that outperforms several baselines. Quantitative and qualitative analyses show that multimodality, along with contextual information, plays an essential role in generating coherent and diverse responses for any given emotion and sentiment.
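The abstract describes conditioning response generation on sentiment and emotion labels while fusing text, audio, and video through attention. The following minimal PyTorch sketch shows one way such a multimodal attention-based conditional VAE could be wired; every module name, dimension, fusion scheme, and the decoder choice here are illustrative assumptions, not the authors' exact M-CVAE architecture.

# Minimal sketch of a multimodal conditional VAE for controlled response
# generation, loosely following the M-CVAE idea in the abstract.
# All names, sizes, and the fusion scheme are illustrative assumptions.
import torch
import torch.nn as nn

class MultimodalCVAE(nn.Module):
    def __init__(self, d_text=256, d_audio=128, d_video=128,
                 d_cond=16, d_latent=64, vocab_size=10000):
        super().__init__()
        d_model = 256
        # Project each modality into a shared space for attention-based fusion.
        self.proj = nn.ModuleDict({
            "text": nn.Linear(d_text, d_model),
            "audio": nn.Linear(d_audio, d_model),
            "video": nn.Linear(d_video, d_model),
        })
        self.fusion = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)
        # Sentiment/emotion labels act as the conditioning signal c
        # (10 joint classes is an arbitrary placeholder).
        self.cond_emb = nn.Embedding(10, d_cond)
        # Recognition network q(z | x, c): mean and log-variance of the latent.
        self.to_mu = nn.Linear(d_model + d_cond, d_latent)
        self.to_logvar = nn.Linear(d_model + d_cond, d_latent)
        # Decoder p(x | z, c): a GRU over response tokens (teacher forcing).
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.decoder = nn.GRU(d_model + d_latent + d_cond, d_model, batch_first=True)
        self.out = nn.Linear(d_model, vocab_size)

    def encode(self, text, audio, video, cond):
        # Fuse modalities: text queries attend over all modality tokens.
        t = self.proj["text"](text)
        a = self.proj["audio"](audio)
        v = self.proj["video"](video)
        memory = torch.cat([t, a, v], dim=1)
        fused, _ = self.fusion(t, memory, memory)   # (B, T_text, d_model)
        ctx = fused.mean(dim=1)                     # simple pooling over time
        c = self.cond_emb(cond)
        h = torch.cat([ctx, c], dim=-1)
        return self.to_mu(h), self.to_logvar(h), c

    def forward(self, text, audio, video, cond, response_tokens):
        mu, logvar, c = self.encode(text, audio, video, cond)
        # Reparameterisation trick: z = mu + sigma * eps.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        B, T = response_tokens.shape
        emb = self.tok_emb(response_tokens)
        extra = torch.cat([z, c], dim=-1).unsqueeze(1).expand(B, T, -1)
        dec_out, _ = self.decoder(torch.cat([emb, extra], dim=-1))
        logits = self.out(dec_out)
        # ELBO terms: reconstruction loss (from logits) + KL to a standard normal prior.
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        return logits, kl

# Example usage with random inputs (batch of 2, illustrative sizes):
# model = MultimodalCVAE()
# logits, kl = model(torch.randn(2, 12, 256), torch.randn(2, 20, 128),
#                    torch.randn(2, 8, 128), torch.randint(0, 10, (2,)),
#                    torch.randint(0, 10000, (2, 15)))

In training, the logits would feed a cross-entropy reconstruction loss and the KL term is typically weighted or annealed, a common practice for avoiding posterior collapse in CVAE-based dialogue models; this schedule is an assumption here, not a detail taken from the paper.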

Keywords:  Conversational AI, natural language generation, sentiment-aware NLG, emotion-aware NLG, multimodality

Author(s) Name:  Mauajama Firdaus; Hardik Chauhan; Asif Ekbal; Pushpak Bhattacharyya

Journal name:  IEEE Transactions on Affective Computing

Publisher name:  IEEE

DOI:   10.1109/TAFFC.2020.3015491

Volume Information:  Volume 13, Issue 3, July-Sept. 2022, Pages 1555-1566