

Attentive Convolutional Gated Recurrent Network: A Contextual Model To Sentiment Analysis

Research Area:  Machine Learning

Abstract:

Considering contextual features is a key issue in sentiment analysis. Existing approaches, including convolutional neural networks (CNNs) and recurrent neural networks (RNNs), lack the ability to account for and prioritize the informative contextual features that are necessary for better sentiment interpretation. CNNs offer limited capability because they must be very deep, which can lead to vanishing gradients, whereas RNNs fall short because they process input sequences strictly sequentially. Furthermore, both approaches treat all words equally. In this paper, we propose a novel approach named attentive convolutional gated recurrent network (ACGRN) that alleviates these issues for sentiment analysis. The motivation behind ACGRN is to avoid the vanishing gradients caused by deep CNNs by applying a shallow-and-wide CNN that learns local contextual features. Then, to address the limitations of the sequential structure of RNNs and to prioritize informative contextual information, we use a novel prior-knowledge attention-based bidirectional gated recurrent unit (ATBiGRU). Prior-knowledge ATBiGRU captures global contextual features with a strong focus on the previous hidden states that carry the most valuable information for the current time step. Experimental results show that ACGRN significantly outperforms baseline models on six small and large real-world datasets for the sentiment classification task.
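The two building blocks named in the abstract can be illustrated with a minimal NumPy sketch: a single convolutional layer with several kernel widths in parallel (the "shallow-and-wide" idea, trading depth for breadth) followed by a soft-attention weighting over the resulting per-token features. This is an illustrative toy with random weights, not the authors' ACGRN implementation; the actual model uses a bidirectional GRU with prior-knowledge attention over previous hidden states.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "embedded sentence": 10 tokens, embedding dimension 8.
T, d = 10, 8
X = rng.normal(size=(T, d))

def shallow_wide_cnn(X, widths=(2, 3, 4), n_filters=4):
    """One convolutional layer with several kernel widths in parallel
    (shallow-and-wide: breadth instead of depth avoids deep-CNN
    vanishing gradients). Returns per-token local features."""
    T, d = X.shape
    outputs = []
    for w in widths:
        W = rng.normal(scale=0.1, size=(w * d, n_filters))
        # Zero-pad on the right so every token gets an output row.
        Xp = np.vstack([X, np.zeros((w - 1, d))])
        windows = np.stack([Xp[t:t + w].ravel() for t in range(T)])
        outputs.append(np.maximum(windows @ W, 0.0))   # ReLU
    return np.concatenate(outputs, axis=1)             # (T, len(widths)*n_filters)

def attend(H, q):
    """Soft attention: weight each token's features by similarity to a
    query vector and return the weighted sum plus the weights."""
    scores = H @ q
    a = np.exp(scores - scores.max())                  # stable softmax
    a /= a.sum()
    return a @ H, a

local = shallow_wide_cnn(X)                 # local contextual features
context, weights = attend(local, local[-1]) # attend w.r.t. the last step

print(local.shape)    # (10, 12)
print(weights.shape)  # (10,)
```

In the full model, the attention query would come from the current GRU hidden state rather than the last convolutional feature row, so that earlier hidden states carrying more valuable information receive higher weight.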

Keywords:  

Author(s) Name:  Olivier Habimana, Yuhua Li, Ruixuan Li, Xiwu Gu & Wenjin Yan

Journal name:  International Journal of Machine Learning and Cybernetics

Conference name:  

Publisher name:  Springer

DOI:  10.1007/s13042-020-01135-1

Volume Information:  Volume 11, pages 2637–2651 (2020)