A Fusion Model-Based Label Embedding and Self-Interaction Attention for Text Classification - 2019

Research Area:  Machine Learning

Abstract:

Text classification is a pivotal task in NLP (Natural Language Processing) and has received widespread attention recently. Most existing methods leverage the power of deep learning to improve model performance. However, these models ignore the interaction information among all the sentences in a text when generating the current text representation, which results in a partial loss of semantics. Labels play a central role in text classification, yet the attention learned from text-label pairs in the joint space of labels and words is not leveraged, leaving room for further improvement. In this paper, we propose a text classification method based on a Self-Interaction attention mechanism and label embedding. First, our method introduces BERT (Bidirectional Encoder Representations from Transformers) to extract text features. Then, the Self-Interaction attention mechanism is employed to obtain text representations containing more comprehensive semantics. Moreover, we focus on the embedding of labels and words in the joint space to achieve dual-label embedding, which further leverages the attention learned from text-label pairs. Finally, texts are classified by the classifier according to the weighted label representations. The experimental results show that our method outperforms other state-of-the-art methods in terms of classification accuracy.
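The abstract outlines a pipeline of three stages: sentence features from BERT, a self-interaction attention step over those sentence features, and a text-label attention step that classifies from weighted label embeddings. The following is a minimal PyTorch sketch of that pipeline under stated assumptions; the module name, layer sizes, pooling choices, and the use of precomputed sentence vectors in place of a full BERT encoder are illustrative assumptions, not the authors' exact architecture.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class LabelEmbeddingAttentionClassifier(nn.Module):
        """Hedged sketch of the pipeline described in the abstract:
        (1) sentence features (e.g. pooled BERT outputs) pass through a
            self-interaction attention step,
        (2) label embeddings are attended against the text representation,
        (3) the classifier operates on the weighted label representations.
        All hyper-parameters and layer choices are illustrative assumptions.
        """
        def __init__(self, hidden_dim: int, num_labels: int):
            super().__init__()
            # Learnable label embeddings in the same space as the text features.
            self.label_embed = nn.Parameter(torch.randn(num_labels, hidden_dim) * 0.02)
            # Projections used by the self-interaction attention step.
            self.query = nn.Linear(hidden_dim, hidden_dim)
            self.key = nn.Linear(hidden_dim, hidden_dim)
            self.value = nn.Linear(hidden_dim, hidden_dim)
            self.classifier = nn.Linear(hidden_dim, num_labels)

        def forward(self, sent_feats: torch.Tensor) -> torch.Tensor:
            # sent_feats: (batch, num_sentences, hidden_dim) sentence vectors,
            # e.g. pooled BERT outputs for each sentence of a document.
            q, k, v = self.query(sent_feats), self.key(sent_feats), self.value(sent_feats)
            # Self-interaction attention: every sentence attends to every other one.
            scores = torch.matmul(q, k.transpose(-1, -2)) / (q.size(-1) ** 0.5)
            text_repr = torch.matmul(F.softmax(scores, dim=-1), v).mean(dim=1)  # (batch, hidden)
            # Text-label attention in the joint space: weight each label embedding
            # by its compatibility with the text representation.
            label_scores = text_repr @ self.label_embed.t()     # (batch, num_labels)
            label_attn = F.softmax(label_scores, dim=-1)
            weighted_labels = label_attn @ self.label_embed     # (batch, hidden)
            return self.classifier(weighted_labels)             # class logits

    # Toy usage: 2 documents, 5 sentences each, 768-dim features, 4 classes.
    model = LabelEmbeddingAttentionClassifier(hidden_dim=768, num_labels=4)
    logits = model(torch.randn(2, 5, 768))
    print(logits.shape)  # torch.Size([2, 4])

In this sketch the text encoder is assumed to be external: real BERT sentence vectors (for example, [CLS] outputs per sentence) would be fed in as `sent_feats`, keeping the example self-contained and runnable with only PyTorch installed.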

Keywords:  

Author(s) Name:  Yanru Dong; Peiyu Liu; Zhenfang Zhu; Qicai Wang; Qiuyue Zhang

Journal name:  IEEE Access

Conference name:  

Publisher name:  IEEE

DOI:  10.1109/ACCESS.2019.2954985

Volume Information:  Volume 8, Page(s): 30548-30559