Hierarchical multi-attention networks for document classification - 2021

Research Area:  Machine Learning

Abstract:

Research on document classification increasingly employs attention-based deep learning algorithms and achieves impressive results. Owing to the complexity of documents, classical models, as well as single-attention mechanisms, fail to meet the demand for high-accuracy classification. This paper proposes a method that classifies documents via hierarchical multi-attention networks, which describe a document at the word-sentence level and the sentence-document level. Further, different attention strategies are applied at the two levels, enabling accurate assignment of the attention weights: a soft attention mechanism is applied at the word-sentence level, and CNN-attention at the sentence-document level. Owing to the distinctiveness of the model, the proposed method delivers the highest accuracy compared with other state-of-the-art methods. In addition, attention-weight visualizations demonstrate the effectiveness of the attention mechanism in distinguishing the importance of words and sentences.
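The two-level design described above can be illustrated with a minimal numpy sketch: soft attention pools each sentence's word states into a sentence vector, and a convolution over adjacent sentence vectors scores each sentence by its local context before pooling into a document vector. All shapes, the context vector `w`, and the convolution kernel here are illustrative assumptions, not the paper's actual parameterization.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def soft_attention(H, w):
    # Word-sentence level: H is (T, d) word hidden states, w a (d,)
    # learned context vector (assumed here; normally trained).
    alpha = softmax(H @ w)            # (T,) attention weights over words
    return alpha @ H                  # (d,) attention-weighted sentence vector

def cnn_attention(S, kernel):
    # Sentence-document level: S is (N, d) sentence vectors; a 1-D
    # convolution of width k over the sentence axis scores each sentence
    # by its local context (a sketch of "CNN-attention").
    k = kernel.shape[0]
    pad = k // 2
    Sp = np.pad(S, ((pad, pad), (0, 0)))
    scores = np.array([np.sum(Sp[i:i + k] * kernel) for i in range(S.shape[0])])
    alpha = softmax(scores)           # (N,) attention weights over sentences
    return alpha @ S                  # (d,) document vector

rng = np.random.default_rng(0)
d = 8
# Pool 4 sentences of 5 words each at the word-sentence level...
sentences = [soft_attention(rng.normal(size=(5, d)), rng.normal(size=d))
             for _ in range(4)]
S = np.stack(sentences)               # (4, d)
# ...then pool the sentence vectors at the sentence-document level.
doc = cnn_attention(S, rng.normal(size=(3, d)))
print(doc.shape)
```

In a trained model the document vector would feed a final softmax classifier; the point of the sketch is only that the two levels use different attention strategies, matching the abstract.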

Author(s) Name:  Yingren Huang, Jiaojiao Chen, Shaomin Zheng, Yun Xue & Xiaohui Hu

Journal name:  International Journal of Machine Learning and Cybernetics

Publisher name:  Springer

DOI:  10.1007/s13042-020-01260-x

Volume Information:  volume 12, pages 1639–1647 (2021)