
AttentionRank: Unsupervised Keyphrase Extraction using Self and Cross Attentions - 2021

Research Area:  Machine Learning

Abstract:

Keyword or keyphrase extraction aims to identify the words or phrases that present the main topics of a document. This paper proposes AttentionRank, a hybrid attention model, to identify keyphrases from a document in an unsupervised manner. AttentionRank calculates self-attention and cross-attention using a pre-trained language model. The self-attention is designed to determine the importance of a candidate within the context of a sentence. The cross-attention is calculated to identify the semantic relevance between a candidate and the sentences within a document. We evaluate AttentionRank on three publicly available datasets against seven baselines. The results show that AttentionRank is an effective and robust unsupervised keyphrase extraction model on both long and short documents.
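
The sketch below illustrates the two signals described in the abstract: a self-attention score for how much weight a candidate phrase receives within a sentence, and a document-level relevance score for the candidate. It is a minimal approximation, not the authors' implementation; the HuggingFace transformers API, the bert-base-uncased checkpoint, the hard-coded candidates, and the use of cosine similarity of mean-pooled embeddings as a stand-in for the paper's cross-attention computation are all assumptions made for illustration.

```python
# Minimal sketch of the two attention-based signals described in the abstract.
# Assumptions (not the paper's code): HuggingFace `transformers`, `bert-base-uncased`,
# candidates given as plain strings, and cosine similarity of mean-pooled embeddings
# as a proxy for the paper's cross-attention between a candidate and sentences.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

def self_attention_score(sentence: str, candidate: str) -> float:
    """Attention the candidate's tokens receive within one sentence (last layer, head-averaged)."""
    enc = tokenizer(sentence, return_tensors="pt", truncation=True)
    with torch.no_grad():
        out = model(**enc)
    attn = out.attentions[-1].mean(dim=1)[0]       # (seq_len, seq_len)
    received = attn.sum(dim=0)                     # total attention each token receives
    cand_ids = tokenizer(candidate, add_special_tokens=False)["input_ids"]
    ids = enc["input_ids"][0].tolist()
    score = 0.0
    for i in range(len(ids) - len(cand_ids) + 1):  # locate the candidate's token span
        if ids[i:i + len(cand_ids)] == cand_ids:
            score = max(score, received[i:i + len(cand_ids)].sum().item())
    return score

def embed(text: str) -> torch.Tensor:
    """Mean-pooled last-hidden-state embedding."""
    enc = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        out = model(**enc)
    return out.last_hidden_state[0].mean(dim=0)

def document_relevance(candidate: str, sentences: list) -> float:
    """Proxy for cross-attention: mean cosine similarity between candidate and each sentence."""
    c = embed(candidate)
    sims = [torch.cosine_similarity(c, embed(s), dim=0).item() for s in sentences]
    return sum(sims) / len(sims)

# Rank candidates by a simple combination of the two signals.
sentences = ["AttentionRank extracts keyphrases without supervision.",
             "It uses self and cross attentions from a pre-trained language model."]
candidates = ["keyphrases", "pre-trained language model", "supervision"]
scores = {c: max(self_attention_score(s, c) for s in sentences) * document_relevance(c, sentences)
          for c in candidates}
print(sorted(scores.items(), key=lambda kv: -kv[1]))
```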

Keywords:  
Keyphrase extraction
Pre-trained language model
AttentionRank
Machine Learning
Deep Learning

Author(s) Name:   Haoran Ding, Xiao Luo

Conference name:  Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing

Publisher name:  Association for Computational Linguistics

DOI:  10.18653/v1/2021.emnlp-main.146