
An abstractive text summarization technique using transformer model with self-attention mechanism - 2023


Research Paper on an Abstractive Text Summarization Technique Using Transformer Model with Self-Attention Mechanism

Research Area:  Machine Learning

Abstract:

Creating a summarized version of a text document that still conveys precise meaning is an incredibly complex endeavor in natural language processing (NLP). Abstractive text summarization (ATS) is the process of extracting facts from source sentences and merging them into concise representations while maintaining the content and intent of the text. Manually summarizing large amounts of text is challenging and time-consuming for humans, so text summarization has become an exciting research focus in NLP. This research paper proposes an ATS model using a Transformer Technique with Self-Attention Mechanism (T2SAM). The self-attention mechanism is added to the transformer to address the problem of coreference in text, which helps the system understand the text better. The proposed T2SAM model improves the performance of text summarization. It is trained on the Inshorts News dataset combined with the DUC-2004 shared-tasks dataset. The performance of the proposed model has been evaluated using ROUGE metrics, and it is shown to outperform existing state-of-the-art baseline models. The proposed model reduces the training loss from 10.3058 at the starting point to a minimum of 1.8220 over 30 epochs, and it achieves a 48.50% F1-score on both the Inshorts and DUC-2004 news datasets.
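To illustrate the self-attention mechanism that the abstract identifies as the core addition to the transformer, the sketch below implements single-head scaled dot-product self-attention in PyTorch. This is a minimal, hypothetical example of the general technique, not the authors' T2SAM code; the class name, dimensions, and toy usage are assumptions for illustration only.

```python
# Minimal sketch (assumption, not the paper's implementation): scaled
# dot-product self-attention, the transformer building block that lets each
# token attend to every other token, e.g. to link coreferent mentions.
import math
import torch
import torch.nn as nn


class SelfAttention(nn.Module):
    """Single-head scaled dot-product self-attention over a token sequence."""

    def __init__(self, d_model: int):
        super().__init__()
        # Learned projections for queries, keys, and values.
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        # Similarity of every token with every other token, scaled by
        # sqrt(d_model) to keep the softmax well-conditioned.
        scores = q @ k.transpose(-2, -1) / math.sqrt(x.size(-1))
        weights = scores.softmax(dim=-1)
        # Each output position is a weighted mix of all value vectors.
        return weights @ v


# Toy usage: a batch of 2 "documents", 16 tokens each, 64-dim embeddings.
attn = SelfAttention(d_model=64)
out = attn(torch.randn(2, 16, 64))
print(out.shape)  # torch.Size([2, 16, 64])
```

In a full encoder-decoder summarizer, blocks like this are stacked with feed-forward layers and cross-attention, and the generated summary is scored against references with ROUGE, as described in the abstract.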

Keywords:  

Author(s) Name:  Sandeep Kumar

Journal name:  Neural Computing and Applications

Conference name:  

Publisher name:  Springer

DOI:  10.1007/s00521-023-08687-7

Volume Information:  Volume 35, Pages 18603-18622 (2023)