ASRNN: a recurrent neural network with an attention model for sequence labeling - 2021

Research Area:  Machine Learning

Abstract:

Natural language processing (NLP) is useful for handling text and speech, and sequence labeling plays an important role by automatically analyzing a sequence (text) to assign category labels to each part. However, the performance of conventional models depends greatly on hand-crafted features and task-specific knowledge, which are time-consuming to engineer. Several conditional random fields (CRF)-based models for sequence labeling have been presented, but a major limitation is how to use neural networks to extract useful representations for each unit or segment in the input sequence. In this paper, we propose an attention segmental recurrent neural network (ASRNN) that relies on a hierarchical attention neural semi-Markov conditional random fields (semi-CRF) model for the task of sequence labeling. Our model uses a hierarchical structure to incorporate character-level and word-level information and applies an attention mechanism to both levels. This enables our method to differentiate more important information from less important information when constructing the segmental representation. We evaluated our model on three sequence labeling tasks, including named entity recognition (NER), chunking, and reference parsing. Experimental results show that the proposed model benefited from the hierarchical structure and achieved competitive and robust performance on all three sequence labeling tasks.
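
As a rough illustration of the hierarchical attention encoder the abstract describes, the sketch below (PyTorch; not the authors' implementation) runs a character-level BiLSTM whose attention-pooled outputs are concatenated with word embeddings and fed to a word-level BiLSTM; attention over a span of word states then yields a segmental representation of the kind a semi-CRF would score. All module names, dimensions, and the example span are hypothetical, and the semi-CRF decoding layer is omitted.

# Illustrative sketch only (assumed PyTorch; hypothetical names and sizes).
import torch
import torch.nn as nn

class AttentionPool(nn.Module):
    """Scores each timestep and returns the attention-weighted sum."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.score = nn.Linear(hidden_dim, 1)

    def forward(self, states):
        # states: (batch, seq_len, hidden_dim)
        weights = torch.softmax(self.score(states), dim=1)
        return (weights * states).sum(dim=1)  # (batch, hidden_dim)

class HierarchicalEncoder(nn.Module):
    """Char-level BiLSTM + attention feeding a word-level BiLSTM + attention."""
    def __init__(self, n_chars=100, n_words=10000, char_dim=25, word_dim=100, hidden=64):
        super().__init__()
        self.char_emb = nn.Embedding(n_chars, char_dim)
        self.char_lstm = nn.LSTM(char_dim, hidden, batch_first=True, bidirectional=True)
        self.char_attn = AttentionPool(2 * hidden)
        self.word_emb = nn.Embedding(n_words, word_dim)
        self.word_lstm = nn.LSTM(word_dim + 2 * hidden, hidden,
                                 batch_first=True, bidirectional=True)
        self.word_attn = AttentionPool(2 * hidden)

    def forward(self, char_ids, word_ids):
        # char_ids: (batch, words, chars_per_word); word_ids: (batch, words)
        b, w, c = char_ids.shape
        char_states, _ = self.char_lstm(self.char_emb(char_ids.view(b * w, c)))
        char_vecs = self.char_attn(char_states).view(b, w, -1)  # char-level attention
        words = torch.cat([self.word_emb(word_ids), char_vecs], dim=-1)
        word_states, _ = self.word_lstm(words)                  # (batch, words, 2*hidden)
        # Word-level attention over one candidate segment (here, words 2..4)
        # produces the segmental representation a semi-CRF would score.
        segment_vec = self.word_attn(word_states[:, 2:5, :])
        return word_states, segment_vec

enc = HierarchicalEncoder()
states, seg = enc(torch.randint(0, 100, (2, 7, 12)), torch.randint(0, 10000, (2, 7)))
print(states.shape, seg.shape)  # torch.Size([2, 7, 128]) torch.Size([2, 128])

In the full model, every candidate segment would receive such a representation and the semi-CRF would jointly score segment boundaries and labels; this sketch covers only the encoder side.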

Keywords:  

Author(s) Name:  Jerry Chun-Wei Lin, Yinan Shao, Youcef Djenouri, Unil Yun

Journal name:  Knowledge-Based Systems

Conference name:  

Publisher name:  Elsevier

DOI:  10.1016/j.knosys.2020.106548

Volume Information:  Volume 212, 5 January 2021, 106548