Efficient Conformer with Prob-Sparse Attention Mechanism for End-to-End Speech Recognition - 2021



Research Area:  Machine Learning

Abstract:

End-to-end models are favored in automatic speech recognition (ASR) because of their simplified system structure and superior performance. Among these models, Transformer and Conformer have achieved state-of-the-art recognition accuracy, in which self-attention plays a vital role in capturing important global information. However, the time and memory complexity of self-attention increases quadratically with the length of the sentence. In this paper, a prob-sparse self-attention mechanism is introduced into Conformer to sparsify the computation of self-attention in order to accelerate inference and reduce space consumption. Specifically, we adopt a Kullback-Leibler divergence based sparsity measurement for each query to decide whether the attention function is computed on that query. By using the prob-sparse attention mechanism, we achieve an 8% to 45% inference speed-up and a 15% to 45% memory usage reduction in the self-attention module of the Conformer Transducer while maintaining the same level of error rate.
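The query-selection idea behind prob-sparse attention can be illustrated with a short sketch. The NumPy code below is a minimal, illustrative approximation and not the authors' implementation: it ranks queries by a max-minus-mean score that serves as a proxy for the KL-divergence-based sparsity measurement, computes full softmax attention only for the top-ranked queries, and lets the remaining queries fall back to a uniform average of the values. For clarity it computes the full score matrix, whereas the actual mechanism estimates the measurement from a sampled subset of keys to obtain the speed and memory savings; the sampling factor `c` and the mean-of-values fallback are assumptions made for this sketch.

```python
# Minimal sketch of prob-sparse self-attention (illustrative, not the paper's code).
import numpy as np

def prob_sparse_attention(Q, K, V, c=5):
    """Q, K, V: arrays of shape (seq_len, d_model)."""
    L, d = Q.shape
    scores = Q @ K.T / np.sqrt(d)               # (L, L) scaled dot-product scores

    # Sparsity measurement per query: max minus mean of its score row.
    # This acts as a proxy for the KL divergence between the query's attention
    # distribution and a uniform distribution (larger => more "active" query).
    sparsity = scores.max(axis=1) - scores.mean(axis=1)

    # Keep only the u most informative queries (u = c * ln(L), capped at L).
    u = min(L, int(np.ceil(c * np.log(L))))
    top_queries = np.argsort(-sparsity)[:u]

    # Skipped ("lazy") queries fall back to the mean of V (uniform attention).
    out = np.tile(V.mean(axis=0), (L, 1))

    # Full softmax attention only for the selected queries.
    sel = scores[top_queries]                    # (u, L)
    sel = sel - sel.max(axis=1, keepdims=True)   # numerical stability
    attn = np.exp(sel)
    attn /= attn.sum(axis=1, keepdims=True)
    out[top_queries] = attn @ V
    return out

# Example usage on random features.
rng = np.random.default_rng(0)
x = rng.standard_normal((64, 32))
y = prob_sparse_attention(x, x, x)
print(y.shape)  # (64, 32)
```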

Keywords:  
automatic speech recognition
self-attention
global information
prob-sparse mechanism
space consumption
error rate

Author(s) Name:  Xiong Wang, Sining Sun, Lei Xie, Long Ma

Journal name:  arXiv preprint (subject area: Sound)

Conference name:  

Publisher name:  arXiv

DOI:  https://doi.org/10.48550/arXiv.2106.09236

Volume Information:  Volume 1