
Learning Word Embeddings: Unsupervised Methods for Fixed-size Representations of Variable-length Speech Segments - 2018

Research Area:  Machine Learning

Abstract:

Fixed-length embeddings of words are very useful for a variety of tasks in speech and language processing. Here we systematically explore two methods of computing fixed-length embeddings for variable-length sequences. We evaluate their susceptibility to phonetic and speaker-specific variability on English, a high-resource language, and Xitsonga, a low-resource language, using two evaluation metrics: ABX word discrimination and ROC-AUC on same-different phoneme n-grams. We show that a simple downsampling method supplemented with length information can outperform the variable-length input feature representation on both evaluations. Recurrent autoencoders, trained without supervision, can yield even better results at the expense of increased computational complexity.
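To illustrate the downsampling idea described in the abstract, the sketch below maps a variable-length sequence of acoustic frames to a fixed-size vector by sampling frames at evenly spaced positions and appending the segment length. This is a minimal Python/NumPy sketch under assumed MFCC-like frame features; the function name, sample count, and feature dimension are illustrative, not the authors' exact implementation.

```python
import numpy as np

def downsample_embedding(frames: np.ndarray, n_samples: int = 10,
                         include_length: bool = True) -> np.ndarray:
    """Turn a (T, d) frame sequence into a fixed-length vector by keeping
    n_samples evenly spaced frames and, optionally, appending the segment
    length as an extra feature (illustrative sketch only)."""
    T, d = frames.shape
    # Evenly spaced frame indices spanning the whole segment.
    idx = np.linspace(0, T - 1, n_samples).round().astype(int)
    emb = frames[idx].reshape(-1)        # shape: (n_samples * d,)
    if include_length:
        emb = np.append(emb, float(T))   # supplement with length information
    return emb

# Example: a 57-frame segment of 13-dimensional features -> fixed-size vector.
segment = np.random.randn(57, 13)
print(downsample_embedding(segment).shape)  # (131,) = 10 * 13 + 1
```

Any two segments, whatever their duration, are mapped to vectors of the same dimensionality, which is what makes ABX word discrimination and same-different comparisons straightforward to compute.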

Keywords:  
Learning Word Embeddings
Unsupervised
Machine Learning
Deep Learning

Author(s) Name:  Nils Holzenberger, Mingxing Du, Julien Karadayi, Rachid Riad, Emmanuel Dupoux

Journal name:  

Conference name:  Interspeech 2018

Publisher name:  HAL

DOI:  10.21437/Interspeech.2018-2364

Volume Information: