Understanding Word Embeddings and Language Models - 2020

Research Area:  Machine Learning

Abstract:

Early word embedding algorithms such as word2vec and GloVe generate a single static distributional representation for each word, regardless of the context and sense in which the word is used in a given sentence. This models ambiguous words poorly and offers no coverage for out-of-vocabulary words. Hence, a new wave of algorithms based on trained language models, such as OpenAI GPT and BERT, has been proposed to generate contextual word embeddings. These models take word constituents (subword pieces) as input, which allows them to build representations for out-of-vocabulary words by combining the pieces. Recently, fine-tuning language models pre-trained on large corpora has steadily advanced the state of the art on many NLP tasks.
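
The two properties the abstract highlights can be seen directly in a minimal sketch. The snippet below assumes the Hugging Face transformers library and the public bert-base-uncased checkpoint; neither is prescribed by the chapter, and any BERT-style model would illustrate the same points: a rare word is decomposed into word pieces, and an ambiguous word receives a different vector in each context.

```python
# A minimal illustrative sketch, assuming the Hugging Face "transformers"
# library and the public bert-base-uncased checkpoint (assumptions not
# taken from the chapter itself).
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

# 1) Out-of-vocabulary handling: an unseen word is split into known
#    word pieces (e.g. "embeddings" -> ['em', '##bed', '##ding', '##s']),
#    so the model can still compose a representation for it.
print(tokenizer.tokenize("embeddings"))

# 2) Context sensitivity: the same surface form gets a different vector
#    in each sentence, unlike a static word2vec/GloVe lookup table.
for text in ["The bank approved the loan.", "We sat on the river bank."]:
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (num_tokens, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    vec = hidden[tokens.index("bank")]
    print(text, vec[:3])  # first few dimensions differ across contexts
```

The two printed vectors for "bank" differ, which is exactly the disambiguation that a single static lookup table cannot provide.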

Author(s) Name:  José Manuel Gómez-Pérez, Ronald Denaux, Andrés García-Silva

Book title:  A Practical Guide to Hybrid Natural Language Processing

Publisher name:  Springer Nature Switzerland

DOI:  https://doi.org/10.1007/978-3-030-44830-1_3

Volume Information:  pp. 17-31