
A Primer in BERTology: What We Know About How BERT Works - 2020

Research Area:  Machine Learning

Abstract:

Transformer-based models have pushed the state of the art in many areas of NLP, but our understanding of what is behind their success is still limited. This paper is the first survey of over 150 studies of the popular BERT model. We review the current state of knowledge about how BERT works, what kind of information it learns and how it is represented, common modifications to its training objectives and architecture, the overparameterization issue, and approaches to compression. We then outline directions for future research.

Author(s) Name:  Anna Rogers, Olga Kovaleva, Anna Rumshisky

Journal name:  Transactions of the Association for Computational Linguistics

Conference name:  

Publisher name:  MIT Press

DOI:  10.1162/tacl_a_00349

Volume Information:  Volume 8, Pages: 842–866.