
Efficient Transformers: A Survey - 2020

Research Area:  Machine Learning

Abstract:

Transformer model architectures have garnered immense interest lately due to their effectiveness across a range of domains such as language, vision, and reinforcement learning. In the field of natural language processing, for example, Transformers have become an indispensable staple of the modern deep learning stack. Recently, a dizzying number of "X-former" models have been proposed - Reformer, Linformer, Performer, Longformer, to name a few - which improve upon the original Transformer architecture, many of which target computational and memory efficiency. With the aim of helping the avid researcher navigate this flurry, this paper characterizes a large and thoughtful selection of recent efficiency-flavored "X-former" models, providing an organized and comprehensive overview of existing work and models across multiple domains.
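The efficiency concern the abstract refers to is the quadratic cost of standard self-attention: the score matrix is n x n in sequence length n. A minimal NumPy sketch below contrasts vanilla softmax attention with a kernelized "linear" attention of the kind used by models such as Performer and Linear Transformers; the `phi` feature map here (ReLU plus a small constant) is an illustrative stand-in, not the exact map any one paper uses.

```python
import numpy as np

def softmax_attention(Q, K, V):
    # Standard scaled dot-product attention: materializes an (n, n)
    # score matrix, so time and memory grow quadratically with n.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                      # (n, n)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def linear_attention(Q, K, V, phi=lambda x: np.maximum(x, 0.0) + 1e-6):
    # Kernelized attention: a positive feature map phi replaces softmax,
    # letting us compute K^T V first. The intermediate is (d, d),
    # independent of sequence length, so cost is linear in n.
    Qp, Kp = phi(Q), phi(K)                            # (n, d) each
    KV = Kp.T @ V                                      # (d, d)
    Z = Qp @ Kp.sum(axis=0, keepdims=True).T           # (n, 1) normalizer
    return (Qp @ KV) / Z

rng = np.random.default_rng(0)
n, d = 8, 4
Q, K, V = rng.normal(size=(3, n, d))
out_soft = softmax_attention(Q, K, V)
out_lin = linear_attention(Q, K, V)
assert out_soft.shape == out_lin.shape == (n, d)
```

The two functions are not numerically equivalent (the kernel only approximates softmax behavior), but the shapes and the asymptotic costs illustrate the trade-off the surveyed "X-former" models exploit.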

Author(s) Name:  Yi Tay, Mostafa Dehghani, Dara Bahri, Donald Metzler

Journal name:  Computer Science

Publisher name:  arXiv (arXiv:2009.06732)

DOI:  10.48550/arXiv.2009.06732