Research Area:  Machine Learning
Recent work in natural language processing (NLP) has yielded appealing results from scaling model parameters and training data; however, relying on scale alone to improve performance means that resource consumption also grows. Such resources include data, time, storage, and energy, all of which are naturally limited and unevenly distributed. This motivates research into efficient methods that require fewer resources to achieve similar results. This survey synthesizes and relates current methods and findings in efficient NLP. We aim both to provide guidance for conducting NLP under limited resources and to point toward promising research directions for developing more efficient methods.
Keywords:  
Natural language processing
Data
Time
Storage
Energy
Author(s) Name:  Marcos Treviso, Ji-Ung Lee, Tianchu Ji, Betty van Aken, Qingqing Cao, Manuel R. Ciosici, Michael Hassid, Kenneth Heafield, Sara Hooker, Colin Raffel, Pedro H. Martins, André F. T. Martins, Jessica Zosa Forde, Peter Milder, Edwin Simpson, Noam Slonim, Jesse Dodge, Emma Strubell, Niranjan Balasubramanian, Leon Derczynski, Iryna Gurevych, Roy Schwartz
Journal name:  Transactions of the Association for Computational Linguistics
Conference name:  
Publisher name:  MIT Press
DOI:  10.1162/tacl_a_00577
Volume Information:  Volume 11
Paper Link:  https://direct.mit.edu/tacl/article/doi/10.1162/tacl_a_00577/116725