Research Area:  Machine Learning
Abstract:  Effectively scaling large Transformer models has been a key driver of recent advances in natural language processing. Dynamic neural networks, an emerging research direction, can scale up model capacity with sub-linear increases in computation and time by adjusting their computational path based on the input. They are a promising answer to the ever-growing parameter counts of pretrained language models, enabling both pretraining with trillions of parameters and faster inference on mobile devices. In this survey, we summarize the progress of three types of dynamic neural networks in NLP: skimming, mixture of experts, and early exit. We also highlight current challenges in dynamic neural networks and directions for future research.
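To make "adjusting the computational path based on the input" concrete, below is a minimal sketch of early exit, one of the three families the survey covers. It is an illustrative assumption, not the survey's own implementation: the class name `EarlyExitEncoder`, the 0.9 confidence threshold, and the layer sizes are all hypothetical choices. Easy inputs stop at a shallow layer; hard inputs use the full stack.

```python
# Minimal early-exit sketch (assumed example, not from the surveyed papers).
import torch
import torch.nn as nn

class EarlyExitEncoder(nn.Module):
    def __init__(self, dim=128, num_layers=4, num_classes=2, threshold=0.9):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
            for _ in range(num_layers)
        )
        # One lightweight classifier ("exit head") attached after each layer.
        self.exits = nn.ModuleList(
            nn.Linear(dim, num_classes) for _ in range(num_layers)
        )
        self.threshold = threshold

    @torch.no_grad()
    def forward(self, x):
        # x: (batch=1, seq_len, dim). The depth actually executed now
        # depends on the input, which is what makes the network "dynamic".
        for i, (layer, exit_head) in enumerate(zip(self.layers, self.exits)):
            x = layer(x)
            logits = exit_head(x.mean(dim=1))        # pool tokens, classify
            confidence = logits.softmax(dim=-1).max().item()
            if confidence >= self.threshold:         # confident: stop early
                return logits, i + 1                 # layers actually used
        return logits, len(self.layers)

model = EarlyExitEncoder().eval()
logits, layers_used = model(torch.randn(1, 16, 128))
print(f"exited after {layers_used} layer(s)")
```

Skimming and mixture of experts follow the same principle at different granularities: skimming skips tokens rather than layers, and mixture of experts routes each input to a small subset of expert sub-networks.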
Author(s) Name:  Canwen Xu and Julian McAuley
Journal name:  arXiv-CS
Publisher name:  arXiv
Volume Information:  arXiv:2202.07101
Paper Link:  https://arxiv.org/abs/2202.07101