Research Area:  Machine Learning
Over the past half-decade, many methods have been considered for neural architecture search (NAS). Bayesian optimization (BO), which has long had success in hyperparameter optimization, has recently emerged as a very promising strategy for NAS when it is coupled with a neural predictor. Recent work has proposed different instantiations of this framework, for example, using Bayesian neural networks or graph convolutional networks as the predictive model within BO. However, the analyses in these papers often focus on the full-fledged NAS algorithm, so it is difficult to tell which individual components of the framework lead to the best performance. In this work, we give a thorough analysis of the "BO + neural predictor" framework by identifying five main components: the architecture encoding, neural predictor, uncertainty calibration method, acquisition function, and acquisition function optimization. We test several different methods for each component and also develop a novel path-based encoding scheme for neural architectures, which we show, theoretically and empirically, scales better than other encodings. Using all of our analyses, we develop a final algorithm called BANANAS, which achieves state-of-the-art performance on NAS search spaces.
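To make the path-based encoding concrete, below is a minimal sketch assuming a NAS-Bench-101-style cell: a DAG whose intermediate nodes carry operations, with the architecture featurized as a binary vector indicating which input-to-output operation sequences (paths) are present. The cell representation, operation list, and function names are illustrative assumptions, not the authors' implementation.

```python
from itertools import product

# Illustrative operation set (NAS-Bench-101 uses a similar triple).
OPS = ["conv1x1", "conv3x3", "maxpool3x3"]

def enumerate_paths(adj, node_ops, node, target):
    """Return the operation sequence of every path from `node` to `target`."""
    if node == target:
        return [()]
    paths = []
    for nxt in adj[node]:
        for rest in enumerate_paths(adj, node_ops, nxt, target):
            # The output node carries no operation; intermediate nodes do.
            label = () if nxt == target else (node_ops[nxt],)
            paths.append(label + rest)
    return paths

def path_encode(adj, node_ops, input_node, output_node, max_len):
    """Binary vector indexed by every op sequence of length 0..max_len."""
    all_seqs = [s for l in range(max_len + 1) for s in product(OPS, repeat=l)]
    index = {s: i for i, s in enumerate(all_seqs)}
    vec = [0] * len(all_seqs)
    for path in enumerate_paths(adj, node_ops, input_node, output_node):
        vec[index[path]] = 1
    return vec

# Tiny cell: input(0) -> conv3x3(1) -> output(2), plus a skip edge 0 -> 2.
adj = {0: [1, 2], 1: [2], 2: []}
node_ops = {1: "conv3x3"}
encoding = path_encode(adj, node_ops, 0, 2, max_len=2)
print(len(encoding), sum(encoding))  # 13-dimensional vector, 2 paths present
```

The full index space grows as 1 + q + q^2 + ... + q^L for q = |OPS| and maximum path length L; the scalability claim in the abstract concerns keeping this encoding tractable (e.g., by truncating to a subset of paths), which this sketch does not attempt.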
Keywords:  Hyperparameter Tuning, Algorithm Configuration, BANANAS, Bayesian optimization, Neural architecture search
Author(s) Name:  Colin White, Willie Neiswanger, Yash Savani
Journal name:  
Conference name:  Proceedings of the AAAI Conference on Artificial Intelligence
Publisher name:  AAAI
DOI:  10.1609/aaai.v35i12.17233
Volume Information:  Volume 35, Issue 12
Paper Link:  https://ojs.aaai.org/index.php/AAAI/article/view/17233