Efficient Neural Architecture Search via Proximal Iterations - 2020

Research Area:  Machine Learning

Abstract:

Neural architecture search (NAS) attracts much research attention because of its ability to identify architectures better than handcrafted ones. Recently, differentiable search methods have become the state of the art in NAS, obtaining high-performance architectures within several days. However, they still suffer from high computation costs and inferior performance due to the construction of the supernet. In this paper, we propose an efficient NAS method based on proximal iterations, denoted NASP. Unlike previous work, NASP reformulates the search process as an optimization problem with a discrete constraint on the architecture and a regularizer on model complexity. Because the new objective is hard to solve directly, we further propose an efficient algorithm inspired by proximal iterations. As a result, NASP is not only much faster than existing differentiable search methods but can also find better architectures while balancing model complexity. Finally, extensive experiments on various tasks demonstrate that NASP obtains high-performance architectures with a more than tenfold speedup over the state of the art.
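
Illustrative sketch: since this page carries only the abstract, the short program below sketches the proximal-iteration idea it describes. A continuous copy of the architecture parameters is projected onto the discrete constraint set (one operation per edge) by a proximal step; the loss, plus a complexity regularizer, is evaluated at that discrete point; and the gradient update is applied back to the continuous copy. The toy loss, the per-operation costs, the dimensions, and all names here are assumptions made for illustration, not the authors' implementation or benchmark setup.

import numpy as np

rng = np.random.default_rng(0)

N_EDGES, N_OPS = 4, 3   # hypothetical toy supernet: 4 edges, 3 candidate ops per edge
ETA = 0.1               # weight of the model-complexity regularizer (assumed value)
LR = 0.5                # step size for the architecture parameters (assumed value)

op_cost = np.array([0.0, 1.0, 2.0])   # assumed per-operation complexity (e.g. FLOPs)
target = rng.normal(size=N_EDGES)     # stand-in for a validation signal

def prox_discrete(A):
    # Proximal step onto the discrete constraint set: clip to [0, 1],
    # then keep exactly one operation (a one-hot row) per edge.
    A = np.clip(A, 0.0, 1.0)
    onehot = np.zeros_like(A)
    onehot[np.arange(A.shape[0]), A.argmax(axis=1)] = 1.0
    return onehot

def loss_and_grad(A_disc):
    # Toy quadratic loss evaluated at the discretized architecture, plus
    # ETA times the summed cost of the selected operations.
    pred = A_disc @ np.arange(N_OPS, dtype=float)   # stand-in for network output
    resid = pred - target
    loss = 0.5 * (resid ** 2).sum() + ETA * (A_disc * op_cost).sum()
    grad = resid[:, None] * np.arange(N_OPS, dtype=float)[None, :] + ETA * op_cost
    return loss, grad

# Continuous architecture parameters; gradients are taken at the *discrete*
# point returned by the proximal step, so only the selected operations are
# active at each iteration (the efficiency argument the abstract makes).
A_cont = rng.uniform(size=(N_EDGES, N_OPS))
for step in range(50):
    A_disc = prox_discrete(A_cont)   # project onto a one-hot architecture
    _, grad = loss_and_grad(A_disc)
    A_cont -= LR * grad              # gradient step on the continuous copy

print("selected op per edge:", prox_discrete(A_cont).argmax(axis=1))

Keeping a separate continuous copy lets the discrete constraint be enforced exactly at every step while the update itself remains a plain gradient step, which is the appeal of proximal iterations for this kind of constrained search.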

Keywords:  Neural architecture search, NASP, proximal iterations, NAS method, model complexity

Author(s) Name:  Quanming Yao, Ju Xu, Wei-Wei Tu, Zhanxing Zhu

Conference name:  Proceedings of the AAAI Conference on Artificial Intelligence

Publisher name:  Association for the Advancement of Artificial Intelligence

DOI:  10.1609/aaai.v34i04.6143

Volume Information:  Volume 34, Issue 04