MNGNAS: Distilling Adaptive Combination of Multiple Searched Networks for One-Shot Neural Architecture Search - 2023


Research Paper on MNGNAS: Distilling Adaptive Combination of Multiple Searched Networks for One-Shot Neural Architecture Search

Research Area:  Machine Learning

Abstract:

Recently, neural architecture search (NAS) has attracted great interest in academia and industry. It remains a challenging problem due to the huge search space and computational costs. Recent studies in NAS have mainly focused on using weight sharing to train a SuperNet once. However, the corresponding branch of each subnetwork is not guaranteed to be fully trained. This may not only incur huge computational costs but also distort the architecture ranking in the retraining procedure. We propose a multi-teacher-guided NAS that uses an adaptive ensemble and a perturbation-aware knowledge distillation algorithm within a one-shot NAS framework. An optimization method that seeks the optimal descent directions is used to obtain adaptive coefficients for the feature maps of the combined teacher model. In addition, we propose a specific knowledge distillation process for the optimal architectures and perturbed ones in each search step, so that better feature maps are learned for later distillation procedures. Comprehensive experiments verify that our approach is flexible and effective. We show improvements in precision and search efficiency on standard recognition datasets, and improved correlation between the accuracy estimated by the search algorithm and the true accuracy on NAS benchmark datasets.
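To illustrate the adaptive-ensemble idea described in the abstract, the sketch below combines several teacher feature maps with adaptive coefficients before distilling into a student. Note this is a minimal, hypothetical simplification: the paper derives its coefficients from an optimal-descent-direction solver, whereas here they are stood in for by a softmax over each teacher's negative distance to the student; the function names and the MSE-based loss are illustrative assumptions, not the authors' implementation.

```python
import math

def mse(a, b):
    """Mean squared error between two flat feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def adaptive_distillation_loss(student, teachers, temperature=1.0):
    """Distill a student against an adaptively combined teacher.

    Hypothetical stand-in for the paper's method: coefficients are a
    softmax over negative per-teacher MSE (closer teachers get more
    weight), the weighted teacher maps are summed into one combined
    teacher, and the loss is the MSE between student and combination.
    """
    errs = [mse(student, t) for t in teachers]
    coeffs = softmax([-e / temperature for e in errs])
    combined = [sum(c * t[i] for c, t in zip(coeffs, teachers))
                for i in range(len(student))]
    return mse(student, combined), coeffs
```

Because the coefficients favor teachers whose feature maps already agree with the student, the combined teacher is never a worse distillation target than the single farthest teacher; raising `temperature` flattens the weighting toward a plain average of all teachers.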

Keywords:  

Author(s) Name:  Zhihua Chen, Guhao Qiu, Ping Li, Lei Zhu, Xiaokang Yang, Bin Shen

Journal name:  IEEE Transactions on Pattern Analysis and Machine Intelligence

Conference name:  

Publisher name:  IEEE

DOI:  10.1109/TPAMI.2023.3293885

Volume Information:  Volume 45, Pages 13489-13508, 2023