Research Area:  Machine Learning
With the growing deployment of deep neural networks in settings such as mobile devices and autonomous cars, there is an increasing demand for neural architecture search (NAS) to automatically design powerful network architectures. NAS is more reasonably formulated as a multi-objective optimization problem: in addition to prediction performance, multi-objective NAS (MONAS) takes into account further criteria such as the number of parameters and inference latency. Multi-objective evolutionary algorithms (MOEAs) are the preferred approach for tackling MONAS due to their effectiveness on multi-objective optimization problems. Recently, local search-based NAS algorithms have demonstrated their efficiency over MOEAs on MONAS problems; however, their performance has only been verified on bi-objective NAS problems. In this article, we propose LOMONAS, a local search algorithm for multi-objective NAS that provides an efficient framework for solving not only bi-objective NAS problems but also NAS problems with more than two objectives. We additionally present a parameter-less version, IMS-LOMONAS, which combines LOMONAS with the Interleaved Multi-start Scheme (IMS) so that NAS practitioners can avoid manually setting control parameters. Experimental results on a series of benchmark problems from the CEC'23 Competition demonstrate the competitiveness of LOMONAS and IMS-LOMONAS compared with MOEAs in tackling MONAS within both small-scale and large-scale search spaces.
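The abstract does not include the algorithm itself, but the Pareto-dominance-based local search idea it builds on can be made concrete. Below is a minimal, self-contained sketch of a generic Pareto local search over a toy NAS encoding; the tuple encoding, the one-change `neighbors` move operator, and the random surrogate `evaluate` are illustrative assumptions, not the paper's LOMONAS, its archive-handling rules, or its CEC'23 benchmarks.

```python
import random

# Toy encoding (an assumption for illustration): an architecture is a
# tuple of categorical operation choices, one per layer.
NUM_LAYERS, NUM_OPS = 5, 4

def random_arch(rng):
    return tuple(rng.randrange(NUM_OPS) for _ in range(NUM_LAYERS))

def neighbors(arch):
    """All architectures differing from `arch` in exactly one choice."""
    for i in range(NUM_LAYERS):
        for op in range(NUM_OPS):
            if op != arch[i]:
                yield arch[:i] + (op,) + arch[i + 1:]

def evaluate(arch):
    """Surrogate objective vector (error, #params, latency), all minimized.
    A real MONAS study would query a tabular NAS benchmark instead."""
    rng = random.Random(hash(arch))  # deterministic per architecture
    return tuple(rng.random() for _ in range(3))

def dominates(f, g):
    """Pareto dominance (minimization): f is no worse in every objective
    and strictly better in at least one."""
    return all(a <= b for a, b in zip(f, g)) and any(a < b for a, b in zip(f, g))

def update_archive(archive, arch, f):
    """Insert (arch, f) if non-dominated; drop members it dominates.
    Returns True when the newcomer survives."""
    if any(dominates(g, f) for _, g in archive):
        return False
    archive[:] = [(a, g) for a, g in archive if not dominates(f, g)]
    archive.append((arch, f))
    return True

def pareto_local_search(max_evals=2000, seed=0):
    rng = random.Random(seed)
    start = random_arch(rng)
    archive, visited = [], {start}
    update_archive(archive, start, evaluate(start))
    frontier, evals = [start], 1
    while frontier and evals < max_evals:
        current = frontier.pop()
        for nb in neighbors(current):
            if nb in visited or evals >= max_evals:
                continue
            visited.add(nb)
            evals += 1
            if update_archive(archive, nb, evaluate(nb)):
                frontier.append(nb)  # explore newly non-dominated points
    return archive

if __name__ == "__main__":
    for arch, objs in pareto_local_search():
        print(arch, [round(o, 3) for o in objs])
```

This naive variant explores the neighborhood of every archived solution; the paper's LOMONAS adds its own rules for choosing which archive members to explore, and IMS-LOMONAS restarts such searches in an interleaved fashion to remove the need for hand-tuned control parameters.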
Keywords:  
Author(s) Name:  Quan Minh Phan, Ngoc Hoang Luong
Journal name:  Swarm and Evolutionary Computation
Conference name:  
Publisher name:  Elsevier
DOI:  10.1016/j.swevo.2024.101573
Volume Information:  Volume 87 (2024), Article 101573
Paper Link:  https://www.sciencedirect.com/science/article/abs/pii/S2210650224001111