Research Area:  Machine Learning
Machine learning algorithms are widely used across many application areas. To apply a machine learning model to different problems, its hyper-parameters must be tuned, and selecting the best hyper-parameter configuration has a direct impact on the model's performance. Doing so often requires deep knowledge of machine learning algorithms and of appropriate hyper-parameter optimization techniques. Although several automatic optimization techniques exist, they have different strengths and drawbacks when applied to different types of problems. In this paper, optimizing the hyper-parameters of common machine learning models is studied. We introduce several state-of-the-art optimization techniques and discuss how to apply them to machine learning algorithms. We also survey many available libraries and frameworks developed for hyper-parameter optimization and discuss open challenges in hyper-parameter optimization research. Moreover, experiments are conducted on benchmark datasets to compare the performance of different optimization methods and to provide practical examples of hyper-parameter optimization. This survey will help industrial users, data analysts, and researchers better develop machine learning models by identifying proper hyper-parameter configurations effectively.
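To make the abstract's subject concrete, the sketch below illustrates one of the simplest automatic hyper-parameter optimization techniques the survey covers, random search. The objective function here is a toy stand-in (not from the paper): a real workflow would train a model with each sampled configuration and measure its validation loss. The search-space bounds and trial count are illustrative assumptions.

```python
import random

def validation_loss(lr, reg):
    # Toy surrogate for a model's validation loss; in practice this would
    # train and evaluate a model with the given hyper-parameters.
    return (lr - 0.1) ** 2 + (reg - 0.01) ** 2

def random_search(n_trials=200, seed=0):
    """Sample hyper-parameter configurations uniformly at random
    from the (assumed) search space and keep the best one found."""
    rng = random.Random(seed)
    best_cfg, best_loss = None, float("inf")
    for _ in range(n_trials):
        cfg = {"lr": rng.uniform(1e-4, 1.0),   # learning rate range (assumed)
               "reg": rng.uniform(1e-4, 0.1)}  # regularization range (assumed)
        loss = validation_loss(**cfg)
        if loss < best_loss:
            best_cfg, best_loss = cfg, loss
    return best_cfg, best_loss

best_cfg, best_loss = random_search()
```

More sophisticated methods surveyed in the paper (e.g., Bayesian optimization) replace the uniform sampling step with a model-guided choice of the next configuration, but the evaluate-and-keep-best loop stays the same.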
Keywords:  
Author(s) Name:  Li Yang, Abdallah Shami
Journal name:  Neurocomputing
Conference name:  
Publisher name:  Elsevier
DOI:  10.1016/j.neucom.2020.07.061
Volume Information:  Volume 415, 20 November 2020, Pages 295-316
Paper Link:   https://www.sciencedirect.com/science/article/abs/pii/S0925231220311693