Research Area:  Metaheuristic Computing
In this study, a novel metaheuristic optimization algorithm, the gradient-based optimizer (GBO), is proposed. Inspired by the gradient-based Newton's method, the GBO uses two main operators, the gradient search rule (GSR) and the local escaping operator (LEO), together with a set of vectors to explore the search space. The GSR employs the gradient-based method to enhance the exploration tendency and accelerate convergence toward better positions in the search space, while the LEO enables the GBO to escape from local optima. The performance of the new algorithm was evaluated in two phases: 28 mathematical test functions were first used to evaluate various characteristics of the GBO, and then six engineering problems were optimized by the GBO. In the first phase, the GBO was compared with five existing optimization algorithms and yielded very promising results owing to its enhanced capabilities of exploration, exploitation, convergence, and effective avoidance of local optima. The second phase also demonstrated the superior performance of the GBO in solving complex real-world engineering problems.
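To make the abstract's two-operator idea concrete, the sketch below pairs a Newton-style update (playing the role of a gradient-search move) with an occasional random jump (playing the role of a local-escape move). This is an illustrative toy under assumed names (`sketch_gbo_like_minimize`, `newton_step`, `escape_prob`), not the paper's actual GBO equations, which are defined in the full text.

```python
import random

def newton_step(grad, hess, x):
    # Classic Newton update for a 1-D function: x_new = x - f'(x) / f''(x)
    return x - grad(x) / hess(x)

def sketch_gbo_like_minimize(f, grad, hess, x0, iters=50, escape_prob=0.2, seed=0):
    """Illustrative sketch only: a Newton-style descent with a random
    'escape' jump, loosely mirroring the GSR/LEO split described in the
    abstract. NOT the authors' GBO algorithm."""
    rng = random.Random(seed)
    x, best = x0, x0
    for _ in range(iters):
        x = newton_step(grad, hess, x)     # gradient-search-like move (GSR role)
        if rng.random() < escape_prob:     # random jump (LEO role, hypothetical form)
            x += rng.uniform(-1.0, 1.0)
        if f(x) < f(best):                 # keep the best solution seen so far
            best = x
    return best

# Usage: minimize f(x) = (x - 3)^2, whose minimum is at x = 3
f = lambda x: (x - 3.0) ** 2
g = lambda x: 2.0 * (x - 3.0)
h = lambda x: 2.0
x_star = sketch_gbo_like_minimize(f, g, h, x0=10.0)
```

On this quadratic, the Newton move alone lands on the minimizer in one step; the random jumps only matter on multimodal functions, which is where an escape operator earns its keep.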
Keywords:  Gradient-based optimizer, metaheuristic optimization algorithm, local escaping operator (LEO), gradient search rule (GSR)
Author(s) Name:  Iman Ahmadianfar, Omid Bozorg-Haddad, Xuefeng Chu
Journal name:  Information Sciences
Conference name:  
Publisher name:  Elsevier
DOI:  10.1016/j.ins.2020.06.037
Volume Information:  Volume 540, November 2020, Pages 131-159
Paper Link:   https://www.sciencedirect.com/science/article/abs/pii/S0020025520306241