A stochastic method for minimizing functions with many minima

Hong Ye1, Zhiping Lin1
1School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore

Abstract

An efficient stochastic method for continuous optimization problems is presented. By combining a novel global search with typical local optimization methods, the proposed method targets hard optimization problems such as minimizing multimodal or ill-conditioned unimodal objective functions. Extensive numerical studies show that, starting from a random initial point, the proposed method consistently finds the globally optimal solution. Computational results, compared with those of other global optimization algorithms, clearly illustrate the efficiency and accuracy of the method. Since traditional supervised neural-network training can be formulated as a continuous optimization problem, the proposed method is also applicable to neural-network learning.
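The abstract does not spell out the algorithm's internals, so the following is only a minimal sketch of the generic pattern it alludes to, stochastic global exploration combined with local refinement, illustrated as a random multistart scheme on the multimodal Rastrigin test function. The names rastrigin and stochastic_multistart, and all parameter choices, are illustrative assumptions rather than the authors' method.

```python
# A minimal sketch of a generic stochastic global-plus-local scheme
# (random multistart with local refinement); NOT the paper's specific
# algorithm, only the general pattern the abstract describes.
import numpy as np
from scipy.optimize import minimize


def rastrigin(x):
    """Multimodal test function; global minimum 0 at x = 0."""
    x = np.asarray(x)
    return 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))


def stochastic_multistart(fun, dim, n_starts=50, bounds=(-5.12, 5.12), seed=0):
    """Draw random starting points (global, stochastic search), refine each
    with a local optimizer, and keep the best local minimum found."""
    rng = np.random.default_rng(seed)
    best_x, best_f = None, np.inf
    for _ in range(n_starts):
        x0 = rng.uniform(bounds[0], bounds[1], size=dim)
        res = minimize(fun, x0, method="BFGS")  # typical local optimization step
        if res.fun < best_f:
            best_x, best_f = res.x, res.fun
    return best_x, best_f


if __name__ == "__main__":
    x_best, f_best = stochastic_multistart(rastrigin, dim=5)
    print(f"best value found: {f_best:.6f} at {np.round(x_best, 4)}")
```

With enough random restarts this simple scheme usually escapes the many local minima of the Rastrigin function; the paper's contribution is a more efficient global search than plain random restarts, which this sketch does not attempt to reproduce.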

Keywords

Stochastic processes, Optimization methods, Minimization methods, Newton method, Least squares methods, Recursive estimation, Computational modeling, Simulated annealing, Genetics, Algorithm design and analysis
