A modified BFGS algorithm based on a hybrid secant equation
Abstract
By taking a convex combination of the modified secant equations proposed by Yuan and by Wei et al., a hybrid secant equation, and with it a modified BFGS algorithm, is proposed. The hybridization parameter is computed effectively from the information available at recent iterations. Under proper conditions, the proposed algorithm is shown to be globally, locally, and superlinearly convergent. Using the performance profile introduced by Dolan and Moré, the implementation of the proposed algorithm is compared with two efficient modified BFGS algorithms proposed by Yuan and by Wei et al. on a set of unconstrained optimization test problems from the CUTEr collection. Numerical results demonstrating the efficiency of the proposed modified BFGS algorithm are reported.
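The idea of a hybrid secant equation can be illustrated with a short sketch. The code below is NOT the paper's exact method: it assumes one common form of Wei et al.'s modification (ȳ = y + (θ/‖s‖²)s with θ = 2(f_k − f_{k+1}) + (g_k + g_{k+1})ᵀs) and a Yuan-style scaling of y, combines them convexly with a user-supplied parameter λ (the paper instead computes the hybridization parameter from recent iterations), and plugs the hybrid vector into the standard BFGS update of the Hessian approximation.

```python
import numpy as np

def hybrid_bfgs_update(B, s, y, f_k, f_k1, g_k, g_k1, lam=0.5):
    """One BFGS update of the Hessian approximation B, using a hybrid
    modified secant vector. Illustrative sketch only: lam plays the role
    of the hybridization parameter and is fixed here, not computed as in
    the paper."""
    # Wei et al.-style modified vector (assumed form): y + (theta/||s||^2) s
    theta = 2.0 * (f_k - f_k1) + (g_k + g_k1) @ s
    y_wei = y + (theta / (s @ s)) * s
    # Yuan-style modification (assumed form): scale y by t_k
    t = 2.0 * (f_k - f_k1 + g_k1 @ s) / (s @ y)
    y_yuan = t * y
    # Convex combination gives the hybrid secant vector
    y_hyb = lam * y_yuan + (1.0 - lam) * y_wei
    # Standard BFGS update with y replaced by the hybrid vector; skip the
    # update if the curvature condition s^T y_hyb > 0 fails, so that B
    # stays positive definite
    if s @ y_hyb <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y_hyb):
        return B
    Bs = B @ s
    return (B - np.outer(Bs, Bs) / (s @ Bs)
              + np.outer(y_hyb, y_hyb) / (s @ y_hyb))
```

For a quadratic objective f(x) = ½xᵀAx both modifications reduce to the ordinary secant vector y = As, so the updated matrix satisfies the secant equation B₊s = y exactly; on general nonlinear functions the hybrid vector incorporates function-value information that the plain BFGS update ignores.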
References
Amini K, Ghorbani Rizi A. A new structured quasi-Newton algorithm using partial information on Hessian. J Comput Appl Math, 2010, 234: 805–811
Babaie-Kafaki S, Ghanbari R, Mahdavi-Amiri N. Two new conjugate gradient methods based on modified secant equations. J Comput Appl Math, 2010, 234: 1374–1386
Broyden C G. The convergence of a class of double-rank minimization algorithms. II. The new algorithm. J Inst Math Appl, 1970, 6: 222–231
Broyden C G, Dennis Jr J E, Moré J J. On the local and superlinear convergence of quasi-Newton methods. J Inst Math Appl, 1973, 12: 223–245
Byrd R H, Nocedal J, Yuan Y X. Global convergence of a class of quasi-Newton methods on convex problems. SIAM J Numer Anal, 1987, 24: 1171–1190
Chen L H, Deng N Y, Zhang J Z. A modified quasi-Newton method for structured optimization with partial information on the Hessian. Comput Optim Appl, 2006, 35: 5–18
Davidon W C. Conic approximations and collinear scalings for optimizers. SIAM J Numer Anal, 1980, 17: 268–281
Dennis Jr J E, Martínez H J, Tapia R A. Convergence theory for the structured BFGS secant method with an application to nonlinear least squares. J Optim Theory Appl, 1989, 61: 161–178
Dolan E D, Moré J J. Benchmarking optimization software with performance profiles. Math Program, 2002, 91: 201–213
Fletcher R. A new approach to variable metric algorithms. Comput J, 1970, 13: 317–322
Ford J A. A survey of multi-step quasi-Newton methods. In: Proceedings of the International Conference on Scientific Computations (Beirut, Lebanon), 1999
Ford J A, Moghrabi I A. Multi-step quasi-Newton methods for optimization. J Comput Appl Math, 1994, 50: 305–323
Ford J A, Moghrabi I A. Using function-values in multi-step quasi-Newton methods. J Comput Appl Math, 1996, 66: 201–211
Goldfarb D. A family of variable-metric methods derived by variational means. Math Comp, 1970, 24: 23–26
Gould N I M, Orban D, Toint P L. CUTEr: a constrained and unconstrained testing environment, revisited. ACM Trans Math Softw, 2003, 29: 373–394
Guo Q, Liu J G, Wang D H. A modified BFGS method and its superlinear convergence in nonconvex minimization with general line search rule. J Appl Math Comput, 2008, 28: 435–446
Huang H Y. Unified approach to quadratically convergent algorithms for function minimization. J Optim Theory Appl, 1970, 5: 405–423
Li D H, Fukushima M. A modified BFGS method and its global convergence in nonconvex minimization. J Comput Appl Math, 2001, 129: 15–35
Nocedal J, Wright S J. Numerical Optimization, 2nd ed. New York: Springer, 2006
Powell M J D. Some global convergence properties of a variable metric algorithm for minimization without exact line searches. In: Cottle R W, Lemke C E, eds. Nonlinear Programming, SIAM-AMS Proceedings. 1976, 53–72
Shanno D F. Conditioning of quasi-Newton methods for function minimization. Math Comp, 1970, 24: 647–656
Sun W, Yuan Y X. Optimization Theory and Methods: Nonlinear Programming. New York: Springer, 2006
Wei Z, Li G, Qi L. New quasi-Newton methods for unconstrained optimization problems. Appl Math Comput, 2006, 175: 1156–1188
Wei Z, Yu G, Yuan G, et al. The superlinear convergence of a modified BFGS-type method for unconstrained optimization. Comput Optim Appl, 2004, 29: 315–332
Yabe H, Ogasawara H, Yoshino M. Local and superlinear convergence of quasi-Newton methods based on modified secant conditions. J Comput Appl Math, 2007, 205: 617–632
Yabe H, Takano M. Global convergence properties of nonlinear conjugate gradient methods with modified secant condition. Comput Optim Appl, 2004, 28: 203–225
Yuan Y X. A modified BFGS algorithm for unconstrained optimization. IMA J Numer Anal, 1991, 11: 325–332
Yuan Y X, Byrd R H. Non-quasi-Newton updates for unconstrained optimization. J Comput Math, 1995, 13: 95–107
Zhang J, Xu C. Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations. J Comput Appl Math, 2001, 137: 269–278
Zhang J Z, Deng N Y, Chen L H. New quasi-Newton equation and related methods for unconstrained optimization. J Optim Theory Appl, 1999, 102: 147–167