A family of the modified three-term Hestenes–Stiefel conjugate gradient method with sufficient descent and conjugacy conditions
Abstract
We propose a modification that strengthens the three-term Hestenes–Stiefel conjugate gradient method of Zhang et al. Following the Dai–Liao approach, the third term of the Zhang et al. search direction is multiplied by a positive parameter that can be determined adaptively. To choose this parameter appropriately, we carry out a matrix analysis that guarantees the sufficient descent property of the method. We then establish convergence for both convex and nonconvex cost functions. Finally, numerical experiments demonstrate the efficiency of the proposed method.
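For orientation (the formulas below come from Zhang et al. [Optim. Method. Softw. 22(4), 697–711] and from the description above, not from the body of this paper): the three-term Hestenes–Stiefel direction of Zhang et al. is d_{k+1} = -g_{k+1} + (g_{k+1}^T y_k / d_k^T y_k) d_k - (g_{k+1}^T d_k / d_k^T y_k) y_k with y_k = g_{k+1} - g_k, and the modification described in the abstract multiplies the third term by a positive parameter t_k. The Python sketch below is a minimal illustration under these assumptions; the function name and the fixed placeholder t = 1.0 are hypothetical, since the paper's adaptive rule for t_k is not stated in the abstract.

```python
import numpy as np

def modified_tths_direction(g_new, d_prev, y_prev, t=1.0, eps=1e-12):
    """Sketch of a modified three-term Hestenes-Stiefel (TTHS) direction.

    Computes d = -g_new + beta * d_prev - t * theta * y_prev, where
      beta  = g_new^T y_prev / (d_prev^T y_prev)  (Hestenes-Stiefel parameter)
      theta = g_new^T d_prev / (d_prev^T y_prev)
    and t > 0 scales the third term. The paper determines t adaptively;
    t = 1.0 here is only a placeholder (t = 1 recovers the original
    Zhang et al. direction).
    """
    dy = float(d_prev @ y_prev)
    if abs(dy) < eps:   # safeguard against division by (near) zero:
        return -g_new   # restart with the steepest descent direction
    beta = float(g_new @ y_prev) / dy
    theta = float(g_new @ d_prev) / dy
    return -g_new + beta * d_prev - t * theta * y_prev

# Example with hypothetical data: one update for f(x) = ||x||^2
g_old = np.array([2.0, -4.0]); g_new = np.array([1.0, -2.0])
d_prev = -g_old
d = modified_tths_direction(g_new, d_prev, g_new - g_old)
print(float(d @ g_new), -float(g_new @ g_new))  # equal when t = 1
```

With t = 1 the update satisfies d_{k+1}^T g_{k+1} = -||g_{k+1}||^2 regardless of the line search; for general t_k, the paper's matrix analysis supplies the adaptive choice that preserves sufficient descent.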
References
Amini, K., Faramarzi, P., Pirfalah, N.: A modified Hestenes–Stiefel conjugate gradient method with an optimal property. Optim. Methods Softw. 34(4), 770–782 (2019)
Aminifard, Z., Babaie-Kafaki, S.: Dai–Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing. Numer. Algorithms 89(3), 1369–1387 (2021)
Aminifard, Z., Babaie-Kafaki, S.: Matrix analyses on the Dai–Liao conjugate gradient method. ANZIAM J. 61(2), 195–203 (2019)
Andrei, N.: An adaptive conjugate gradient algorithm for large-scale unconstrained optimization. J. Comput. Appl. Math. 292, 83–91 (2016)
Andrei, N.: A Dai–Liao conjugate gradient algorithm with clustering of eigenvalues. Numer. Algorithms 77(4), 1273–1282 (2018)
Babaie-Kafaki, S., Ghanbari, R.: A descent family of Dai–Liao conjugate gradient methods. Optim. Methods Softw. 29(3), 583–591 (2014)
Babaie-Kafaki, S., Ghanbari, R.: The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices. Eur. J. Oper. Res. 234(3), 625–630 (2014)
Babaie-Kafaki, S., Ghanbari, R.: A class of adaptive Dai–Liao conjugate gradient methods based on the scaled memoryless BFGS update. 4OR 15(1), 85–92 (2017)
Beale, E.M.L.: A derivation of conjugate gradients. In: Lootsma, F.A. (ed.) Numerical Methods for Nonlinear Optimization, pp. 39–43. Academic Press, London (1972)
Bojari, S., Eslahchi, M.R.: Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization. Numer. Algorithms 83(3), 901–933 (2020)
Bojari, S., Eslahchi, M.R.: A five-parameter class of derivative-free spectral conjugate gradient methods for systems of large-scale nonlinear monotone equations. Int. J. Comput. Methods (2022)
Cao, J., Wu, J.: A conjugate gradient algorithm and its applications in image restoration. Appl. Numer. Math. 152, 243–252 (2020)
Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43(1), 87–101 (2001)
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2, Ser. A), 201–213 (2002)
Dong, X.L., Liu, H.W., He, Y.B., Yang, X.M.: A modified Hestenes–Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition. J. Comput. Appl. Math. 281, 239–249 (2015)
Eslahchi, M.R., Bojari, S.: Global convergence of a new sufficient descent spectral three-term conjugate gradient class for large-scale optimization. Optim. Methods Softw. 37(3), 830–843 (2022)
Exl, L., Fischbacher, J., Kovacs, A., Oezelt, H., Gusenbauer, M., Schrefl, T.: Preconditioned nonlinear conjugate gradient method for micromagnetic energy minimization. Comput. Phys. Commun. 235, 179–186 (2019)
Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)
Gould, N.I.M., Orban, D., Toint, Ph.L.: CUTEr and SifDec: a constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29(4), 373–394 (2003)
Hager, W.W., Zhang, H.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32(1), 113–137 (2006)
Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)
Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49(6), 409–436 (1952)
Khoshsimaye-Bargard, M., Ashrafi, A.: A new descent spectral Polak–Ribière–Polyak method based on the memoryless BFGS update. Comput. Appl. Math. 40(8), 1–17 (2021)
Li, L., Xie, X., Gao, T., Wang, J.: A modified conjugate gradient-based Elman neural network. Cogn. Syst. Res. 68, 62–72 (2021)
Nocedal, J., Wright, S.J.: Numerical optimization. Springer, New York (2006)
Perry, A.: A modified conjugate gradient algorithm. Oper. Res. 26(6), 1073–1078 (1978)
Powell, M.J.D.: Nonconvex minimization calculations and the conjugate gradient method. In: Numerical Analysis (Dundee, 1983), Lecture Notes in Mathematics, vol. 1066, pp. 122–141. Springer, Berlin (1984)
Powell, M.J.D.: Convergence properties of algorithms for nonlinear optimization. SIAM Rev. 28(4), 487–500 (1986)
Sugiki, K., Narushima, Y., Yabe, H.: Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization. J. Optim. Theory Appl. 153(3), 733–757 (2012)
Sun, W., Yuan, Y.X.: Optimization theory and methods: nonlinear programming. Springer, New York (2006)
Xue, W., Wan, P., Li, Q., Zhong, P., Yu, G., Tao, T.: An online conjugate gradient algorithm for large-scale data analysis in machine learning. AIMS Math. 6(2), 1515–1537 (2021)
Yao, S., Feng, Q., Li, L., Xu, J.: A class of globally convergent three-term Dai–Liao conjugate gradient methods. Appl. Numer. Math. 151, 354–366 (2020)
Zhang, L., Zhou, W., Li, D.: Some descent three-term conjugate gradient methods and their global convergence. Optim. Methods Softw. 22(4), 697–711 (2007)