Abstract
In this paper we study new preconditioners to be used within the nonlinear conjugate gradient (NCG) method for large-scale unconstrained optimization. The rationale behind our proposal draws inspiration from quasi-Newton updates, and its aim is to approximate, in some sense, the inverse of the Hessian matrix. In particular, at the current iteration of the NCG method we consider preconditioners based on new low-rank quasi-Newton symmetric updating formulae, obtained as a by-product of the NCG method at the previous steps. Results from extensive numerical experiments are also reported, showing the effectiveness, efficiency, and robustness of this approach, which suggests promising guidelines for further studies.
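To make the setting concrete, the following is a minimal Python sketch of a preconditioned NCG iteration (a Polak–Ribière "plus" variant). The operator `M_apply`, which should approximate the action of the inverse Hessian, stands in for the paper's quasi-Newton-based preconditioner and is simply supplied by the caller; the Armijo backtracking line search and the restart safeguard are illustrative choices, not the authors' scheme.

```python
import numpy as np

def precond_ncg(f, grad, x0, M_apply, max_iter=500, tol=1e-8):
    """Preconditioned nonlinear CG (Polak-Ribiere "plus" variant, sketch).

    M_apply(g) should return an approximation of H^{-1} g; here it is a
    caller-supplied stand-in for a quasi-Newton-based preconditioner."""
    x = np.asarray(x0, dtype=float).copy()
    g = grad(x)
    z = M_apply(g)
    d = -z
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        slope = g @ d
        if slope >= 0.0:              # safeguard: restart along -M g
            d = -z
            slope = g @ d
        # Backtracking Armijo line search (a simple stand-in for the
        # strong Wolfe line search usually paired with NCG).
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + alpha * d) > fx + c * alpha * slope:
            alpha *= rho
        x = x + alpha * d
        g_new = grad(x)
        z_new = M_apply(g_new)
        # Preconditioned Polak-Ribiere coefficient, truncated at zero
        beta = max(0.0, g_new @ (z_new - z) / (g @ z))
        d = -z_new + beta * d
        g, z = g_new, z_new
    return x
```

With an exact inverse-Hessian preconditioner on a convex quadratic, the first step already lands on the minimizer; with the identity (i.e., unpreconditioned NCG) the method takes more iterations, which is the gap the paper's preconditioners aim to close.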
Acknowledgments
G. Fasano thanks the National Research Council–Marine Technology Research Institute (CNR-INSEAN) for its indirect support within the project RITMARE 2012–2016. The authors wish to thank the anonymous referees for their helpful comments and suggestions, which helped to improve the paper.
Caliciotti, A., Fasano, G. & Roma, M. Novel preconditioners based on quasi-Newton updates for nonlinear conjugate gradient methods. Optim Lett 11, 835–853 (2017). https://doi.org/10.1007/s11590-016-1060-2