Abstract
In this paper, we deal with matrix-free preconditioners for nonlinear conjugate gradient (NCG) methods. In particular, we review proposals based on quasi-Newton updates, satisfying either the secant equation or a secant-like condition at some of the previous iterates. We give conditions proving that, in a precise sense, the proposed preconditioners also approximate the inverse of the Hessian matrix. The structure of the preconditioners depends both on low-rank updates and on some specific parameters, the low-rank updates being obtained as a by-product of the NCG iterations. Moreover, we consider the possibility of embedding damped techniques within a class of preconditioners based on quasi-Newton updates. Damped methods have proved effective in enhancing the performance of quasi-Newton updates in those cases where the Wolfe linesearch conditions are difficult to fulfill. Our purpose is to extend the idea behind damped methods to NCG schemes as well, following a novel line of research in the literature. Results summarizing an extensive numerical experience on large-scale CUTEst problems are reported, showing that these approaches can considerably improve the performance of NCG methods.
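For the reader's convenience, we recall the two ingredients the abstract refers to, in a standard form from the quasi-Newton literature; the specific schemes and parameter choices adopted in the paper may differ. Writing $s_k = x_{k+1} - x_k$ and $y_k = \nabla f(x_{k+1}) - \nabla f(x_k)$, a matrix-free preconditioner $M_{k+1}$, acting as an approximation of the inverse Hessian, is asked to satisfy the secant equation at the latest pair,

$$
M_{k+1} \, y_k = s_k ,
$$

or a secant-like counterpart of it at some of the previous iterates. Damped techniques, in the spirit of Powell's classical modification of the BFGS update, replace $y_k$ by the convex combination

$$
\hat{y}_k = \varphi_k \, y_k + (1 - \varphi_k) \, B_k s_k ,
\qquad
\varphi_k =
\begin{cases}
1 & \text{if } s_k^T y_k \ge \sigma \, s_k^T B_k s_k ,\\[4pt]
\dfrac{(1-\sigma)\, s_k^T B_k s_k}{\, s_k^T B_k s_k - s_k^T y_k \,} & \text{otherwise,}
\end{cases}
$$

where $B_k$ is the current Hessian approximation and $\sigma \in (0,1)$ (Powell originally used $\sigma = 0.2$). This guarantees $s_k^T \hat{y}_k = \sigma \, s_k^T B_k s_k > 0$ whenever $B_k$ is positive definite, so the update remains well defined even when the Wolfe linesearch conditions fail to hold.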
Acknowledgements
The research is partially supported by the Italian Flagship Project RITMARE, coordinated by the Italian National Research Council and funded by the Italian Ministry of Education, University and Research.