
Novel preconditioners based on quasi-Newton updates for nonlinear conjugate gradient methods

  • Original Paper
  • Published in: Optimization Letters

Abstract

In this paper we study new preconditioners to be used within the nonlinear conjugate gradient (NCG) method for large-scale unconstrained optimization. The rationale behind our proposal draws inspiration from quasi-Newton updates, and its aim is to approximate, in some sense, the inverse of the Hessian matrix. In particular, at the current iteration of the NCG method we consider preconditioners based on new low-rank quasi-Newton symmetric updating formulae, obtained as a by-product of the NCG method at the previous steps. The results of extensive numerical experiments are also reported, showing the effectiveness, efficiency and robustness of this approach, which suggests promising guidelines for further studies.
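To fix ideas, the sketch below shows where a preconditioner of this kind enters a preconditioned NCG iteration: the pair (s, y) collected as a by-product of the latest step is used to apply an inverse-Hessian approximation to the gradient before the search direction is formed. This is a minimal illustration under stated assumptions, not the authors' proposal: in place of the paper's new low-rank updating formulae it substitutes the classical memoryless BFGS update (which likewise builds an inverse-Hessian approximation from (s, y)), together with a preconditioned Polak-Ribiere-plus coefficient and a simple Armijo backtracking line search; all function names are illustrative.

```python
# Minimal sketch of a preconditioned NCG iteration (illustrative only:
# the preconditioner here is the memoryless BFGS update, a hypothetical
# stand-in for the paper's low-rank quasi-Newton updating formulae).
import numpy as np

def apply_memoryless_bfgs(g, s, y):
    """Return M @ g, where M is the memoryless BFGS update of the identity:
        M = (I - s y^T / y^T s)(I - y s^T / y^T s) + s s^T / y^T s.
    M is symmetric positive definite whenever the curvature y^T s > 0."""
    sy = y @ s
    if sy <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y):
        return g.copy()                      # weak curvature: fall back to M = I
    v = g - ((s @ g) / sy) * y               # v = (I - y s^T / sy) g
    return v - ((y @ v) / sy) * s + ((s @ g) / sy) * s

def preconditioned_ncg(f, grad, x0, tol=1e-6, max_iter=2000):
    """Preconditioned NCG with a PR+ coefficient and Armijo backtracking
    (a Wolfe line search would be used in a serious implementation)."""
    x = x0.astype(float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent (M = I)
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Armijo backtracking line search
        alpha, fx, slope = 1.0, f(x), g @ d
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope and alpha > 1e-16:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g          # by-products of the NCG step
        pg_new = apply_memoryless_bfgs(g_new, s, y)   # ~ inverse Hessian times g_{k+1}
        pg = apply_memoryless_bfgs(g, s, y)           # ~ inverse Hessian times g_k
        beta = max(0.0, (g_new @ (pg_new - pg)) / (g @ pg))  # preconditioned PR+
        d = -pg_new + beta * d
        if g_new @ d >= 0.0:                 # safeguard: restart on non-descent
            d = -pg_new
        x, g = x_new, g_new
    return x

# Usage on the 2-D Rosenbrock function:
rosen = lambda x: (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
rosen_grad = lambda x: np.array([
    -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
    200.0 * (x[1] - x[0]**2),
])
print(preconditioned_ncg(rosen, rosen_grad, np.array([-1.2, 1.0])))
```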



Acknowledgments

G. Fasano thanks the National Research Council-Marine Technology Research Institute (CNR-INSEAN) for its indirect support within the project RITMARE 2012–2016. The authors wish to thank the anonymous referees for their helpful comments and suggestions, which helped improve the paper.

Author information

Corresponding author

Correspondence to Giovanni Fasano.


About this article


Cite this article

Caliciotti, A., Fasano, G. & Roma, M. Novel preconditioners based on quasi-Newton updates for nonlinear conjugate gradient methods. Optim Lett 11, 835–853 (2017). https://doi.org/10.1007/s11590-016-1060-2

