
Quasi-Newton Based Preconditioning and Damped Quasi-Newton Schemes for Nonlinear Conjugate Gradient Methods

  • Conference paper
  • In: Numerical Analysis and Optimization (NAO 2017)

Part of the book series: Springer Proceedings in Mathematics & Statistics (PROMS, volume 235)

Abstract

In this paper, we deal with matrix-free preconditioners for nonlinear conjugate gradient (NCG) methods. In particular, we review proposals based on quasi-Newton updates that satisfy either the secant equation or a secant-like equation at some of the previous iterates. Conditions are given proving that, in a certain sense, the proposed preconditioners also approximate the inverse of the Hessian matrix. The structure of the preconditioners depends both on low-rank updates and on some specific parameters, the low-rank updates being obtained as a by-product of the NCG iterations. Moreover, we consider the possibility of embedding damped techniques within a class of preconditioners based on quasi-Newton updates. Damped methods have proved effective in enhancing the performance of quasi-Newton updates in those cases where the Wolfe linesearch conditions are hardly fulfilled. The purpose is to extend the idea behind damped methods to the improvement of NCG schemes as well, following a novel line of research in the literature. The results of an extended numerical experience on large-scale CUTEst problems are reported, showing that these approaches can considerably improve the performance of NCG methods.
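To make the interplay between the quasi-Newton preconditioner, the curvature pair and the damping concrete, the sketch below is a minimal Python illustration, not the scheme of the paper. It combines standard building blocks: a memoryless BFGS approximation of the inverse Hessian applied matrix-free as the preconditioner, Powell-style damping of the pair (s_k, y_k) with the identity standing in for the current Hessian approximation, a PR+ choice of the parameter beta, and a plain Armijo backtracking search in place of a Wolfe line search. The routine names (prec_ncg, damped_pair, memoryless_bfgs_apply) and the damping constant 0.2 are illustrative assumptions, not taken from the paper.

```python
import numpy as np


def damped_pair(s, y, delta=0.2):
    """Powell-style damping of the curvature pair (s, y).

    When s'y is too small (the Wolfe curvature condition nearly fails),
    y is replaced by a convex combination of y and s (the identity plays
    the role of the current Hessian approximation here), so that
    s'y_bar = delta * s's > 0 and the quasi-Newton update stays well defined.
    The constant delta = 0.2 is an illustrative choice.
    """
    sy, ss = float(s @ y), float(s @ s)
    if sy >= delta * ss:
        return y
    phi = (1.0 - delta) * ss / (ss - sy)
    return phi * y + (1.0 - phi) * s


def memoryless_bfgs_apply(g, s, y):
    """Apply the memoryless (single-pair) inverse BFGS matrix to g.

    Computes H g with H = (I - rho*s*y')(I - rho*y*s') + rho*s*s',
    rho = 1/(s'y), without ever forming H: a matrix-free preconditioner.
    """
    rho = 1.0 / float(s @ y)
    v = g - rho * float(s @ g) * y
    return v - rho * float(y @ v) * s + rho * float(s @ g) * s


def prec_ncg(f, grad, x0, max_iter=500, tol=1e-6):
    """Preconditioned NCG with a damped memoryless-BFGS preconditioner
    and a PR+ beta; an Armijo backtracking search stands in for a Wolfe
    line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Armijo backtracking line search
        alpha, fx, slope = 1.0, f(x), float(g @ d)
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        y = damped_pair(s, y)                # enforce s'y > 0 via damping
        # matrix-free preconditioned gradients and PR+ beta
        pg_new = memoryless_bfgs_apply(g_new, s, y)
        pg_old = memoryless_bfgs_apply(g, s, y)
        beta = max(float(g_new @ (pg_new - pg_old)) / float(g @ pg_old), 0.0)
        d = -pg_new + beta * d
        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    # quick smoke test on the 2-d Rosenbrock function
    f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
        200 * (x[1] - x[0] ** 2),
    ])
    print(prec_ncg(f, grad, np.array([-1.2, 1.0])))
```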

Acknowledgements

The research is partially supported by the Italian Flagship Project RITMARE, coordinated by the Italian National Research Council and funded by the Italian Ministry of Education, University and Research.

Author information

Correspondence to Giovanni Fasano.

Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this paper

Cite this paper

Al-Baali, M., Caliciotti, A., Fasano, G., Roma, M. (2018). Quasi-Newton Based Preconditioning and Damped Quasi-Newton Schemes for Nonlinear Conjugate Gradient Methods. In: Al-Baali, M., Grandinetti, L., Purnama, A. (eds) Numerical Analysis and Optimization. NAO 2017. Springer Proceedings in Mathematics & Statistics, vol 235. Springer, Cham. https://doi.org/10.1007/978-3-319-90026-1_1
