
A Framework of Conjugate Direction Methods for Symmetric Linear Systems in Optimization

Journal of Optimization Theory and Applications

Abstract

In this paper, we introduce a parameter-dependent class of Krylov-based methods, namely Conjugate Directions \((CD)\), for the solution of symmetric linear systems. We show that our proposal generates sequences of conjugate directions, extending some properties of the standard conjugate gradient (CG) method in order to preserve conjugacy. For specific values of the parameters in our framework, we obtain schemes equivalent to both the CG and the scaled-CG. We also prove the finite convergence of the algorithms in \(CD\) and provide an error analysis. Finally, preconditioning is introduced for \(CD\), and we show that standard error bounds for the preconditioned CG also hold for the preconditioned \(CD\).
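
For orientation only, the sketch below implements the standard (unpreconditioned) conjugate gradient iteration, i.e., the special case that the \(CD\) framework is stated to recover for a particular parameter choice. It is not the \(CD\) algorithm itself; for simplicity it assumes the symmetric matrix is also positive definite, whereas the paper addresses general symmetric systems, and all names are illustrative.

    import numpy as np

    def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
        # Classical Hestenes-Stiefel CG for Ax = b, with A symmetric positive definite.
        # Shown only as the reference scheme that CD reduces to for specific parameters.
        n = b.shape[0]
        max_iter = n if max_iter is None else max_iter
        x = np.zeros(n)
        r = b - A @ x               # initial residual r_0
        p = r.copy()                # first search direction p_0
        rs_old = r @ r
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rs_old / (p @ Ap)    # step length from exact line search along p_k
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:    # finite termination: at most n steps in exact arithmetic
                break
            beta = rs_new / rs_old       # chosen so that p_{k+1}^T A p_k = 0 (conjugacy)
            p = r + beta * p
            rs_old = rs_new
        return x

    # Usage on a small random symmetric positive definite system.
    rng = np.random.default_rng(0)
    M = rng.standard_normal((40, 40))
    A = M @ M.T + 40 * np.eye(40)
    b = rng.standard_normal(40)
    x = conjugate_gradient(A, b)
    print(np.linalg.norm(A @ x - b))     # small residual norm

The conjugacy property \(p_i^T A p_j = 0\) for \(i \ne j\), enforced here through the scalar beta, is the property that the parameters of the \(CD\) class are designed to preserve in the more general setting.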



Notes

  1. A further generalization might be obtained by computing \(\sigma _{k-1}\) and \(\omega _{k-1}\) so that

    $$\begin{aligned} \left\{ \begin{array}{l} p_k^T A(\gamma _{k-1} A p_{k-1} - \sigma _{k-1} p_{k-1}) = 0, \\ p_k^T A p_{k-2} = 0. \end{array} \right. \end{aligned} \qquad (7)$$
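
    As a small illustration of conditions (7), the snippet below merely verifies them numerically for given vectors and parameters; the names are hypothetical, and the construction of the directions \(p_k\) themselves is not shown in this preview.

    import numpy as np

    def satisfies_conditions_7(A, p_k, p_km1, p_km2, gamma_km1, sigma_km1, tol=1e-12):
        # Check  p_k^T A (gamma_{k-1} A p_{k-1} - sigma_{k-1} p_{k-1}) = 0
        # and    p_k^T A p_{k-2} = 0,  up to a small tolerance.
        c1 = p_k @ (A @ (gamma_km1 * (A @ p_km1) - sigma_km1 * p_km1))
        c2 = p_k @ (A @ p_km2)
        return abs(c1) <= tol and abs(c2) <= tol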


Acknowledgments

The author is indebted to the anonymous reviewers for their fruitful comments. The author also thanks the Italian national research program 'RITMARE', through CNR-INSEAN (National Research Council, Maritime Research Centre), for the support received.

Author information

Corresponding author

Correspondence to Giovanni Fasano.

About this article


Cite this article

Fasano, G. A Framework of Conjugate Direction Methods for Symmetric Linear Systems in Optimization. J Optim Theory Appl 164, 883–914 (2015). https://doi.org/10.1007/s10957-014-0600-0

