Abstract
We study the role of the recently introduced infinite number grossone in two renowned Krylov-subspace methods for symmetric (possibly indefinite) linear systems. We first explore the relationship between the Conjugate Gradient (CG) method and the Lanczos process, along with their shared capability of yielding tridiagonal matrices that retain substantial information on the original system matrix. We then show that, on the one hand, there is no immediate evidence of an advantage from embedding grossone within the Lanczos process; on the other hand, coupling the CG with grossone yields clear theoretical improvements. Furthermore, reformulating the CG iteration within a grossone-based framework also encompasses a number of Krylov-subspace methods relying on conjugacy among vectors. This generalization notably justifies the use of a grossone-based reformulation of the CG to solve indefinite linear systems as well. Finally, pairing the CG with the algebra of grossone readily provides relevant geometric properties of quadratic hypersurfaces.
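As background for the reformulation discussed above, the following sketch shows the standard Hestenes–Stiefel CG iteration (not the grossone-based variant of the chapter). The function name and tolerance parameter are illustrative choices; the comment on the step length marks exactly the quantity whose possible breakdown on indefinite systems motivates the grossone-based treatment.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Standard Hestenes-Stiefel CG for a symmetric positive definite A."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x              # initial residual
    p = r.copy()               # first search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        # step length: undefined/unstable when p'Ap is (near) zero,
        # which can occur for indefinite A -- the breakdown the
        # grossone-based reformulation is designed to handle
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # keep directions A-conjugate
        rs_old = rs_new
    return x
```

For a well-conditioned positive definite system the iterate converges (in exact arithmetic) in at most n steps, matching the direct solution.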
Notes
- 1.
More precisely, we remark that the expressions (33) are obtained by neglecting, in the quantity \(\alpha _{k+1}\), the infinitesimal terms, i.e. those containing negative powers of \(s\textcircled {1}\), which are indeed negligibly small owing to the degenerate Step k in CG\(_{\textcircled {1}}\).
Acknowledgements
The author is grateful to the Editors of the present volume for their great efforts and constant commitment. The author also gratefully acknowledges the support received from both the National Research Council–Marine Technology Research Institute (CNR-INM) and the National Research Group GNCS (Gruppo Nazionale per il Calcolo Scientifico) within INdAM, Istituto Nazionale di Alta Matematica, Italy.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this chapter
Fasano, G. (2022). Krylov-Subspace Methods for Quadratic Hypersurfaces: A Grossone–based Perspective. In: Sergeyev, Y.D., De Leone, R. (eds) Numerical Infinities and Infinitesimals in Optimization. Emergence, Complexity and Computation, vol 43. Springer, Cham. https://doi.org/10.1007/978-3-030-93642-6_4
Print ISBN: 978-3-030-93641-9
Online ISBN: 978-3-030-93642-6
eBook Packages: Mathematics and Statistics (R0)