1. Matrix polynomials and equations
Olawuyi, Paul O., 01 August 1980
The primary intent of this thesis is to uncover the presence of matrices in polynomials and to demonstrate that wherever large numbers of interlocking relationships must be handled, it is reasonable to expect matrices to appear and lend their strength to the task. Above all, I bring out this pervasive role of matrices in polynomials and in matrix equations. Matrix polynomials and equations belong, in the main, to algebra, but it has become increasingly clear that their utility extends beyond algebra into other regions of mathematics. More than this, they are precisely the means needed for expressing many ideas of applied mathematics. My thesis illustrates this for polynomials and equations of matrices.
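As a concrete illustration of a matrix satisfying a polynomial equation (an illustration added here, not drawn from the thesis), the short Python sketch below evaluates the characteristic polynomial of a 2x2 matrix at the matrix itself and checks the Cayley-Hamilton identity; the matrix A is an arbitrary example.

# A minimal sketch: evaluate the matrix polynomial p(A) = A^2 - tr(A) A + det(A) I
# and verify that a 2x2 matrix satisfies its own characteristic equation.
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
I = np.eye(2)

# Characteristic polynomial of a 2x2 matrix: p(x) = x^2 - tr(A) x + det(A).
p_of_A = A @ A - np.trace(A) * A + np.linalg.det(A) * I

print(p_of_A)  # expected: the zero matrix (Cayley-Hamilton)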
2. A new block Krylov subspace framework with applications to functions of matrices acting on multiple vectors
Lund, Kathryn, January 2018
We propose a new framework for understanding block Krylov subspace methods, which hinges on a matrix-valued inner product. We can recast the "classical" block Krylov methods, such as O'Leary's block conjugate gradients, global methods, and loop-interchange methods, within this framework. Leveraging the generality of the framework, we develop an efficient restart procedure and error bounds for the shifted block full orthogonalization method (Sh-BFOM(m)). Regarding BFOM as the prototypical block Krylov subspace method, we propose another formalism, which we call modified BFOM, and show that block GMRES and the new block Radau-Lanczos method can be regarded as modified BFOM. In analogy to Sh-BFOM(m), we develop an efficient restart procedure for shifted BGMRES with restarts (Sh-BGMRES(m)), as well as error bounds. Using this framework and shifted block Krylov methods with restarts as a foundation, we formulate block Krylov subspace methods with restarts for matrix functions acting on multiple vectors, f(A)B. We obtain convergence bounds for B(FOM)^2 (BFOM for Functions Of Matrices) and block harmonic methods (i.e., BGMRES-like methods) for matrix functions. With various numerical examples, we illustrate our theoretical results on Sh-BFOM and Sh-BGMRES. We also analyze the matrix polynomials associated with the residuals of these methods. Through a variety of real-life applications, we demonstrate the robustness and versatility of B(FOM)^2 and block harmonic methods for matrix functions. A particularly interesting example is the tensor t-function, our proposed definition for the function of a tensor in the tensor t-product formalism. Despite the lack of convergence theory, we also show that the block Radau-Lanczos modification can reduce the number of cycles required to converge for both linear systems and matrix functions. / Mathematics
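To make the projection idea concrete, here is a minimal Python sketch of a block Arnoldi/FOM-type approximation of f(A)B using the classical block inner product, with no shifts and no restarts; the function name block_fom_matfunc, the choice f = exp, and all problem sizes are illustrative assumptions, not the implementation described in the thesis.

# A minimal sketch, assuming the classical block inner product and no restarts.
import numpy as np
from scipy.linalg import expm, qr

def block_fom_matfunc(A, B, f, m):
    """Approximate f(A) @ B from the block Krylov space K_m(A, B)."""
    n, s = B.shape
    V = np.zeros((n, (m + 1) * s))
    H = np.zeros(((m + 1) * s, m * s))

    # Block "normalization": B = V_1 * beta via a thin QR factorization.
    Q, beta = qr(B, mode='economic')
    V[:, :s] = Q

    for j in range(m):
        W = A @ V[:, j * s:(j + 1) * s]
        # Block modified Gram-Schmidt orthogonalization against previous blocks.
        for i in range(j + 1):
            Hij = V[:, i * s:(i + 1) * s].T @ W
            H[i * s:(i + 1) * s, j * s:(j + 1) * s] = Hij
            W -= V[:, i * s:(i + 1) * s] @ Hij
        Q, R = qr(W, mode='economic')
        H[(j + 1) * s:(j + 2) * s, j * s:(j + 1) * s] = R
        V[:, (j + 1) * s:(j + 2) * s] = Q

    Hm = H[:m * s, :m * s]
    E1beta = np.zeros((m * s, s))
    E1beta[:s, :] = beta
    # FOM-type approximation: f(A) B is approximated by V_m f(H_m) E_1 beta.
    return V[:, :m * s] @ (f(Hm) @ E1beta)

# Tiny usage example with f = exp on a random test problem.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 200)) / 20
B = rng.standard_normal((200, 3))
approx = block_fom_matfunc(A, B, expm, m=15)
print(np.linalg.norm(approx - expm(A) @ B))  # error of the block FOM approximation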
3. Computation of invariant pairs and matrix solvents / Calcul de paires invariantes et solvants matriciels
Segura Ugalde, Esteban, 01 July 2015
In this thesis, we study some symbolic-numeric aspects of the invariant pair problem for matrix polynomials. Invariant pairs extend the notion of eigenvalue-eigenvector pairs, providing a counterpart of invariant subspaces for the nonlinear case. They have applications in the numerical computation of several eigenvalues of a matrix polynomial; they are also of interest in the context of differential systems. Here, a contour integral formulation is applied to compute condition numbers and backward errors for invariant pairs. We then adapt the Sakurai-Sugiura moment method to the computation of invariant pairs, including some classes of problems that have multiple eigenvalues, and we analyze the behavior of the scalar and block versions of the method in the presence of different multiplicity patterns. Results obtained via direct approaches may need to be refined numerically using an iterative method: here we study and compare two variants of Newton's method applied to the invariant pair problem. The matrix solvent problem is closely related to invariant pairs. Therefore, we specialize our results on invariant pairs to the case of matrix solvents, thus obtaining formulations for the condition number and backward errors, and a moment-based computational approach. Furthermore, we investigate the relation between the matrix solvent problem and the triangularization of matrix polynomials.
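To illustrate the connection between matrix solvents and Newton's method, the following Python sketch applies Newton iteration to a quadratic matrix polynomial P(X) = A0 + A1 X + A2 X^2, solving the linearized (generalized Sylvester) equation at each step by Kronecker-product vectorization; this is a small illustrative example, not one of the algorithms studied in the thesis, and the test matrices are assumptions chosen so that a solvent is known in advance.

# A minimal sketch of Newton's method for a matrix solvent of P(X) = A0 + A1 X + A2 X^2.
# Each step solves the linearized equation (A1 + A2 S) E + A2 E S = -P(S),
# here, for small sizes only, via Kronecker-product vectorization.
import numpy as np

def newton_solvent(A0, A1, A2, S0, tol=1e-12, maxit=50):
    n = A0.shape[0]
    I = np.eye(n)
    S = S0.copy()
    for _ in range(maxit):
        P = A0 + A1 @ S + A2 @ S @ S          # residual P(S)
        if np.linalg.norm(P) < tol:
            break
        # Frechet derivative DP(S)[E] = (A1 + A2 S) E + A2 E S, vectorized with
        # vec(M E) = (I kron M) vec(E) and vec(E N) = (N^T kron I) vec(E).
        J = np.kron(I, A1 + A2 @ S) + np.kron(S.T, A2)
        E = np.linalg.solve(J, -P.reshape(-1, order='F')).reshape(n, n, order='F')
        S = S + E
    return S

# Usage on a small example with a known solvent: A2 = I, A1 = -(X + Y), A0 = X Y
# with commuting (diagonal) X and Y, so that P(X) = X Y - Y X = 0, i.e. X is a solvent.
X = np.diag([1.0, 2.0, 3.0])
Y = np.diag([4.0, 5.0, 6.0])
A2, A1, A0 = np.eye(3), -(X + Y), X @ Y
S = newton_solvent(A0, A1, A2, S0=np.eye(3) * 1.5)
print(np.linalg.norm(A0 + A1 @ S + A2 @ S @ S))  # residual of the computed solvent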