1

Recycling Techniques for Sequences of Linear Systems and Eigenproblems

Carr, Arielle Katherine Grim 09 July 2021 (has links)
Sequences of matrices arise in many applications in science and engineering. In this thesis we consider matrices that are closely related (or closely related in groups), and we take advantage of the small differences between them to efficiently solve sequences of linear systems and eigenproblems. Recycling techniques, such as recycling preconditioners or subspaces, are popular approaches for reducing computational cost. In this thesis, we introduce two novel approaches for recycling previously computed information for a subsequent system or eigenproblem, and demonstrate good results for sequences arising in several applications.

Preconditioners are often essential for fast convergence of iterative methods. However, computing a good preconditioner can be very expensive, and when solving a sequence of linear systems, we want to avoid computing a new preconditioner too often. Instead, we can recycle a previously computed preconditioner, for which we have good convergence behavior of the preconditioned system. We propose an update technique we call the sparse approximate map, or SAM update, that approximately maps one matrix to another matrix in our sequence. SAM updates are very cheap to compute and apply, preserve good convergence properties of a previously computed preconditioner, and help to amortize the cost of that preconditioner over many linear solves.

When solving a sequence of eigenproblems, we can reduce the computational cost of constructing the Krylov space starting with a single vector by warm-starting the eigensolver with a subspace instead. We propose an algorithm to warm-start the Krylov-Schur method using a previously computed approximate invariant subspace. We first compute the approximate Krylov decomposition for a matrix with minimal residual, and use this space to warm-start the eigensolver. We account for the residual matrix when expanding, truncating, and deflating the decomposition, and show that the norm of the residual monotonically decreases. This method is effective in reducing the total number of matrix-vector products, and computes an approximate invariant subspace that is as accurate as the one computed with standard Krylov-Schur. In applications where the matrix-vector products require an implicit linear solve, we incorporate Krylov subspace recycling.

Finally, in many applications, sequences of matrices take the special form of the sum of the identity matrix, a very low-rank matrix, and a small-in-norm matrix. We consider convergence rates for GMRES applied to these matrices by identifying the sources of sensitivity.

/ Doctor of Philosophy /

Problems in science and engineering often require the solution to many linear systems, or a sequence of systems, that model the behavior of physical phenomena. In order to construct highly accurate mathematical models to describe this behavior, the resulting matrices can be very large, and therefore the linear system can be very expensive to solve. To efficiently solve a sequence of large linear systems, we often use iterative methods, which can require preconditioning techniques to achieve fast convergence. The preconditioners themselves can be very expensive to compute. So, we propose a cheap update technique that approximately maps one matrix to another in the sequence for which we already have a good preconditioner. We then combine the preconditioner and the map and use the updated preconditioner for the current system.
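To make the SAM update concrete, the sketch below shows one way such a map could be computed: each column of the map N solves a small least-squares problem over a prescribed sparsity pattern so that A_new @ N ≈ A_old, and N is then composed with a preconditioner already available for A_old. This is a minimal illustration under assumed choices (SciPy, the sparsity pattern of A_new as the map pattern, an ILU factorization as the recycled preconditioner, and the names sam_update, A_new, A_old, P_old); it is not the thesis implementation.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def sam_update(A_new, A_old, pattern=None):
    """Compute a sparse map N (on a prescribed pattern) with A_new @ N ~= A_old,
    one small least-squares problem per column."""
    A_new = sp.csc_matrix(A_new)
    A_old = sp.csc_matrix(A_old)
    pattern = sp.csc_matrix(pattern if pattern is not None else A_new)
    n = A_new.shape[1]
    cols = []
    for j in range(n):
        col = np.zeros(n)
        J = pattern[:, j].indices                   # allowed nonzero rows of column j of N
        if len(J) > 0:
            I = np.unique(A_new[:, J].tocoo().row)  # rows touched by those columns of A_new
            rhs = A_old[:, j].toarray().ravel()[I]
            sub = A_new[I, :][:, J].toarray()
            col[J], *_ = np.linalg.lstsq(sub, rhs, rcond=None)
        cols.append(col)
    return sp.csc_matrix(np.column_stack(cols))

# Usage sketch: reuse a preconditioner built for A_old on A_new through the map.
# Since A_new @ N ~= A_old, the operator v -> N @ (P_old^{-1} v) approximates A_new^{-1}.
# P_old = spla.spilu(sp.csc_matrix(A_old))        # preconditioner computed once, earlier
# N = sam_update(A_new, A_old)
# M = spla.LinearOperator(A_new.shape, matvec=lambda v: N @ P_old.solve(v))
# x, info = spla.gmres(A_new, b, M=M)
```

Because each column of the map is computed independently from a small least-squares problem, the update costs far less than recomputing a preconditioner, which is what allows the original preconditioner to be amortized over many solves.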
Sequences of eigenvalue problems also arise in many scientific applications, such as those modeling disk brake squeal in a motor vehicle. To accurately represent this physical system, large eigenvalue problems must be solved. The behavior of certain eigenvalues can reveal instability in the physical system, but to identify these eigenvalues, we must solve a sequence of very large eigenproblems. The eigensolvers used to solve eigenproblems generally begin with a single vector; instead, we propose starting the method with several vectors, or a subspace. This allows us to reduce the total number of iterations required by the eigensolver while still producing an accurate solution. We demonstrate good results for both of these approaches using sequences of linear systems and eigenvalue problems arising in several real-world applications.

Finally, in many applications, sequences of matrices take the special form of the sum of the identity matrix, a very low-rank matrix, and a small-in-norm matrix. We examine the convergence behavior of the iterative method GMRES when solving linear systems with such matrices.
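As a rough illustration of warm-starting with a subspace (a simplified sketch, not the warm-started Krylov-Schur method developed in the thesis), the recycled subspace can be orthonormalized into U and used to form a decomposition A U = U H + R, where H = UᵀAU minimizes the Frobenius norm of the residual R; the eigensolver is then seeded with this decomposition instead of a single starting vector. The names below and the plain expansion step are illustrative assumptions; the thesis method also truncates and deflates the decomposition while accounting for R.

```python
import numpy as np

def warm_start_decomposition(A, V_prev):
    """Orthonormalize a recycled (approximate invariant) subspace and form
    A @ U = U @ H + R, where H = U.T @ A @ U minimizes ||A U - U H||_F."""
    U, _ = np.linalg.qr(V_prev)
    H = U.T @ (A @ U)
    R = A @ U - U @ H
    return U, H, R

def expand(A, U):
    """One simple expansion step: multiply the newest basis vector by A,
    orthogonalize against U, and append (Krylov-Schur would also truncate
    and deflate, which is not shown in this sketch)."""
    w = A @ U[:, -1]
    w -= U @ (U.T @ w)
    w /= np.linalg.norm(w)
    U = np.column_stack([U, w])
    H = U.T @ (A @ U)
    return U, H, A @ U - U @ H

# Usage sketch (assumed setup): V_prev spans the approximate invariant subspace
# computed for the previous matrix in the sequence; eigenvalue estimates are the
# eigenvalues of H, and np.linalg.norm(R) indicates how close span(U) is to an
# invariant subspace of A_next.
# U, H, R = warm_start_decomposition(A_next, V_prev)
# while np.linalg.norm(R) > tol:
#     U, H, R = expand(A_next, U)
```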
2

Reusing and Updating Preconditioners for Sequences of Matrices

Grim-McNally, Arielle Katherine 15 June 2015 (has links)
For sequences of related linear systems, the computation of a preconditioner for every system can be expensive. Often a fixed preconditioner is used, but this may not be effective as the matrix changes. This research examines the benefits of both reusing and recycling preconditioners, with special focus on ILUTP and factorized sparse approximate inverses, and proposes an update that we refer to as a sparse approximate map, or SAM update. Analysis of the residual and eigenvalues of the map will be provided. Applications include the Quantum Monte Carlo method, model reduction, oscillatory hydraulic tomography, diffuse optical tomography, and Helmholtz-type problems. / Master of Science
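As a simple illustration of reusing a preconditioner over a sequence (a sketch under assumed names, tolerances, and thresholds rather than the code developed in this work), one can compute an incomplete LU factorization once and keep applying it to later systems, refactoring only when the GMRES iteration count suggests the preconditioner has gone stale; in the spirit of this research, that refactorization step is where a cheap SAM update could be applied instead.

```python
import scipy.sparse.linalg as spla

def solve_sequence(matrices, rhs_list, max_reuse_iters=50):
    """Solve a sequence of related systems, reusing one ILU preconditioner
    until GMRES needs too many iterations, then refactoring."""
    solutions, ilu = [], None
    for A, b in zip(matrices, rhs_list):
        A = A.tocsc()
        if ilu is None:
            ilu = spla.spilu(A, drop_tol=1e-4, fill_factor=10)   # expensive, done rarely
        M = spla.LinearOperator(A.shape, matvec=ilu.solve)

        iters = 0
        def count(_res):                                          # count GMRES iterations
            nonlocal iters
            iters += 1

        x, info = spla.gmres(A, b, M=M, restart=50, callback=count)
        if info != 0 or iters > max_reuse_iters:
            # Preconditioner has gone stale for this matrix: refactor (or apply a
            # cheap SAM-style update instead of a full refactorization).
            ilu = spla.spilu(A, drop_tol=1e-4, fill_factor=10)
            M = spla.LinearOperator(A.shape, matvec=ilu.solve)
            x, info = spla.gmres(A, b, M=M, restart=50)
        solutions.append(x)
    return solutions

# Usage sketch (assumed setup): matrices is a list of scipy.sparse matrices that
# change slowly along the sequence, rhs_list the corresponding right-hand sides.
# xs = solve_sequence(matrices, rhs_list)
```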
