This thesis presents techniques for accurately computing a number of fundamental operations on approximate polynomials. The general goal is to determine nearby polynomials which have a non-trivial result for the operation. We proceed by first translating each polynomial operation to a particular structured matrix system, constructed to represent dependencies in the polynomial coefficients. Perturbing this matrix system to a nearby system of reduced rank yields the nearby polynomials that have a non-trivial result. The translation from polynomial operation to matrix system permits the use of emerging methods for solving sophisticated least squares problems. These methods introduce the required dependencies in the system in a structured way, ensuring that a certain minimization is met. This minimization ensures that the determined polynomials are close to the original input. We present translations for the following operations on approximate polynomials:

- Division
- Greatest Common Divisor (GCD)
- Bivariate Factorization
- Decomposition

The least squares problems considered include classical Least Squares (LS), Total Least Squares (TLS) and Structured Total Least Squares (STLS). In particular, we make use of recent developments in the formulation of STLS to perturb the matrix system while maintaining the structure of the original matrix. This allows reconstruction of the resulting polynomials without applying any heuristics or iterative refinements, and guarantees a result for the operation with zero residual.

Underlying the methods for the LS, TLS and STLS problems are varying uses of the Singular Value Decomposition (SVD). This decomposition is also a vital tool for determining appropriate matrix rank, and we spend some time establishing the accuracy of the SVD. We present an algorithm for relatively accurate SVD recently introduced in [8], which is then used to solve the LS and TLS problems. The result is confidence in the use of LS and TLS for the polynomial operations, providing a fair contrast with STLS. The SVD is also used to provide the starting point for our STLS algorithm, with the prescribed guaranteed accuracy.

Finally, we present a generalized implementation of the Riemannian SVD (RiSVD), which can be applied to any structured matrix to determine the result for STLS. This has the advantage of being applicable to all of our polynomial operations, at the cost of decreased efficiency. We also include a novel, yet naive, improvement that relies on randomization to increase efficiency by converting a rectangular system to one that is square.

The results for each of the polynomial operations are presented in detail, and the benefits of each of the least squares solutions are considered. We also present distance bounds that confirm our solutions are within an acceptable tolerance.
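As a concrete illustration of the translation from polynomial operation to structured matrix system, the approximate GCD case is commonly posed through the Sylvester matrix. The abstract does not fix notation, so the following is a hedged sketch in that standard setting rather than the thesis's own formulation.

```latex
% Sketch of the structured low-rank formulation, assuming the standard
% Sylvester-matrix setting for approximate GCD (deg f = m, deg g = n):
% f and g share a non-trivial common divisor exactly when the
% (m+n) x (m+n) Sylvester matrix S(f,g) is rank deficient.
\[
  \min_{\Delta f,\,\Delta g}\;\bigl\|(\Delta f,\Delta g)\bigr\|_2
  \quad\text{subject to}\quad
  \operatorname{rank} S(f+\Delta f,\,g+\Delta g) \;<\; m+n .
\]
% Because the perturbation acts on the polynomial coefficients, the
% perturbed matrix retains the Sylvester structure, so the perturbed
% polynomials can be read off directly with zero residual; this is the
% Structured Total Least Squares (STLS) viewpoint described above.
```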
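The role of the SVD in determining appropriate matrix rank can be sketched as follows. This is a minimal illustration using NumPy and the Sylvester matrix of the GCD case, not the thesis's implementation (which relies on the relatively accurate SVD of [8]); the function names and tolerance are illustrative assumptions.

```python
import numpy as np

def sylvester(f, g):
    """Sylvester matrix of f and g, given as coefficient lists
    (highest degree first); size (m+n) x (m+n) for deg f = m, deg g = n."""
    m, n = len(f) - 1, len(g) - 1
    S = np.zeros((m + n, m + n))
    for i in range(n):                 # n shifted copies of f
        S[i, i:i + m + 1] = f
    for i in range(m):                 # m shifted copies of g
        S[n + i, i:i + n + 1] = g
    return S

def numerical_rank(A, tol=1e-8):
    """Rank estimate from the singular values: count those above tol
    relative to the largest singular value."""
    s = np.linalg.svd(A, compute_uv=False)
    return int(np.sum(s > tol * s[0]))

# f and g are small perturbations of (x - 1)(x + 2) and (x - 1)(x + 3),
# so the Sylvester matrix is numerically rank deficient by one,
# signalling an approximate GCD of degree one.
f = [1.0, 1.0, -2.0 + 1e-9]
g = [1.0, 2.0, -3.0 - 1e-9]
S = sylvester(f, g)
d = len(S) - numerical_rank(S)         # degree of the approximate GCD
print(d)                               # expected: 1
```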
Identifier | oai:union.ndltd.org:LACETR/oai:collectionscanada.gc.ca:OWTU.10012/1035 |
Date | January 2004 |
Creators | Botting, Brad |
Publisher | University of Waterloo |
Source Sets | Library and Archives Canada ETDs Repository / Centre d'archives des thèses électroniques de Bibliothèque et Archives Canada |
Language | English |
Detected Language | English |
Type | Thesis or Dissertation |
Rights | Copyright: 2004, Botting, Brad. All rights reserved. |