As data-driven modeling becomes more prevalent for representing uncertain dynamical systems, concerns arise regarding the reliability of these methods. Recent developments in approximation theory provide a new perspective on these problems. This dissertation analyzes the convergence of two kernel-based, data-driven modeling methods: the reproducing kernel Hilbert space (RKHS) embedding method and the empirical-analytical Lagrangian (EAL) model. RKHS embedding is a non-parametric extension of classical adaptive estimation that embeds the uncertain function in an RKHS, an infinite-dimensional function space. As a result, the original uncertain system of ordinary differential equations is understood as a component of a distributed parameter system. As in the classical approach to adaptive estimation, a novel definition of persistent excitation (PE) is introduced and proven to guarantee pointwise convergence of the function estimate over the PE domain. The finite-dimensional approximation of the RKHS embedding method is based on approximant spaces spanned by kernel basis functions centered at samples in the state space. This dissertation shows that an explicit rate of convergence of the RKHS embedding method can be derived by choosing specific types of native spaces. In particular, when the RKHS is continuously embedded in a Sobolev space, the approximation error is proven to decrease at a rate determined by the fill distance of the samples in the PE domain. This dissertation first studies scalar-valued RKHS and then extends the RKHS embedding method to the estimation of vector-valued uncertain functions. As in the scalar-valued case, the formulation of vector-valued RKHS embedding is proven to be well-posed.
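The fill-distance convergence described above can be observed in a small numerical experiment. The sketch below is illustrative only, not the dissertation's algorithm: it assumes the exponential kernel exp(-|x - y|), whose native space is the Sobolev space H^1(R), and the target function, domain, and sample counts are hypothetical choices. Interpolating from the span of kernel sections centered at ever-denser samples, the sup-norm error shrinks along with the fill distance.

```python
import numpy as np

# Exponential kernel k(x, c) = exp(-|x - c|); its native space is the
# Sobolev space H^1(R), matching the Sobolev-embedding setting above.
def kernel(x, c):
    return np.exp(-np.abs(x - c))

def fill_distance(centers, domain):
    # Largest distance from any point of the domain to its nearest center.
    return max(min(abs(x - c) for c in centers) for x in domain)

def interpolant(f, centers):
    # Interpolate f from span{k(., c_i)} by matching values at the centers.
    K = kernel(centers[:, None], centers[None, :])
    coeffs = np.linalg.solve(K, f(centers))
    return lambda x: kernel(x[:, None], centers[None, :]) @ coeffs

f = np.sin                          # illustrative target function
domain = np.linspace(0.0, np.pi, 400)
fills, errors = [], []
for n in (5, 10, 20, 40):
    centers = np.linspace(0.0, np.pi, n)
    s = interpolant(f, centers)
    fills.append(fill_distance(centers, domain))
    errors.append(float(np.max(np.abs(s(domain) - f(domain)))))
    print(f"n={n:2d}  fill distance={fills[-1]:.4f}  sup error={errors[-1]:.2e}")
```

Doubling the number of centers roughly halves the fill distance, and the sup-norm error decreases with it, in line with rates governed by the fill distance of the samples.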
The notion of partial PE is also generalized, and it is shown that the rate of convergence derived for the scalar-valued approximation still holds for certain separable operator-valued kernels. The second part of this dissertation studies the EAL modeling method, a hybrid mechanical model for Lagrangian systems with uncertain holonomic constraints. For the singularly perturbed form of the system, the kernel method is applied to approximate a penalty potential that is introduced to approximately enforce the constraints. An accuracy confidence function is introduced to characterize the constraint violation of an approximate trajectory, and it is proven that this confidence function can be decomposed into a bias term and a variance term. Numerical simulations examine the factors that affect the error, including spectral filtering, the number of samples, and the accumulation of integration error. / Doctor of Philosophy / As data-driven modeling becomes more prevalent for representing uncertain dynamical systems, concerns arise regarding the reliability of these methods. This dissertation employs recent developments in approximation theory to provide a rigorous error analysis for two kernel-based approaches to modeling dynamical systems. The reproducing kernel Hilbert space (RKHS) embedding method is a non-parametric extension of classical adaptive estimation for identifying uncertain functions in nonlinear systems. By embedding the uncertain function in a properly selected RKHS, the nonlinear state equation in Euclidean space is transformed into a linear evolution in an infinite-dimensional RKHS, where the function estimation error can be characterized directly and precisely. Pointwise convergence of the function estimate is proven over the domain that is persistently excited (PE).
A finite-dimensional approximation can also be constructed within an arbitrarily small error bound. The empirical-analytical Lagrangian (EAL) model is developed to approximate the trajectory of Lagrangian systems with an uncertain configuration manifold. Employing the kernel method, a penalty potential is constructed from the observation data to "push" the trajectory towards the actual configuration manifold. A probabilistic error bound is derived for the distance of the approximate trajectory from the actual manifold. This error bound is proven to contain a bias term and a variance term, both determined by the parameters of the kernel method.
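The bias-variance structure of kernel estimates can be illustrated with a generic kernel ridge regression experiment. This is a hedged sketch, not the EAL model or the accuracy confidence function itself: the Gaussian kernel, bandwidth, regularization values, noise level, and target function are all illustrative assumptions. Fitting repeatedly on fresh noisy samples and averaging separates the squared error at the test points into an empirical bias term and a variance term, both controlled by the parameters of the kernel method.

```python
import numpy as np

rng = np.random.default_rng(0)

def kernel(x, c, sigma=0.3):
    # Gaussian kernel; sigma is an illustrative bandwidth choice.
    return np.exp(-(x - c) ** 2 / (2 * sigma ** 2))

def fit(x_train, y_train, lam):
    # Kernel ridge regression; lam is the regularization parameter.
    K = kernel(x_train[:, None], x_train[None, :])
    alpha = np.linalg.solve(K + lam * np.eye(len(x_train)), y_train)
    return lambda x: kernel(x[:, None], x_train[None, :]) @ alpha

truth = np.sin                        # stand-in for the uncertain map
x_train = np.linspace(0.0, np.pi, 30)
x_test = np.linspace(0.0, np.pi, 50)
noise = 0.1

results = {}
for lam in (1e-4, 1e-1):
    preds = []
    for _ in range(200):              # repeated noisy observation sets
        y = truth(x_train) + noise * rng.standard_normal(x_train.size)
        preds.append(fit(x_train, y, lam)(x_test))
    preds = np.array(preds)
    bias2 = float(np.mean((preds.mean(axis=0) - truth(x_test)) ** 2))
    var = float(np.mean(preds.var(axis=0)))
    results[lam] = (bias2, var)
    print(f"lam={lam:.0e}  bias^2={bias2:.2e}  variance={var:.2e}")
```

Heavier regularization smooths more aggressively, trading increased bias for reduced variance; light regularization does the opposite, mirroring the two-term decomposition of the error bound.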
Identifer | oai:union.ndltd.org:VTETD/oai:vtechworks.lib.vt.edu:10919/101902 |
Date | 14 January 2021 |
Creators | Guo, Jia |
Contributors | Mechanical Engineering, Kurdila, Andrew J., Burns, John A., Leonessa, Alexander, Woolsey, Craig A., Southward, Steve C. |
Publisher | Virginia Tech |
Source Sets | Virginia Tech Theses and Dissertations |
Detected Language | English |
Type | Dissertation |
Format | ETD, application/pdf |
Rights | In Copyright, http://rightsstatements.org/vocab/InC/1.0/ |