Learning Stochastic Nonlinear Dynamical Systems Using Non-stationary Linear Predictors

Abdalmoaty, Mohamed. January 2017.
The estimation problem for stochastic nonlinear parametric models is recognized to be very challenging due to the intractability of the likelihood function. Recently, several methods have been developed to approximate the maximum likelihood estimator and the optimal mean-square error predictor using Monte Carlo methods. Albeit asymptotically optimal, these methods come with several computational challenges and fundamental limitations.

The contributions of this thesis can be divided into two main parts. In the first part, approximate solutions to the maximum likelihood problem are explored. Both analytical and numerical approaches, based on the expectation-maximization algorithm and the quasi-Newton algorithm, are considered. While analytical approximations are difficult to analyze, asymptotic guarantees can be established for methods based on Monte Carlo approximations. Yet, Monte Carlo methods come with their own computational difficulties; sampling in high-dimensional spaces requires an efficient proposal distribution to keep the number of required samples reasonable.

In the second part, relatively simple prediction error method estimators are proposed. They are based on non-stationary one-step-ahead predictors that are linear in the observed outputs but nonlinear in the (assumed known) input. These predictors rely only on the first two moments of the model, and computation of the likelihood function is not required. Consequently, the resulting estimators are defined via analytically tractable objective functions in several relevant cases. It is shown that, under mild assumptions, the estimators are consistent and asymptotically normal. When the first two moments are analytically intractable due to the complexity of the model, it is possible to resort to plain Monte Carlo approximations. Numerical examples demonstrate the good performance of the suggested estimators in several cases that are usually considered challenging.
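
As a rough illustration of the second-part estimators (a sketch based only on this abstract; the notation $\mu(\theta)$, $\Sigma(\theta)$, and the unweighted least-squares criterion below are assumptions rather than the thesis' exact formulation), a one-step-ahead predictor that is linear in past outputs can be built from the model's first two moments,

\[
\hat{y}_{\theta}(t \mid t-1) = \mu_t(\theta) + \Sigma_{t,\,1:t-1}(\theta)\,\Sigma_{1:t-1}(\theta)^{-1}\bigl(y_{1:t-1} - \mu_{1:t-1}(\theta)\bigr),
\]

and the parameters estimated by minimizing the resulting prediction errors,

\[
\hat{\theta}_N = \arg\min_{\theta}\; \frac{1}{N}\sum_{t=1}^{N}\bigl\|y_t - \hat{y}_{\theta}(t \mid t-1)\bigr\|^{2}.
\]

Here $\mu(\theta)$ and $\Sigma(\theta)$ denote the mean and covariance of the outputs under the model; both depend on the known input, so the predictor is linear in the observed outputs yet nonlinear in the input, and no likelihood evaluation is needed.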
