1. Parametric covariance assignment using a reduced-order closed-form covariance model

Zhang, Qichun; Wang, Z.; Wang, H. 3 October 2019

This paper presents a novel closed-form covariance model, based on covariance matrix decomposition, for both continuous-time and discrete-time stochastic systems subject to Gaussian noise. Unlike existing covariance models, the order of the presented model can be reduced to the order of the original system, and its parameters can be obtained via the Kronecker product and the Hadamard product, which yield a uniform expression. Furthermore, the associated controller design is simplified by the reduced-order structure of the model. Based on this model, state and output covariance assignment algorithms are developed with parametric state and output feedback, where the computational complexity is reduced and the extra free parameters of the parametric feedback give the optimization additional flexibility. As an extension, a reduced-order closed-form covariance model for stochastic systems with parameter uncertainties is also presented. A simulated example shows the effectiveness of the proposed control algorithm, with encouraging results.

Funding: National Natural Science Foundation of China, grants 61573022, 61290323 and 61333007.
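The abstract leaves the underlying computation implicit. As background orientation only — this is the standard full-order route, not the paper's reduced-order model — the steady-state covariance of a linear system dx = Ax dt + D dw solves the Lyapunov equation A P + P Aᵀ + D Dᵀ = 0, which the Kronecker product turns into an ordinary linear solve. The matrices A and D below are illustrative, not taken from the paper.

```python
import numpy as np

def steady_state_covariance(A, D):
    """Full-order steady-state covariance of dx = A x dt + D dw.

    Solves the Lyapunov equation A P + P A^T + D D^T = 0 by
    vectorising it with the Kronecker product:
        (I (x) A + A (x) I) vec(P) = -vec(D D^T).
    A must be Hurwitz for a unique solution to exist.
    """
    n = A.shape[0]
    I = np.eye(n)
    K = np.kron(I, A) + np.kron(A, I)            # Kronecker-sum operator
    p = np.linalg.solve(K, -(D @ D.T).reshape(-1))
    return p.reshape(n, n)

# Illustrative stable 2-state system (not from the paper)
A = np.array([[-1.0, 0.2],
              [ 0.0, -2.0]])
D = np.array([[1.0],
              [0.5]])
P = steady_state_covariance(A, D)
```

The cost of this full-order route is a solve of size n² × n², which is the kind of overhead a reduced-order covariance model is designed to avoid.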
2. Generalized Hebbian Algorithm for Dimensionality Reduction in Natural Language Processing

Gorrell, Genevieve. January 2006

The current surge of interest in search and comparison tasks in natural language processing has brought with it a focus on vector space approaches and vector space dimensionality reduction techniques. Presenting data as points in hyperspace provides opportunities to use a variety of well-developed tools pertinent to this representation. Dimensionality reduction allows data to be compressed and generalised. Eigen decomposition and related algorithms are one category of approaches to dimensionality reduction, providing a principled way to reduce data dimensionality that has time and again shown itself capable of enabling access to powerful generalisations in the data. Issues with the approach, however, include computational complexity and limitations on the size of dataset that can reasonably be processed in this way. Large datasets are a persistent feature of natural language processing tasks.

This thesis focuses on two main questions. Firstly, in what ways can eigen decomposition and related techniques be extended to larger datasets? Secondly, this having been achieved, of what value is the resulting approach to information retrieval and to statistical language modelling at the n-gram level? The applicability of eigen decomposition is shown to be extendable through the use of an extant algorithm, the Generalized Hebbian Algorithm (GHA), and a novel extension of this algorithm to paired data, the Asymmetric Generalized Hebbian Algorithm (AGHA). Several original extensions to these algorithms are also presented, improving their applicability in various domains. The applicability of GHA to Latent Semantic Analysis-style tasks is investigated. Finally, AGHA is used to investigate the value of singular value decomposition, an eigen decomposition variant, to n-gram language modelling. A sizeable perplexity reduction is demonstrated.
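For readers unfamiliar with GHA, Sanger's update rule estimates the leading principal directions one sample at a time, without ever forming the full covariance matrix — the property that makes the approach attractive for the large datasets the abstract mentions. The following is a minimal textbook sketch, not the thesis's implementation; the step size, epoch count, and data are illustrative choices.

```python
import numpy as np

def gha(X, k, eta=0.005, epochs=100, seed=0):
    """Sanger's Generalized Hebbian Algorithm.

    Estimates the top-k principal directions of the rows of X
    (n_samples x d) from streamed samples, so the d x d covariance
    matrix is never materialised.
    """
    rng = np.random.default_rng(seed)
    W = 0.1 * rng.normal(size=(k, X.shape[1]))   # rows -> directions
    for _ in range(epochs):
        for x in X:
            y = W @ x                            # component activations
            # Hebbian term minus deflation by the earlier components
            W += eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W / np.linalg.norm(W, axis=1, keepdims=True)
```

Because each update touches only one sample and a k × d weight matrix, memory stays O(kd) regardless of corpus size, which is the scaling argument for using GHA on large term-document or n-gram data.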
