  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
81

Parameter Estimation Using Consensus Building Strategies with Application to Sensor Networks

Dasgupta, Kaushani 12 1900 (has links)
A wireless sensor network (WSN) with a large number of nodes is an effective tool for gathering data in many situations, and the choice of communication protocol has a significant impact on how quickly the network converges on inference tasks. Parameter estimation is one of the most important applications of sensor networks, and modeling such large, complex networks calls for strategies and algorithms that converge in less time. To address this challenge, this work proposes a multilayer network structure for parameter estimation and compares it against the well-known distributed gossip algorithm. The proposed multicast multilayer algorithm was simulated on a network estimating the parameters of a two-component Gaussian mixture model, and the two algorithms were compared by the number of iterations the Expectation-Maximization algorithm required to reach convergence. A series of theoretical and practical results explicitly shows that multicast outperforms gossip for consensus-based estimation in large and complex networks.
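The gossip baseline in the abstract can be illustrated with a minimal sketch: randomized pairwise averaging drives every node's local value toward the global mean. This is a generic toy simulation, not the thesis's algorithm; the node values and round count are made up for illustration.

```python
import random

def gossip_average(values, pair_rounds=2000, seed=0):
    """Randomized pairwise gossip: at each step two randomly chosen
    nodes replace their values with the pair's average. The global
    mean is preserved and every node converges toward it."""
    rng = random.Random(seed)
    x = list(values)
    n = len(x)
    for _ in range(pair_rounds):
        i, j = rng.sample(range(n), 2)
        avg = 0.5 * (x[i] + x[j])
        x[i] = x[j] = avg
    return x

readings = [float(k) for k in range(20)]   # hypothetical sensor readings
estimates = gossip_average(readings)
# after enough rounds, every node holds a value near the mean (9.5 here)
```

The number of pairwise rounds needed grows with network size, which is exactly the convergence cost that motivates comparing gossip against multicast multilayer schemes.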
82

Probing the early universe and dark energy with multi-epoch cosmological data

Hlozek, Renee Alexandra January 2012 (has links)
Contemporary cosmology is a vibrant field, with data and observations increasing rapidly. This allows for accurate estimation of the parameters describing our cosmological model. In this thesis we present new research based on two different types of cosmological observations, which probe the universe at multiple epochs. We begin by reviewing the current concordance cosmological paradigm, and the statistical tools used to perform parameter estimation from cosmological data. We highlight the initial conditions in the universe and how they are detectable using the Cosmic Microwave Background radiation. We present the angular power spectrum data from temperature observations made with the Atacama Cosmology Telescope (ACT) and the methods used to estimate the power spectrum from temperature maps of the sky. We then present a cosmological analysis using the ACT data in combination with observations from the Wilkinson Microwave Anisotropy Probe to constrain parameters such as the effective number of relativistic species and the spectral index of the primordial power spectrum, which we constrain to deviate from scale invariance at the 99% confidence limit. We then use this combined dataset to constrain the primordial power spectrum in a minimally parametric framework, finding no evidence for deviation from a power-law spectrum. Finally we present Bayesian Estimation Applied to Multiple Species, a parameter estimation technique using photometric Type Ia Supernova data to estimate cosmological parameters in the presence of contaminated data. We apply this algorithm to the full season of the Sloan Digital Sky Survey II Supernova Search, and find that the constraints are improved by a factor of three relative to the case where one uses a smaller, spectroscopically confirmed subset of supernovae.
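The kind of parameter constraint described above (e.g. on the spectral index) can be caricatured with a toy grid-based chi-square fit to mock band powers. This is not the ACT/WMAP pipeline: the power-law model, multipoles, and noise level below are invented for illustration, and a real analysis marginalizes over many parameters with MCMC rather than scanning one on a grid.

```python
# Toy model: band powers follow a power law C_l = A * (l / l0) ** (ns - 1)
def model(ell, A, ns, l0=150.0):
    return A * (ell / l0) ** (ns - 1.0)

ells = [100.0, 200.0, 400.0, 800.0]
A_true, ns_true = 1.0, 0.96
data = [model(l, A_true, ns_true) for l in ells]   # noiseless mock data
sigma = 0.01                                        # assumed per-band error

# Chi-square over a 1D grid in ns with the amplitude held fixed
grid = [0.90 + 0.001 * k for k in range(101)]
chi2 = [sum(((d - model(l, A_true, ns)) / sigma) ** 2
            for l, d in zip(ells, data)) for ns in grid]
best = grid[chi2.index(min(chi2))]
# the minimum-chi-square grid point recovers ns_true = 0.96
```

The width of the chi-square valley around the minimum is what sets the quoted confidence limit on a deviation from scale invariance (ns = 1).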
83

Parameter study for WinDAM using DAKOTA

Bhat, Ashwin Ramachandra January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / Mitchell L. Neilsen / Windows™ Dam Analysis Modules (WinDAM) is a set of modular software components that can be used to analyze overtopped earthen embankments and internal erosion of embankment dams. Sandia National Laboratories' DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides built-in algorithms for iterative analysis, including uncertainty quantification with sampling and parameter study methods. This software integrates the DAKOTA suite with WinDAM: it provides a user interface for entering and manipulating parameters, and performs centered and multi-dimensional parameter studies across a wide range of WinDAM properties, giving users detailed output on the effects of these variations.
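A centered parameter study of the kind DAKOTA provides can be sketched in a few lines: vary one parameter at a time about its nominal value while holding the others fixed. The `mock_windam` function below is a hypothetical stand-in for an actual WinDAM run, with invented parameter names, used only to make the sweep concrete.

```python
def centered_parameter_study(f, nominal, deltas, steps=2):
    """One-at-a-time sweep: evaluate f at nominal and at +/- steps
    increments of each parameter, holding the others at nominal."""
    results = {}
    for name in nominal:
        sweep = []
        for k in range(-steps, steps + 1):
            point = dict(nominal)
            point[name] = nominal[name] + k * deltas[name]
            sweep.append((point[name], f(point)))
        results[name] = sweep
    return results

# Hypothetical stand-in for a WinDAM simulation: a simple response
# depending on an erodibility coefficient and headwater depth.
def mock_windam(p):
    return p["erodibility"] * p["depth"] ** 1.5

study = centered_parameter_study(
    mock_windam,
    nominal={"erodibility": 2.0, "depth": 4.0},
    deltas={"erodibility": 0.5, "depth": 1.0},
)
# study[name] lists (parameter value, response) pairs per sweep
```

Each sweep's center point repeats the nominal run, so the output directly shows how sensitive the response is to each parameter in isolation.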
84

Bayesian extreme quantile regression for hidden Markov models

Koutsourelis, Antonios January 2012 (has links)
The main contribution of this thesis is the introduction of Bayesian quantile regression for hidden Markov models, especially for extreme quantile regression analysis, since there is limited research on inferring conditional quantiles for hidden Markov models under a Bayesian approach. The first objective is to compare Bayesian extreme quantile regression with classical extreme quantile regression, using simulated data generated by three models that differ only in the distribution of the error term. We also investigate if and how the error term's distribution affects Bayesian extreme quantile regression, in terms of parameter and confidence interval estimation. Bayesian extreme quantile regression is performed by implementing a Metropolis-Hastings algorithm to update the parameters, while classical extreme quantile regression is performed using linear programming. The same analysis and comparison is also performed on a real data set. The results provide strong evidence that our method can be improved by combining MCMC algorithms and linear programming, in order to obtain better parameter and confidence interval estimation. After improving our method for Bayesian extreme quantile regression, we extend it to hidden Markov models. First, we assume a discrete-time, finite state-space hidden Markov model, where the distribution associated with each hidden state is a) a Normal distribution and b) an asymmetric Laplace distribution. Our aim is to explore the number of hidden states that describe the extreme quantiles of our data sets, and to check whether a different distribution associated with each hidden state affects our estimation. Additionally, we explore whether there are structural changes (breakpoints), using break-point hidden Markov models. To perform this analysis we implement two new MCMC algorithms.
The first updates the parameters and the hidden states using a Forward-Backward algorithm and Gibbs sampling (when a Normal distribution is assumed); the second uses a Forward-Backward algorithm and a mixture of Gibbs and Metropolis-Hastings sampling (when an asymmetric Laplace distribution is assumed). Finally, we consider hidden Markov models where the hidden states (latent variables) are continuous. For this discrete-time, continuous state-space hidden Markov model we implement a method that uses linear programming and the Kalman filter (and Kalman smoother). Our methods are used to analyze real interest rates, assuming hidden states that represent different financial regimes. We show that our methods work very well in terms of parameter estimation as well as hidden state and break-point estimation, which is very useful for real-life applications of those methods.
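The Metropolis-Hastings component described above can be sketched for the simplest case: quantile regression under the asymmetric-Laplace working likelihood (whose log-density is the negative check loss), with a flat prior and a random-walk proposal. This is a toy illustration with synthetic data, not the thesis's sampler; the step size, iteration count, and data-generating line are assumptions.

```python
import math, random

def check_loss(u, tau):
    # rho_tau(u) = u * (tau - I(u < 0)); its negative is the
    # asymmetric-Laplace log-likelihood kernel
    return u * (tau - (1.0 if u < 0 else 0.0))

def log_post(beta, data, tau):
    return -sum(check_loss(y - (beta[0] + beta[1] * x), tau)
                for x, y in data)

def mh_quantile_regression(data, tau, iters=5000, step=0.1, seed=1):
    """Random-walk Metropolis-Hastings; returns the final draw as a
    rough point estimate (a real analysis keeps the whole chain)."""
    rng = random.Random(seed)
    beta = [0.0, 0.0]
    lp = log_post(beta, data, tau)
    for _ in range(iters):
        prop = [b + rng.gauss(0.0, step) for b in beta]
        lp_prop = log_post(prop, data, tau)
        if math.log(rng.random()) < lp_prop - lp:
            beta, lp = prop, lp_prop
    return beta

# Toy data: y = 1 + 2x + noise, so the tau = 0.5 fit tracks the median line
rng = random.Random(0)
data = [(x / 10.0, 1.0 + 2.0 * (x / 10.0) + rng.gauss(0.0, 0.2))
        for x in range(50)]
beta = mh_quantile_regression(data, tau=0.5)
```

Pushing tau toward 0 or 1 turns this into the extreme-quantile setting, where the likelihood surface flattens and mixing degrades, which is precisely the difficulty that motivates combining MCMC with linear programming.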
85

Development, validation, and application of new relativistic methods for all-electron unrestricted two-component calculations of EPR parameters

Malkin Ondik, Irina January 2006 (has links) (PDF)
In this work we have developed a back-transformation method within the Douglas-Kroll-Hess (DKH) framework, which simplifies the picture-change-consistent transformation of first-order property operators in the DKH approach and makes the implementation feasible. This has enabled us to implement the first all-electron scalar-relativistic calculations of hyperfine coupling tensors at the DKH2 level. Furthermore, we have presented a general relativistic two-component DFT approach for unrestricted calculations of electronic g-tensors, based on the DKH Hamiltonian. Additionally, we have derived the expressions for evaluating hyperfine structures and for the two-component unrestricted treatment of the g-tensor within the Resolution-of-Identity Dirac-Kohn-Sham method developed by Stanislav Komorovsky and Michal Repisky in collaboration with other members of the group of V. G. Malkin. All of these approaches have been extensively validated.
86

Estimation of polychoric correlation with non-normal latent variables.

January 1987 (has links)
by Ming-long Lam. / Thesis (M.Phil.)--Chinese University of Hong Kong, 1987. / Bibliography: leaves 41-43.
87

Multilevel analysis of structural equation models.

January 1991 (has links)
by Linda Hoi-ying Yau. / Thesis (M.Phil.)--Chinese University of Hong Kong, 1991. / Includes bibliographical references.
Contents:
  Chapter 1 Preliminary
    § 1.1 Introduction (p.1)
    § 1.2 Notations (p.3)
  Chapter 2 Multilevel Analysis of Structural Equation Models with Multivariate Normal Distribution
    § 2.1 The Multilevel Structural Equation Model (p.4)
    § 2.2 First Stage Estimation of Σ_{k_m k_{m-1}…k_{i+1}} for i = 1, ..., m-1 (p.5)
    § 2.3 Second Stage Estimation of Structural Parameters (p.10)
  Chapter 3 Generalization to Arbitrary and Elliptical Distributions
    § 3.1 Asymptotically Distribution-Free Estimation (p.25)
    § 3.2 Elliptical Distribution Estimation (p.30)
  Chapter 4 Artificial Examples
    § 4.1 Examples on Multivariate Normal Distribution Estimation (p.34)
    § 4.2 Examples on Elliptical Distribution Estimation (p.40)
    § 4.3 Findings and Summary (p.42)
  Chapter 5 Conclusion and Discussion (p.44)
  References (p.47)
  Figure 1 (p.49)
  Appendices (p.50)
  Tables (p.59)
88

Extraction of superelastic parameter values from instrumented indentation data

Roberto-Pereira, Francisco Fernando January 2019 (has links)
Interest in superelastic (and shape memory) materials continues to rise, and there is a strong incentive to develop techniques for monitoring their superelastic characteristics. This is conventionally done via uniaxial testing, but there are many advantages to obtaining these characteristics (in the form of parameter values in a constitutive law) via indentation testing. Specimens can then be small, require minimal preparation and be obtainable from components in service. Interrogation of small volumes also allows mapping of properties over a surface. On the other hand, the tested volume must be large enough for its response to be representative of bulk behaviour. Precisely the same arguments apply to more "mainstream" mechanical properties, such as yielding and work hardening characteristics. Indeed, there has been considerable progress in that area recently, using FEM simulation to predict indentation outcomes, evaluating the "goodness of fit" for particular sets of parameter values and converging on a best-fit combination. A similar approach can be used to obtain superelastic parameters, but little work has been done hitherto on sensitivities, uniqueness characteristics or optimal methodologies, and the procedures are complicated by limitations of the constitutive laws in current use. The current work presents a comprehensive examination of the issues involved, using experimental (uniaxial and indentation) data for a NiTi shape memory alloy. It was found that the superelastic parameter values can be obtained using a single indenter shape (spherical). Information is also presented on sensitivities and on the probable reliability of parameters obtained in this way for an unknown material.
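The inverse FEM approach described above (simulate, score the goodness of fit, converge on a best-fit parameter set) can be caricatured with a one-parameter sketch. The `load_curve` function below is a hypothetical stand-in for an FEM indentation simulation, with an invented plateau-onset rule; real studies fit several constitutive parameters at once with a proper optimizer.

```python
def load_curve(h_values, plateau_stress, contact_area=1.0):
    """Toy stand-in for an FEM indentation run: load rises linearly
    with depth until the superelastic plateau caps it."""
    return [min(1000.0 * h, plateau_stress * contact_area) for h in h_values]

def misfit(measured, simulated):
    # Sum-of-squares "goodness of fit" between the two curves
    return sum((m - s) ** 2 for m, s in zip(measured, simulated))

h = [0.001 * k for k in range(1, 101)]
measured = load_curve(h, plateau_stress=40.0)   # pretend experiment

# Grid search over candidate plateau stresses (a real study would
# drive repeated FEM runs from an optimizer instead)
candidates = [20.0 + 1.0 * k for k in range(41)]
best = min(candidates, key=lambda p: misfit(measured, load_curve(h, p)))
# best recovers the plateau stress used to generate the data
```

Uniqueness questions arise exactly here: if two different parameter combinations produce nearly identical load-displacement curves, the misfit landscape has multiple shallow minima and a single indenter shape may not discriminate between them.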
89

Parameter identification in civil structures.

Costa, Adriane 18 May 2006 (has links)
The parameter identification problem consists of determining the parameter values that minimize the difference between measured and calculated values of selected quantities. Naturally, identification targets parameters whose values carry some degree of uncertainty. This work presents the main mathematical concepts and fundamentals of back analysis, develops a parameter identification procedure with a consistent mathematical basis, and applies it to practical engineering problems. The Hudvudsta tunnel and the Machadinho dam are studied, identifying parameters related to loads or to the physical properties of materials, using hierarchical models to represent the structures. The objective functions are defined with the main identification criteria and minimized with Newton-type methods.
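The core loop of such a back analysis can be sketched in miniature: define an objective as the squared mismatch between measured and calculated values, then drive its derivative to zero with a Newton iteration. The linear model and data below are invented for illustration; real problems involve structural models with many parameters and analytic or adjoint derivatives rather than finite differences.

```python
def calculated(p, xs):
    # Hypothetical model response, linear in the unknown parameter p
    return [p * x for x in xs]

def objective(p, xs, measured):
    # Squared mismatch between measured and calculated quantities
    return sum((m - c) ** 2 for m, c in zip(measured, calculated(p, xs)))

def newton_minimize(xs, measured, p0=0.0, iters=20, h=1e-5):
    """Newton iteration on J'(p) = 0, with finite-difference
    first and second derivatives of the objective J."""
    p = p0
    for _ in range(iters):
        j1 = (objective(p + h, xs, measured)
              - objective(p - h, xs, measured)) / (2 * h)
        j2 = (objective(p + h, xs, measured)
              - 2 * objective(p, xs, measured)
              + objective(p - h, xs, measured)) / (h * h)
        if j2 == 0:
            break
        p -= j1 / j2
    return p

xs = [1.0, 2.0, 3.0, 4.0]
measured = [2.1, 3.9, 6.0, 8.1]   # roughly y = 2x with measurement noise
p_hat = newton_minimize(xs, measured)
# p_hat converges to the least-squares value of p
```

Because the objective here is quadratic in p, the Newton step lands on the minimizer almost immediately; for nonlinear structural models the same iteration needs safeguards (line search, regularization) to stay stable.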
90

On the stability of sequential Monte Carlo methods for parameter estimation

Kuhlenschmidt, Bernd January 2015 (has links)
No description available.
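Although no abstract is available for this record, the sequential Monte Carlo methods named in the title can be illustrated with a minimal bootstrap particle filter for a linear-Gaussian state-space model. This is a generic textbook sketch, not this thesis's method; the model coefficients and particle count are assumptions.

```python
import math, random

def particle_filter(observations, n_particles=500, seed=0,
                    phi=0.9, sig_x=1.0, sig_y=0.5):
    """Bootstrap particle filter for x_t = phi*x_{t-1} + N(0, sig_x^2),
    y_t = x_t + N(0, sig_y^2). Returns the filtered mean at each step."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    means = []
    for y in observations:
        # Propagate particles through the state dynamics
        particles = [phi * x + rng.gauss(0.0, sig_x) for x in particles]
        # Weight each particle by the observation likelihood
        w = [math.exp(-0.5 * ((y - x) / sig_y) ** 2) for x in particles]
        total = sum(w)
        w = [wi / total for wi in w]
        means.append(sum(wi * x for wi, x in zip(w, particles)))
        # Multinomial resampling to fight weight degeneracy
        particles = rng.choices(particles, weights=w, k=n_particles)
    return means

# Simulate a short trajectory and run the filter on its observations
rng = random.Random(42)
x, xs, ys = 0.0, [], []
for _ in range(30):
    x = 0.9 * x + rng.gauss(0.0, 1.0)
    xs.append(x)
    ys.append(x + rng.gauss(0.0, 0.5))
means = particle_filter(ys)
```

The stability questions the thesis title points at concern exactly this recursion: whether resampling error and weight degeneracy stay controlled as the time horizon grows, particularly when static parameters are estimated alongside the states.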
