191

Evaluation of Uncertainty in Hydrodynamic Modeling

Camacho Rincon, Rene Alexander 17 August 2013 (has links)
Uncertainty analysis in hydrodynamic modeling is useful to identify and report the limitations of a model caused by different sources of error. In practice, the main sources of error are divided into model structure errors, errors in the input data due to measurement imprecision among others, and parametric errors resulting from the difficulty of identifying physically representative parameter values valid at the temporal and spatial scale of the models. This investigation identifies, implements, evaluates, and recommends a set of methods for the evaluation of model structure uncertainty, parametric uncertainty, and input data uncertainty in hydrodynamic modeling studies. A comprehensive review of uncertainty analysis methods is provided, and a set of widely applied methods is selected and implemented in real case studies, identifying the main limitations and benefits of their use in hydrodynamic studies. In particular, the following methods are investigated: the First Order Variance Analysis (FOVA) method, the Monte Carlo Uncertainty Analysis (MCUA) method, the Bayesian Monte Carlo (BMC) method, the Markov Chain Monte Carlo (MCMC) method, and the Generalized Likelihood Uncertainty Estimation (GLUE) method. The results of this investigation indicate that the uncertainty estimates computed with FOVA are consistent with the results obtained by MCUA. In addition, the comparison of BMC, MCMC, and GLUE indicates that BMC and MCMC provide similar estimates of the posterior parameter probability distributions, single-point parameter values, and uncertainty bounds, mainly because they use the same likelihood function and because few parameters are involved in the inference process. However, the implementation of MCMC is substantially more complex than that of BMC, given that its sampling algorithm requires a careful definition of auxiliary proposal probability distributions, along with their variances, to obtain parameter samples that effectively belong to the posterior parameter distribution. The analysis also suggests that the results of GLUE are inconsistent with the results of BMC and MCMC. It is concluded that BMC is a powerful and parsimonious strategy for evaluating all the sources of uncertainty in hydrodynamic modeling. Despite its computational requirements, the method can be easily implemented in most practical applications.
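A minimal sketch of the contrast between FOVA and MCUA on a toy rating-curve-style function; this is not a model from the thesis, and the function, parameter values, and standard deviations below are illustrative assumptions:

```python
import numpy as np

# Hypothetical model: water level as a nonlinear function of two uncertain
# parameters (roughness n and slope s). Purely illustrative, not from the thesis.
def model(n, s, q=50.0):
    return (n * q / np.sqrt(s)) ** 0.6

# Nominal parameter values and standard deviations (assumed)
n0, s0 = 0.035, 0.001
sd_n, sd_s = 0.005, 0.0002

# --- First Order Variance Analysis (FOVA): linearize around nominal values ---
eps = 1e-6
dh_dn = (model(n0 + eps, s0) - model(n0 - eps, s0)) / (2 * eps)
dh_ds = (model(n0, s0 + eps) - model(n0, s0 - eps)) / (2 * eps)
var_fova = (dh_dn * sd_n) ** 2 + (dh_ds * sd_s) ** 2   # independent parameters

# --- Monte Carlo Uncertainty Analysis (MCUA): propagate sampled parameters ---
rng = np.random.default_rng(0)
n_samp = rng.normal(n0, sd_n, 10000)
s_samp = np.clip(rng.normal(s0, sd_s, 10000), 1e-6, None)  # keep slope positive
h_samp = model(n_samp, s_samp)

print("FOVA std :", np.sqrt(var_fova))
print("MCUA std :", h_samp.std())
```

For a model this smooth the two standard deviations come out close, which mirrors the agreement between FOVA and MCUA reported above.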
192

Modeling spatial patterns of mixed-species Appalachian forests with Gibbs point processes

Packard, Kevin Carew 02 April 2009 (has links)
Stochastic point processes and associated methodology provide a means for the statistical analysis and modeling of the spatial point patterns formed by forest tree stem locations. Stochastic Gibbs point processes were explored as models that could simulate short-range clustering arising from reproduction of trees by stump sprouting and intermediate-range inhibition of trees that may result from competition for light and growing space. This study developed and compared three pairwise interaction processes with parametric models for 2nd-order potentials, and three triplets processes with models for 2nd- and 3rd-order potentials, applied to a mixed-species hardwood forest in the Southern Appalachian Mountains of western North Carolina. Although the 2nd-order potentials of both the pairwise interaction and triplets processes were allowed to be purely or partially attractive, the proposed Gibbs point process models were demonstrated to be locally stable. The proposed Gibbs point processes were simulated using Markov Chain Monte Carlo (MCMC) methods; in particular, a reversible-jump Metropolis-Hastings algorithm with birth, death, and shift proposals was utilized. Parameters for the models were estimated by a Bayesian inferential procedure that uses MCMC methods to draw samples from the Gibbs posterior density. Two Metropolis-Hastings algorithms for this sampling were compared: one that estimated ratios of the intractable normalizing constants of the Gibbs likelihood by importance sampling, and another that introduced an auxiliary variable to cancel the normalizing constants with those in the auxiliary variable's proposal distribution. Results from this research indicated that attractive pairwise interaction models easily degenerate into excessively clustered patterns, whereas triplets processes with attractive 2nd-order and repulsive 3rd-order interactions are more robust against excessive clustering. Bayesian inference for the proposed triplets models was found to be very computationally expensive; slow mixing of both algorithms combined with long iteration times limited the practicality of the Bayesian approach. However, the results obtained here indicate that triplets processes can be used to draw inference for and simulate patterns of mixed-species Appalachian hardwood forests. / Ph. D.
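As an illustration of how such processes can be simulated, the following sketch runs a birth-death Metropolis-Hastings sampler for a simple Strauss-type pairwise-interaction process on the unit square. It is not the reversible-jump sampler or the parametric potentials developed in the thesis, and all parameter values are assumed:

```python
import numpy as np

rng = np.random.default_rng(1)

# Strauss-type pairwise interaction process on the unit square (locally stable
# when gamma <= 1). Parameter values are illustrative assumptions.
beta, gamma, r = 100.0, 0.5, 0.05

def n_close(u, pts, r):
    """Number of existing points within distance r of location u."""
    if len(pts) == 0:
        return 0
    d = np.linalg.norm(np.asarray(pts) - u, axis=1)
    return int(np.sum(d < r))

def cond_intensity(u, pts):
    """Papangelou conditional intensity of the Strauss process at u."""
    return beta * gamma ** n_close(u, pts, r)

pts = []
for it in range(20000):               # birth-death Metropolis-Hastings sweeps
    if rng.random() < 0.5:            # birth proposal: add a uniform point
        u = rng.random(2)
        if rng.random() < cond_intensity(u, pts) / (len(pts) + 1):
            pts.append(u)
    elif pts:                         # death proposal: remove a random point
        i = int(rng.integers(len(pts)))
        u = pts[i]
        rest = pts[:i] + pts[i + 1:]
        if rng.random() < len(pts) / cond_intensity(u, rest):
            pts.pop(i)

print("simulated number of points:", len(pts))
```

The same birth-death skeleton extends to attractive or higher-order potentials by changing the conditional intensity, which is where the local-stability concerns discussed above come into play.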
193

Relative Role of Uncertainty for Predictions of Future Southeastern U.S. Pine Carbon Cycling

Jersild, Annika Lee 06 July 2016 (has links)
Predictions of how forest productivity and carbon sequestration will respond to climate change are essential for making forest management decisions and adapting to future climate. However, current predictions can include considerable uncertainty that is not well quantified. To address the need for better quantification of uncertainty, we calculated and compared ecosystem model parameter, ecosystem model process, climate model, and climate scenario uncertainty for predictions of Southeastern U.S. pine forest productivity. We applied data assimilation using Metropolis-Hastings Markov Chain Monte Carlo to fuse diverse datasets with the Physiological Principles Predicting Growth model. The spatially and temporally diverse data sets allowed for novel constraints on ecosystem model parameters and for quantification of the uncertainty associated with parameterization and model structure (process). Overall, we found that parameter and process uncertainty are larger than climate model uncertainty. We determined that climate change will likely increase terrestrial carbon storage and that higher emission scenarios increase the uncertainty in our predictions. In addition, we identified regional variations in biomass accumulation driven by responses to changes in frost days, temperature, and vapor pressure deficit. Since the uncertainty associated with ecosystem model parameters and processes was larger than the uncertainty associated with climate predictions, our results indicate that better constraining parameters in ecosystem models and improving the mathematical structure of ecosystem models can improve future predictions of forest productivity and carbon sequestration. / Master of Science
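A minimal sketch of the kind of Metropolis-Hastings calibration described above, applied to a hypothetical saturating growth model rather than the Physiological Principles Predicting Growth model; the data are synthetic and all values are assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-growth model: biomass as a saturating function of stand age.
# This stands in for an ecosystem model; the real model is far richer.
def growth(age, k, b_max):
    return b_max * (1.0 - np.exp(-k * age))

ages = np.arange(1, 31)
true_k, true_bmax, sigma = 0.08, 120.0, 5.0
obs = growth(ages, true_k, true_bmax) + rng.normal(0, sigma, ages.size)  # synthetic data

def log_post(theta):
    k, b_max = theta
    if k <= 0 or b_max <= 0:                      # flat priors on positive values
        return -np.inf
    resid = obs - growth(ages, k, b_max)
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

# Random-walk Metropolis-Hastings over (k, b_max)
theta = np.array([0.05, 100.0])
lp = log_post(theta)
step = np.array([0.01, 5.0])
samples = []
for it in range(20000):
    prop = theta + rng.normal(0, step)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta.copy())

samples = np.array(samples[5000:])                # discard burn-in
print("posterior means:", samples.mean(axis=0))
print("posterior sds  :", samples.std(axis=0))
```

The spread of the retained samples is the parameter-uncertainty component; propagating those samples through the model under different climate inputs is what allows the comparison of uncertainty sources described above.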
194

Time-Varying Coefficient Models for Recurrent Events

Liu, Yi 14 November 2018 (has links)
I have developed time-varying coefficient models for recurrent event data to evaluate the temporal profiles for recurrence rate and covariate effects. There are three major parts in this dissertation. The first two parts propose a mixed Poisson process model with gamma frailties for single-type recurrent events. The third part proposes a Bayesian joint model based on multivariate log-normal frailties for multi-type recurrent events. In the first part, I propose an approach based on penalized B-splines to obtain smooth estimation for both time-varying coefficients and the log baseline intensity. An EM algorithm is developed for parameter estimation. One issue with this approach is that the estimating procedure is conditional on smoothing parameters, which have to be selected by cross-validation or by optimizing a performance criterion. The procedure can be computationally demanding with a large number of time-varying coefficients. To achieve objective estimation of smoothing parameters, I propose a mixed-model representation approach for penalized splines. Spline coefficients are treated as random effects, and smoothing parameters are estimated as variance components. An EM algorithm embedded with penalized quasi-likelihood approximation is developed to estimate the model parameters. The third part proposes a Bayesian joint model with time-varying coefficients for multi-type recurrent events. Bayesian penalized splines are used to estimate time-varying coefficients and the log baseline intensity. One challenge in Bayesian penalized splines is that the smoothness of a spline fit is considerably sensitive to the subjective choice of hyperparameters. I establish a procedure to objectively determine the hyperparameters through a robust prior specification. A Markov chain Monte Carlo procedure based on Metropolis-adjusted Langevin algorithms is developed to sample from the high-dimensional distribution of spline coefficients. The procedure includes a joint sampling scheme to achieve better convergence and mixing properties. Simulation studies in the second and third parts have confirmed satisfactory model performance in estimating time-varying coefficients under different curvature and event rate conditions. The models in the second and third parts were applied to data from a commercial truck driver naturalistic driving study. The application results reveal that drivers with 7-hours-or-less sleep prior to a shift have a significantly higher intensity after 8 hours of on-duty driving and that their intensity remains higher after taking a break. In addition, the results also show drivers' self-selection on sleep time, total driving hours in a shift, and breaks. These applications provide crucial insight into the impact of sleep time on driving performance for commercial truck drivers and highlight the on-road safety implications of insufficient sleep and breaks while driving. This dissertation provides flexible and robust tools to evaluate the temporal profile of intensity for recurrent events. / PHD / The overall objective of this dissertation is to develop models to evaluate the time-varying profiles for event occurrences and the time-varying effects of risk factors upon event occurrences. There are three major parts in this dissertation. The first two parts are designed for a single event type. They are based on approaches in which the whole model is conditional on a certain kind of tuning parameter. The value of this tuning parameter has to be pre-specified by users and is influential to the model results. Instead of pre-specifying the value, I develop an approach to achieve an objective estimate for the optimal value of the tuning parameter and obtain model results simultaneously. The third part proposes a model for multi-type events. One challenge is that the model results are considerably sensitive to the subjective choice of hyperparameters. I establish a procedure to objectively determine the hyperparameters. Simulation studies have confirmed satisfactory model performance in estimating the temporal profiles for both event occurrences and effects of risk factors. The models were applied to data from a commercial truck driver naturalistic driving study. The results reveal that drivers with 7-hours-or-less sleep prior to a shift have a significantly higher intensity after 8 hours of on-duty driving and that their driving risk remains higher after taking a break. In addition, the results also show drivers' self-selection on sleep time, total driving hours in a shift, and breaks. These applications provide crucial insight into the impact of sleep time on driving performance for commercial truck drivers and highlight the on-road safety implications of insufficient sleep and breaks while driving. This dissertation provides flexible and robust tools to evaluate the temporal profile of both event occurrences and effects of risk factors.
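The Metropolis-adjusted Langevin algorithm mentioned above can be sketched for a generic log-concave target. This toy example samples a correlated Gaussian rather than the dissertation's spline-coefficient posterior; the target, dimension, and step size are assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative target: a correlated Gaussian over a coefficient vector.
# It only demonstrates the MALA mechanics, not the dissertation's joint sampler.
dim = 10
A = rng.normal(size=(dim, dim))
prec = A @ A.T + dim * np.eye(dim)        # assumed precision matrix

def log_target(x):
    return -0.5 * x @ prec @ x

def grad_log_target(x):
    return -prec @ x

def mala(n_iter=5000, eps=0.15):
    x = np.zeros(dim)
    samples, accepted = [], 0
    for _ in range(n_iter):
        # Langevin proposal: drift along the gradient, then add Gaussian noise
        mean_fwd = x + 0.5 * eps ** 2 * grad_log_target(x)
        prop = mean_fwd + eps * rng.normal(size=dim)
        mean_bwd = prop + 0.5 * eps ** 2 * grad_log_target(prop)
        # Metropolis-Hastings correction for the asymmetric proposal
        log_ratio = (log_target(prop) - log_target(x)
                     - np.sum((x - mean_bwd) ** 2) / (2 * eps ** 2)
                     + np.sum((prop - mean_fwd) ** 2) / (2 * eps ** 2))
        if np.log(rng.random()) < log_ratio:
            x, accepted = prop, accepted + 1
        samples.append(x.copy())
    return np.array(samples), accepted / n_iter

samples, acc_rate = mala()
print("acceptance rate:", acc_rate)
```

Using gradient information in the proposal is what makes this family of samplers practical for the high-dimensional spline-coefficient distributions described in the abstract.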
195

Contributions to the theory of unequal probability sampling

Lundquist, Anders January 2009 (has links)
This thesis consists of five papers related to the theory of unequal probability sampling from a finite population. Generally, it is assumed that we wish to make model-assisted inference, i.e. the inclusion probability of each unit in the population is prescribed before the sample is selected. The sample is then selected using some random mechanism, the sampling design. Mostly, the thesis focuses on three particular unequal probability sampling designs: the conditional Poisson (CP) design, the Sampford design, and the Pareto design. They have different advantages and drawbacks: the CP design is a maximum entropy design, but it is difficult to determine sampling parameters that yield prescribed inclusion probabilities; the Sampford design yields prescribed inclusion probabilities but may be hard to sample from; and the Pareto design makes sample selection very easy, but it is very difficult to determine sampling parameters that yield prescribed inclusion probabilities. These three designs are compared probabilistically and found to be close to each other under certain conditions; in particular, the Sampford and Pareto designs are probabilistically close to each other. Some effort is devoted to analytically adjusting the CP and Pareto designs so that they yield inclusion probabilities close to the prescribed ones. The results of the adjustments are in general very good, and some iterative procedures are suggested to improve them even further. Further, balanced unequal probability sampling is considered. In this kind of sampling, samples are given a positive probability of selection only if they satisfy some balancing conditions, which are given by information from auxiliary variables. Most of the attention is devoted to a slightly less general but practically important case. Also in this case the inclusion probabilities are prescribed in advance, making the choice of sampling parameters important. A complication that arises when choosing sampling parameters is that certain probability distributions need to be calculated, and exact calculation turns out to be practically impossible except for very small cases. It is proposed that Markov Chain Monte Carlo (MCMC) methods be used to approximate the relevant probability distributions, and also for sample selection. MCMC methods for sample selection do not occur very frequently in the sampling literature today, making this a fairly novel idea.
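A minimal sketch of Pareto unequal-probability sampling, whose realized inclusion probabilities only approximate the prescribed ones (the motivation for the adjustments studied in the thesis); the population size measures below are made up:

```python
import numpy as np

rng = np.random.default_rng(4)

def pareto_sample(lam, n, rng):
    """Pareto pi-ps sampling: draw n units given target inclusion probs lam."""
    u = rng.random(lam.size)
    # Ranking variables; the units with the n smallest values are selected.
    q = (u / (1.0 - u)) / (lam / (1.0 - lam))
    return np.argsort(q)[:n]

# Size measures for a small artificial population (assumed values)
size = np.array([5.0, 12.0, 3.0, 40.0, 8.0, 15.0, 6.0, 11.0, 20.0, 9.0])
n = 3
lam = n * size / size.sum()            # target inclusion probabilities (all < 1 here)

# Empirical inclusion probabilities over repeated draws are close to, but not
# exactly equal to, the targets -- hence the adjustments discussed above.
counts = np.zeros(size.size)
reps = 20000
for _ in range(reps):
    counts[pareto_sample(lam, n, rng)] += 1
print("target   :", np.round(lam, 3))
print("realized :", np.round(counts / reps, 3))
```

Sample selection itself is trivially fast here; the hard part, as the thesis notes, is choosing the sampling parameters so that the realized inclusion probabilities match the prescribed ones.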
196

A Bayesian approach to initial model inference in cryo-electron microscopy

Joubert, Paul 04 March 2016 (has links)
A major application of single-particle analysis in cryo-electron microscopy is the characterization of the three-dimensional structure of macromolecular complexes. Tens of thousands of images are used, each showing a noisy two-dimensional projection of the particle. In the first step, a low-resolution initial model is reconstructed and the unknown image orientations are estimated. This is a difficult inverse problem with many unknowns, including an unknown orientation for each projection image. A good initial model is crucial for the success of the subsequent refinement step. My dissertation presents two new algorithms for reconstructing an initial model in cryo-electron microscopy, both based on a coarse representation of the electron density. The two main contributions of my work are, first, the model used to represent the electron density and, second, the new reconstruction algorithms. The first main contribution is the use of Gaussian mixture models to represent electron densities in the reconstruction step. I use spherical mixture components with unknown positions, extents, and weights. This representation has many advantages over the grid-based electron densities commonly used by other reconstruction algorithms; for example, it requires far fewer parameters, which leads to faster and more robust algorithms. The second main contribution is the development of Markov chain Monte Carlo methods within a Bayesian framework for estimating the model parameters. The first algorithm can be derived from the Gibbs sampler that fits Gaussian mixture models to point clouds; it is extended here so that it also works with images, projections, and unknown rotations and translations. The second algorithm takes a different approach: its forward model assumes Gaussian errors, and sampling algorithms such as Hamiltonian Monte Carlo (HMC) make it possible to estimate the positions of the mixture components and the image orientations. My dissertation presents extensive numerical experiments with simulated and real data that test the proposed algorithms in practice and compare them with other reconstruction methods.
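A minimal sketch of the coarse density representation: a mixture of spherical 3D Gaussians whose 2D projection can be written in closed form, since the projection of an isotropic Gaussian is again an isotropic Gaussian. The component positions, widths, and rotation below are illustrative assumptions, not a real reconstruction:

```python
import numpy as np

rng = np.random.default_rng(5)

# Coarse density model: K spherical 3D Gaussians with positions, widths, weights.
# Values are illustrative only, not a real macromolecule.
K = 4
means = rng.normal(0, 10.0, size=(K, 3))      # component centres
sigmas = np.full(K, 3.0)                      # isotropic widths
weights = np.full(K, 1.0 / K)

def project(rot, grid):
    """Analytic 2D projection of the spherical-Gaussian mixture.

    Integrating an isotropic 3D Gaussian along the viewing axis gives an
    isotropic 2D Gaussian with the same width, centred at the rotated,
    projected mean -- one reason this representation is cheap to evaluate.
    """
    img = np.zeros(grid.shape[:2])
    for mu, s, w in zip(means, sigmas, weights):
        mu2 = (rot @ mu)[:2]                  # rotate, then drop the z coordinate
        d2 = np.sum((grid - mu2) ** 2, axis=-1)
        img += w / (2 * np.pi * s ** 2) * np.exp(-d2 / (2 * s ** 2))
    return img

# Pixel grid and an arbitrary rotation about the y axis
xs = np.linspace(-20, 20, 64)
grid = np.stack(np.meshgrid(xs, xs), axis=-1)
theta = 0.7
rot = np.array([[np.cos(theta), 0, np.sin(theta)],
                [0, 1, 0],
                [-np.sin(theta), 0, np.cos(theta)]])
print("projection image shape:", project(rot, grid).shape)
```

Because the forward model needs only a few parameters per component, evaluating projections under many candidate orientations stays cheap, which is what the sampling algorithms exploit.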
197

An integrated approach to feature compensation combining particle filters and Hidden Markov Models for robust speech recognition

Mushtaq, Aleem 19 September 2013 (has links)
The performance of automatic speech recognition systems often degrades in adverse conditions where there is a mismatch between training and testing conditions. This is true for most modern systems, which employ Hidden Markov Models (HMMs) to decode speech utterances. One strategy is to map the distorted features back to clean speech features that correspond well to the features used for training the HMMs. This can be achieved by treating the noisy speech as a distorted version of the clean speech of interest. Under this framework, we can track and consequently extract the underlying clean speech from the noisy signal and use this derived signal to perform utterance recognition. The particle filter is a versatile tracking technique that can be used where conventional techniques such as the Kalman filter often fall short. We propose a particle-filter-based algorithm that compensates the corrupted features according to an additive noise model, incorporating both statistics from clean speech HMMs and the observed background noise to map noisy features back to clean speech features. Instead of using specific knowledge at the model and state levels of the HMMs, which is hard to estimate, we pool model states into clusters as side information. Since each cluster encompasses more statistics than the original HMM states, there is a higher possibility that the newly formed probability density function at the cluster level can cover the underlying speech variation and generate appropriate particle filter samples for feature compensation. Additionally, a dynamic joint tracking framework that monitors the clean speech signal and the noise simultaneously is introduced to obtain good noise statistics. In this approach, the information available from clean speech tracking can be effectively used for noise estimation, and the availability of dynamic noise information enhances the robustness of the algorithm in case of large fluctuations in noise parameters within an utterance. Testing the proposed PF-based compensation scheme on the Aurora 2 connected digit recognition task, we achieve a 12.15% error reduction over the best multi-condition trained models using this integrated PF-HMM framework to estimate the cluster-based HMM state sequence information. Finally, we extended the PFC framework and evaluated it on a large-vocabulary recognition task, showing that PFC works well for large-vocabulary systems as well.
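A minimal sketch of a generic bootstrap particle filter tracking a scalar latent state in additive noise; it illustrates only the propagate-weight-resample cycle and is not the proposed cluster-based feature-compensation algorithm. All noise levels are assumed:

```python
import numpy as np

rng = np.random.default_rng(6)

# Generic bootstrap particle filter for a scalar latent state observed in
# additive noise -- a toy stand-in for tracking clean-speech features.
T, n_particles = 100, 500
proc_sd, obs_sd = 0.3, 0.8            # assumed process and observation noise

# Simulate a latent random-walk trajectory and noisy observations
x_true = np.cumsum(rng.normal(0, proc_sd, T))
y = x_true + rng.normal(0, obs_sd, T)

particles = rng.normal(0, 1.0, n_particles)
estimates = np.zeros(T)
for t in range(T):
    # Propagate particles through the (random-walk) state model
    particles = particles + rng.normal(0, proc_sd, n_particles)
    # Weight by the observation likelihood, then normalize
    w = np.exp(-0.5 * (y[t] - particles) ** 2 / obs_sd ** 2)
    w /= w.sum()
    estimates[t] = np.sum(w * particles)
    # Multinomial resampling to avoid weight degeneracy
    particles = particles[rng.choice(n_particles, n_particles, p=w)]

print("RMS tracking error:", np.sqrt(np.mean((estimates - x_true) ** 2)))
```

In the feature-compensation setting described above, the state model and likelihood would be driven by the cluster-level HMM statistics and the noise estimate rather than this toy random walk.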
198

On labour market discrimination against Roma in South East Europe

Milcher, Susanne, Fischer, Manfred M. 10 1900 (has links) (PDF)
This paper directs interest to country-specific labour market discrimination that Roma may suffer in South East Europe. The study lies in the tradition of statistical Blinder-Oaxaca decomposition analysis. We use microdata from UNDP's 2004 survey of Roma minorities and apply a Bayesian approach, proposed by Keith and LeSage (2004), for the decomposition analysis of wage differentials. This approach is based on a robust Bayesian heteroscedastic linear regression model in conjunction with Markov Chain Monte Carlo (MCMC) estimation. The results obtained indicate the presence of labour market discrimination in Albania and Kosovo, but point to its absence in Bulgaria, Croatia, and Serbia. (authors' abstract)
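For reference, a minimal sketch of the classical (non-Bayesian) two-fold Blinder-Oaxaca decomposition on synthetic wage data; the Bayesian heteroscedastic approach of Keith and LeSage (2004) used in the paper is more involved, and all coefficients below are made up:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic log-wage data for two groups (purely illustrative, not the UNDP survey).
def simulate(n, beta, edu_mean):
    edu = rng.normal(edu_mean, 2.0, n)
    X = np.column_stack([np.ones(n), edu])       # intercept + years of education
    logw = X @ beta + rng.normal(0, 0.4, n)
    return X, logw

beta_a = np.array([1.5, 0.10])        # majority group: higher returns to education
beta_b = np.array([1.3, 0.07])        # minority group
Xa, ya = simulate(2000, beta_a, 12.0)
Xb, yb = simulate(2000, beta_b, 9.0)

# Group-specific OLS wage equations
ba = np.linalg.lstsq(Xa, ya, rcond=None)[0]
bb = np.linalg.lstsq(Xb, yb, rcond=None)[0]

# Two-fold decomposition with group A coefficients as the reference structure
gap = ya.mean() - yb.mean()
explained = (Xa.mean(axis=0) - Xb.mean(axis=0)) @ ba      # endowment differences
unexplained = Xb.mean(axis=0) @ (ba - bb)                  # "discrimination" part
print(f"gap {gap:.3f} = explained {explained:.3f} + unexplained {unexplained:.3f}")
```

The "unexplained" term is the component conventionally attributed to discrimination; the Bayesian version additionally delivers a full posterior distribution for it rather than a point estimate.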
199

Data-Adaptive Multivariate Density Estimation Using Regular Pavings, With Applications to Simulation-Intensive Inference

Harlow, Jennifer January 2013 (has links)
A regular paving (RP) is a finite succession of bisections that partitions a multidimensional box into sub-boxes using a binary tree-based data structure, with the restriction that an existing sub-box in the partition may only be bisected on its first widest side. Mapping a real value to each element of the partition gives a real-mapped regular paving (RMRP) that can be used to represent a piecewise-constant density estimate on a multidimensional domain. The RP structure allows real arithmetic to be extended to density estimates represented as RMRPs, and other operations such as computing marginal and conditional functions can also be carried out very efficiently by exploiting these arithmetical properties and the binary tree structure. The purpose of this thesis is to explore the potential for density estimation using RPs. The thesis is structured in three parts. The first part formalises the operational properties of RP-structured density estimates. The next part considers methods for creating a suitable RP partition for an RMRP-structured density estimate. The advantages and disadvantages of a previously developed Markov chain Monte Carlo algorithm are investigated, and the algorithm is extended to include a semi-automatic method for heuristic diagnosis of chain convergence. An alternative method is also proposed that uses an RMRP to approximate a kernel density estimate. RMRP density estimates are not differentiable and have slower convergence rates than good multivariate kernel density estimators; their advantages relate to their operational properties. The final part of this thesis describes a new approach to Bayesian inference for complex models with intractable likelihood functions that exploits these operational properties.
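A minimal sketch of the partitioning idea: a binary tree that repeatedly bisects a box on its first widest side and turns leaf counts into a piecewise-constant density estimate. It omits the RMRP arithmetic and the MCMC partition selection developed in the thesis, and the splitting thresholds are assumptions:

```python
import numpy as np

# Toy regular-paving histogram: recursively bisect a box on its first widest
# side while it holds "too many" points; each leaf carries a constant density.

class Node:
    def __init__(self, lower, upper):
        self.lower, self.upper = lower, upper
        self.left = self.right = None
        self.density = 0.0

def build(node, pts, n_total, max_pts=25, depth=0, max_depth=12):
    vol = np.prod(node.upper - node.lower)
    node.density = len(pts) / (n_total * vol)          # leaf value if we stop here
    if len(pts) <= max_pts or depth >= max_depth:
        return
    d = int(np.argmax(node.upper - node.lower))        # first widest side
    mid = 0.5 * (node.lower[d] + node.upper[d])
    lo_up, hi_lo = node.upper.copy(), node.lower.copy()
    lo_up[d], hi_lo[d] = mid, mid
    node.left, node.right = Node(node.lower, lo_up), Node(hi_lo, node.upper)
    mask = pts[:, d] < mid
    build(node.left, pts[mask], n_total, max_pts, depth + 1, max_depth)
    build(node.right, pts[~mask], n_total, max_pts, depth + 1, max_depth)

def evaluate(node, x):
    while node.left is not None:
        d = int(np.argmax(node.upper - node.lower))    # same deterministic rule
        mid = 0.5 * (node.lower[d] + node.upper[d])
        node = node.left if x[d] < mid else node.right
    return node.density

rng = np.random.default_rng(8)
data = rng.normal(0, 1, size=(5000, 2))
root = Node(np.array([-5.0, -5.0]), np.array([5.0, 5.0]))
build(root, data, len(data))
print("estimated density at origin:", evaluate(root, np.array([0.0, 0.0])))
print("true density at origin     :", 1 / (2 * np.pi))
```

Because every split is determined by the box alone, two pavings can be overlaid node by node, which is what makes arithmetic on RMRP-represented densities efficient.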
200

Application of Bayesian Inference Techniques for Calibrating Eutrophication Models

Zhang, Weitao 26 February 2009 (has links)
This research aims to integrate mathematical water quality models with Bayesian inference techniques for obtaining effective model calibration and rigorous assessment of the uncertainty underlying model predictions. The first part of my work combines a Bayesian calibration framework with a complex biogeochemical model to reproduce oligo-, meso- and eutrophic lake conditions. The model accurately describes the observed patterns and also provides realistic estimates of predictive uncertainty for water quality variables. The Bayesian estimations are also used for appraising the exceedance frequency and confidence of compliance of different water quality criteria. The second part introduces a Bayesian hierarchical framework (BHF) for calibrating eutrophication models at multiple systems (or sites of the same system). The models calibrated under the BHF provided accurate system representations for all the scenarios examined. The BHF allows overcoming problems of insufficient local data by “borrowing strength” from well-studied sites. Both frameworks can facilitate environmental management decisions.
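A minimal sketch of the "borrowing strength" idea behind the hierarchical framework: a Gibbs sampler that partially pools site-level means toward a shared mean, so a data-poor site is shrunk toward the group. The data, variances, and model are toy assumptions, not a eutrophication model:

```python
import numpy as np

rng = np.random.default_rng(9)

# Toy hierarchical setup: one parameter per site, partially pooled toward a
# shared mean. Data sizes are deliberately unbalanced so the data-poor site
# shrinks the most. All values are assumptions.
true_theta = np.array([2.0, 3.0, 4.0, 2.5])
n_obs = np.array([50, 40, 30, 3])            # the last site has little local data
sigma, tau = 1.0, 0.5                        # within-site and between-site sds (fixed)
data = [rng.normal(t, sigma, n) for t, n in zip(true_theta, n_obs)]
ybar = np.array([d.mean() for d in data])

# Gibbs sampler for site effects theta_j and the shared mean mu (flat prior on mu)
J = len(data)
theta, mu = ybar.copy(), ybar.mean()
draws = []
for it in range(5000):
    prec = n_obs / sigma ** 2 + 1.0 / tau ** 2
    mean = (n_obs * ybar / sigma ** 2 + mu / tau ** 2) / prec
    theta = rng.normal(mean, 1.0 / np.sqrt(prec))     # conditional for each site
    mu = rng.normal(theta.mean(), tau / np.sqrt(J))   # conditional for the shared mean
    draws.append(theta.copy())

post = np.array(draws[1000:]).mean(axis=0)
print("site means (raw)    :", np.round(ybar, 2))
print("site means (pooled) :", np.round(post, 2))
```

The data-poor site's posterior mean sits between its own sample mean and the group mean, which is the sense in which poorly observed sites "borrow strength" from well-studied ones.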
