61

Essays on Objective Procedures for Bayesian Hypothesis Testing

Namavari, Hamed 01 October 2019 (has links)
No description available.
62

A comprehensive analysis of extreme rainfall

Kagoda, Paulo Abuneeri 13 August 2008 (has links)
No description available.
63

EXPLORATION OF A BAYESIAN MODEL OF TACTILE SPATIAL PERCEPTION / EXPLORATION OF TACTILE SPATIAL PERCEPTION

Dehnadi, Seyedbehrad January 2022 (has links)
The remarkable ability of the human brain to draw an accurate percept from imprecise sensory information is not well understood. Bayesian inference provides an optimal means for drawing perceptual conclusions from sensorineural activity. This approach has frequently been applied to visual and auditory studies but only rarely to studies of tactile perception. We explored whether a Bayesian observer model could replicate fundamental aspects of human tactile spatial perception. The model consisted of an encoder that simulated sensorineural responses with Poisson statistics followed by a decoder that interpreted the observed firing rates. We compared the performance of our Bayesian observer on a battery of tactile tasks to human participant data collected previously by our laboratory and others. The Bayesian observer replicated human performance trends on three spatial acuity tasks: classic two-point discrimination (C2PD), sequential two-point discrimination (S2PD), and two-point orientation discrimination (2POD). We confirmed the widely reported observation that C2PD is the least reliable method of assessing tactile acuity due presumably to the presence of non-spatial cues. Additionally, the Bayesian observer performed similarly to humans on raised letter and Braille character-recognition tasks. The Bayesian observer further replicated two illusions previously reported in humans: an adaptation-induced repulsion illusion and an orientation anisotropy illusion. Taken together, these results suggest that human tactile spatial perception may arise from a Bayesian-like decoder that is unaware of the precise characteristics of its inputs. / Thesis / Master of Science (MSc)
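The encoder-decoder scheme described in this abstract can be sketched in a few lines: a bank of model afferents with Gaussian tuning curves fires with Poisson statistics, and an ideal observer compares the likelihoods of "one point" and "two points". Everything below (receptor layout, tuning width, firing rates) is an illustrative assumption, not the thesis's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1D patch of skin with evenly spaced model afferents, each
# with a Gaussian tuning curve. All names and parameters are illustrative.
centers = np.linspace(-10.0, 10.0, 41)   # receptor centers (mm)
sigma, peak_rate, base_rate = 2.0, 30.0, 0.5

def rates(points):
    """Mean firing rates for a set of contact points. Per-point drives are
    averaged (not summed) so overall intensity is matched across hypotheses
    and only the spatial pattern differs, removing the non-spatial cue."""
    drive = np.mean([np.exp(-(centers - p) ** 2 / (2 * sigma ** 2))
                     for p in points], axis=0)
    return base_rate + peak_rate * drive

def log_posterior_ratio(spikes, separation):
    """Two points at +/- separation/2 versus one point at 0 (equal priors)."""
    lam1 = rates([0.0])
    lam2 = rates([-separation / 2, separation / 2])
    # Poisson log-likelihoods, dropping the log-factorial term shared by both
    ll1 = np.sum(spikes * np.log(lam1) - lam1)
    ll2 = np.sum(spikes * np.log(lam2) - lam2)
    return ll2 - ll1

def percent_correct(separation, trials=200):
    correct = 0
    for _ in range(trials):
        two = rng.random() < 0.5
        lam = rates([-separation / 2, separation / 2]) if two else rates([0.0])
        spikes = rng.poisson(lam)
        correct += ((log_posterior_ratio(spikes, separation) > 0) == two)
    return correct / trials

for sep in (1.0, 4.0, 8.0):   # accuracy rises with point separation
    print(sep, percent_correct(sep))
```

As in the human data, the simulated observer is near chance for small separations and near-perfect for large ones.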
64

Essays on DSGE Models and Bayesian Estimation

Kim, Jae-yoon 11 June 2018 (has links)
This thesis explores the theory and practice of sovereignty. I begin with a conceptual analysis of sovereignty, examining its theological roots in contrast with its later influence in contestations over political authority. Theological debates surrounding God’s sovereignty dealt not with the question of legitimacy, which would become important for political sovereignty, but instead with the limits of his ability. Read as an ontological capacity, sovereignty is coterminous with an existent’s activity in the world. As lived, this capacity is regularly limited by the ways in which space is produced via its representations, its symbols, and its practices. All collective appropriations of space have a nomos that characterizes their practice. Foucault’s account of “biopolitics” provides an account of how contemporary materiality is distributed, an account that can be supplemented by sociological typologies of how city space is typically produced. The collective biopolitical distribution of space expands the range of practices that representationally legibilize activity in the world, thereby expanding the conceptual limits of existents and what it means for them to act up to the borders of their capacity, i.e., to practice sovereignty. The desire for total authorial capacity expresses itself in relations of domination and subordination that never erase the fundamental precarity of subjects, even as these expressions seek to disguise it. I conclude with a close reading of narratives recounting the lives of residents in Chicago’s Englewood, reading their activity as practices of sovereignty which manifest variously as they master and produce space. / Ph. D.
65

Machine Learning and Field Inversion approaches to Data-Driven Turbulence Modeling

Michelen Strofer, Carlos Alejandro 27 April 2021 (has links)
There still is a practical need for improved closure models for the Reynolds-averaged Navier-Stokes (RANS) equations. This dissertation explores two different approaches for using experimental data to provide improved closure for the Reynolds stress tensor field. The first approach uses machine learning to learn a general closure model from data. A novel framework is developed to train deep neural networks using experimental velocity and pressure measurements. The sensitivity of the RANS equations to the Reynolds stress, required for gradient-based training, is obtained by means of both variational and ensemble methods. The second approach is to infer the Reynolds stress field for a flow of interest from limited velocity or pressure measurements of the same flow. Here, this field inversion is done using a Monte Carlo Bayesian procedure and the focus is on improving the inference by enforcing known physical constraints on the inferred Reynolds stress field. To this end, a method for enforcing boundary conditions on the inferred field is presented. The two data-driven approaches explored and improved upon here demonstrate the potential for improved practical RANS predictions. / Doctor of Philosophy / The Reynolds-averaged Navier-Stokes (RANS) equations are widely used to simulate fluid flows in engineering applications despite their known inaccuracy in many flows of practical interest. The uncertainty in the RANS equations is known to stem from the Reynolds stress tensor for which no universally applicable turbulence model exists. The computational cost of more accurate methods for fluid flow simulation, however, means RANS simulations will likely continue to be a major tool in engineering applications and there is still a need for improved RANS turbulence modeling. This dissertation explores two different approaches to use available experimental data to improve RANS predictions by improving the uncertain Reynolds stress tensor field. 
The first approach is using machine learning to learn a data-driven turbulence model from a set of training data. This model can then be applied to predict new flows in place of traditional turbulence models. To this end, this dissertation presents a novel framework for training deep neural networks using experimental measurements of velocity and pressure. When using velocity and pressure data, gradient-based training of the neural network requires the sensitivity of the RANS equations to the learned Reynolds stress. Two different methods, the continuous adjoint and ensemble approximation, are used to obtain the required sensitivity. The second approach explored in this dissertation is field inversion, whereby available data for a flow of interest is used to infer a Reynolds stress field that leads to improved RANS solutions for that same flow. Here, the field inversion is done via the ensemble Kalman inversion (EKI), a Monte Carlo Bayesian procedure, and the focus is on improving the inference by enforcing known physical constraints on the inferred Reynolds stress field. To this end, a method for enforcing boundary conditions on the inferred field is presented. While further development is needed, the two data-driven approaches explored and improved upon here demonstrate the potential for improved practical RANS predictions.
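The ensemble Kalman inversion named in this abstract can be illustrated with a toy linear problem: an unknown field is inferred from sparse noisy observations purely from ensemble statistics, with no adjoint of the forward map. The linear map G below is a stand-in for an actual RANS solver; dimensions, noise level, and ensemble size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy field-inversion sketch: infer an unknown field x from sparse noisy
# observations y = G(x) + noise with ensemble Kalman inversion (EKI).
n, m, J = 20, 5, 200                     # field dim, obs dim, ensemble size
A = rng.standard_normal((m, n))
def G(x): return A @ x                   # linear stand-in for a RANS solver

x_true = np.sin(np.linspace(0.0, np.pi, n))
noise_std = 0.05
y = G(x_true) + noise_std * rng.standard_normal(m)

def eki_update(X, y):
    """One EKI step: nudge each ensemble member toward the data using
    sample cross-covariances (derivative-free, no adjoint needed)."""
    Gx = np.column_stack([G(X[:, j]) for j in range(X.shape[1])])
    xm = X.mean(axis=1, keepdims=True)
    gm = Gx.mean(axis=1, keepdims=True)
    Cxg = (X - xm) @ (Gx - gm).T / (X.shape[1] - 1)
    Cgg = (Gx - gm) @ (Gx - gm).T / (X.shape[1] - 1) + noise_std ** 2 * np.eye(len(y))
    K = Cxg @ np.linalg.inv(Cgg)         # Kalman gain
    return X + K @ (y[:, None] - Gx)

X_prior = rng.standard_normal((n, J))    # prior ensemble
X = X_prior
for _ in range(10):
    X = eki_update(X, y)

print(np.linalg.norm(X_prior.mean(axis=1) - x_true),
      np.linalg.norm(X.mean(axis=1) - x_true))
```

The posterior ensemble mean fits the data and moves toward the true field in the observed directions; enforcing physical constraints, as the dissertation does, would further restrict the inferred field.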
66

Bayesian Methods for Mineral Processing Operations

Koermer, Scott Carl 07 June 2022 (has links)
Increases in demand have driven the development of complex processing technology for separating mineral resources from exceedingly low-grade, multi-component resources. Low mineral concentrations and variable feedstocks can make separating signal from noise difficult, while high process complexity and the multi-component nature of a feedstock can make testwork, optimization, and process simulation difficult or infeasible. A prime example of such a scenario is the recovery and separation of rare earth elements (REEs) and other critical minerals from acid mine drainage (AMD) using a solvent extraction (SX) process. In this process the REE concentration found in an AMD source can vary site to site and season to season. SX processes take a non-trivial amount of time to reach steady state. The separation of numerous individual elements from gangue metals is a high-dimensional problem, and SX simulators can have a prohibitive computation time. Bayesian statistical methods intrinsically quantify the uncertainty of model parameters and predictions, given a set of data and prior distributions on the model parameters. The uncertainty quantification possible with Bayesian methods lends itself well to statistical simulation, model selection, and sensitivity analysis. Moreover, Bayesian models utilizing Gaussian process priors can be used for active learning tasks which allow for prediction, optimization, and simulator calibration while reducing data requirements. However, literature on Bayesian methods applied to separations engineering is sparse. The goal of this dissertation is to investigate, illustrate, and test the use of a handful of Bayesian methods applied to process engineering problems. First, further details of the background and motivation are provided in the introduction. The literature review provides further information regarding critical minerals, solvent extraction, Bayesian inference, data reconciliation for separations, and Gaussian process modeling.
The body of work contains four chapters comprising a mixture of novel applications of Bayesian methods and a novel statistical method derived for use with the motivating problem. Chapter topics include Bayesian data reconciliation for processes, Bayesian inference for a model intended to aid engineers in deciding whether a process has reached steady state, Bayesian optimization of a process with unknown dynamics, and a novel active-learning criterion for reducing the computation time required for the Bayesian calibration of simulations to real data. In closing, the utility of a handful of Bayesian methods is demonstrated. However, the work presented is not intended to be complete, and suggestions for further improvements to the application of Bayesian methods to separations are provided. / Doctor of Philosophy / Rare earth elements (REEs) are a set of elements used in the manufacture of supplies used in green technologies and defense. Demand for REEs has prompted the development of technology for recovering REEs from unconventional resources. One unconventional resource for REEs under investigation is acid mine drainage (AMD) produced from the exposure of certain geologic strata as part of coal mining. REE concentrations found in AMD are significant, although low compared to REE ore, and can vary from site to site and season to season. Solvent extraction (SX) processes are commonly utilized to concentrate and separate REEs from contaminants using the differing solubilities of specific elements in water- and oil-based liquid solutions. The complexity and variability of the processes used to concentrate REEs from AMD with SX motivate the use of modern statistical and machine learning approaches for filtering noise, quantifying uncertainty, and designing experiments for testwork, in order to recover the underlying truth and make accurate comparisons of process performance. Bayesian statistical methods intrinsically quantify uncertainty.
Bayesian methods can be used to quantify uncertainty for predictions as well as to select which model better explains a data set. The uncertainty quantification available with Bayesian models can be used for decision making. As a particular example, the uncertainty quantification provided by Gaussian process regression lends itself well to choosing which experiments to conduct, given an already obtained data set, to improve prediction accuracy or to find an optimum. However, the literature is sparse for Bayesian statistical methods applied to separation processes. The goal of this dissertation is to investigate, illustrate, and test the use of a handful of Bayesian methods applied to process engineering problems. First, further details of the background and motivation are provided in the introduction. The literature review provides further information regarding critical minerals, solvent extraction, Bayesian inference, data reconciliation for separations, and Gaussian process modeling. The body of work contains four chapters comprising a mixture of novel applications of Bayesian methods and a novel statistical method derived for use with the motivating problem. Chapter topics include Bayesian data reconciliation for processes, Bayesian inference for a model intended to aid engineers in deciding whether a process has reached steady state, Bayesian optimization of a process with unknown dynamics, and a novel active-learning criterion for reducing the computation time required for the Bayesian calibration of simulations to real data. In closing, the utility of a handful of Bayesian methods is demonstrated. However, the work presented is not intended to be complete, and suggestions for further improvements to the application of Bayesian methods to separations are provided.
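The active-learning idea this abstract describes — using a Gaussian process surrogate's uncertainty to choose the next experiment — can be sketched with an expected-improvement loop. The target function f, kernel length-scale, and candidate grid below are illustrative stand-ins for a real process response, not the dissertation's models.

```python
import numpy as np
from math import erf, sqrt, pi

# Minimal Bayesian-optimization sketch: a Gaussian-process surrogate plus
# an expected-improvement (EI) criterion picks the next experiment.
def f(x):  # hypothetical process response to maximize (illustrative)
    return -(x - 0.3) ** 2 + 0.05 * np.sin(15.0 * x)

def kern(a, b, ls=0.15):
    """Squared-exponential kernel with k(x, x) = 1."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def gp_posterior(Xo, yo, Xs, noise=1e-6):
    K = kern(Xo, Xo) + noise * np.eye(len(Xo))
    Ks = kern(Xo, Xs)
    mu = Ks.T @ np.linalg.solve(K, yo)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)
    return mu, np.maximum(var, 1e-12)

norm_cdf = np.vectorize(lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0))))

Xo = np.array([0.05, 0.5, 0.95]); yo = f(Xo)   # initial testwork
Xs = np.linspace(0.0, 1.0, 201)                # candidate experiments
for _ in range(8):                             # active-learning loop
    mu, var = gp_posterior(Xo, yo, Xs)
    s = np.sqrt(var)
    z = (mu - yo.max()) / s
    ei = (mu - yo.max()) * norm_cdf(z) + s * np.exp(-z * z / 2) / sqrt(2 * pi)
    x_next = Xs[np.argmax(ei)]                 # most promising next run
    Xo, yo = np.append(Xo, x_next), np.append(yo, f(x_next))

print(Xo[np.argmax(yo)], yo.max())
```

EI balances exploiting the current best against exploring uncertain regions, which is why GP surrogates reduce the number of experiments or simulator runs required.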
67

A distribuição normal-valor extremo generalizado para a modelagem de dados limitados no intervalo unitário (0,1) / The normal-generalized extreme value distribution for modeling data restricted to the unit interval (0,1)

Benites, Yury Rojas 28 June 2019 (has links)
Neste trabalho é introduzido um novo modelo estatístico para modelar dados limitados no intervalo contínuo (0,1). O modelo proposto é construído sob uma transformação de variáveis, onde a variável transformada é resultado da combinação de uma variável com distribuição normal padrão e a função de distribuição acumulada da distribuição valor extremo generalizado. Para o novo modelo são estudadas suas propriedades estruturais. A nova família é estendida para modelos de regressão, onde o modelo é reparametrizado na mediana da variável resposta e esta, conjuntamente com o parâmetro de dispersão, é relacionada com covariáveis através de uma função de ligação. Procedimentos inferenciais são desenvolvidos sob as perspectivas clássica e bayesiana. A inferência clássica baseia-se na teoria de máxima verossimilhança e a inferência bayesiana no método de Monte Carlo via cadeias de Markov. Além disso, estudos de simulação foram realizados para avaliar o desempenho das estimativas clássicas e bayesianas dos parâmetros do modelo. Finalmente, um conjunto de dados de câncer colorretal é considerado para mostrar a aplicabilidade do modelo. / In this research a new statistical model is introduced to model data restricted to the continuous interval (0,1). The proposed model is constructed through a transformation of variables, in which the transformed variable is the result of combining a variable with a standard normal distribution and the cumulative distribution function of the generalized extreme value distribution. The structural properties of the new model are studied. The new family is extended to regression models, in which the model is reparametrized in terms of the median of the response variable, which, together with the dispersion parameter, is related to covariates through a link function. Inferential procedures are developed from both classical and Bayesian perspectives.
The classical inference is based on the theory of maximum likelihood, and the Bayesian inference is based on the Markov chain Monte Carlo method. In addition, simulation studies were performed to evaluate the performance of the classical and Bayesian estimates of the model parameters. Finally, a set of colorectal cancer data is considered to show the applicability of the model.
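One plausible reading of the construction described above — this is a sketch under stated assumptions, not the paper's actual definition or code — is to pass a standard normal variable through the generalized extreme value (GEV) CDF, which confines the result to the unit interval. Here mu, sigma, and xi are the usual GEV location, scale, and shape parameters, with illustrative values.

```python
import numpy as np

rng = np.random.default_rng(3)

# Sketch: Z ~ N(0,1) pushed through the GEV CDF lands in the unit interval.
def gev_cdf(z, mu=0.0, sigma=1.0, xi=0.1):
    """GEV CDF for shape xi > 0: F(z) = exp(-t^(-1/xi)), t = 1 + xi*(z-mu)/sigma."""
    t = 1.0 + xi * (z - mu) / sigma
    t = np.maximum(t, 1e-12)   # below the support boundary the CDF is 0
    return np.exp(-t ** (-1.0 / xi))

z = rng.standard_normal(10_000)
y = gev_cdf(z)                 # samples confined to [0, 1)
print(y.min(), y.max(), round(float(y.mean()), 3))
```

In the regression extension described in the record, the median of such a variable (and a dispersion parameter) would then be linked to covariates.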
68

Bayesian inference and wavelet methods in image processing

Silwal, Sharad Deep January 1900 (has links)
Master of Science / Department of Statistics / Diego M. Maldonado / Haiyan Wang / This report addresses some mathematical and statistical techniques of image processing and their computational implementation. Fundamental theories have been presented, applied and illustrated with examples. To make the report as self-contained as possible, key terminologies have been defined and some classical results and theorems are stated, for the most part, without proof. Some algorithms and techniques of image processing have been described and substantiated with experimentation using MATLAB. Several ways of estimating original images from noisy image data, and their corresponding risks, are discussed. Two image processing concepts selected to illustrate computational implementation are: "Bayes classification" and "Wavelet denoising". The discussion of the latter involves introducing a specialized area of mathematics, namely, wavelets. A self-contained theory for wavelets is built by first reviewing basic concepts of Fourier Analysis and then introducing Multi-resolution Analysis and wavelets. For a better understanding of Fourier Analysis techniques in image processing, original solutions to some problems in Fourier Analysis have been worked out. Finally, implementation of the above-mentioned concepts is illustrated with examples and MATLAB codes.
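The wavelet-denoising idea in this report can be shown with a toy one-level Haar transform (written here in Python as a stand-in for the report's MATLAB experiments): transform, soft-threshold the detail coefficients with the universal threshold, invert. Signal, noise level, and threshold rule are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

def haar_forward(x):
    """One-level orthonormal Haar transform of an even-length signal."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail coefficients
    return a, d

def haar_inverse(a, d):
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def soft(c, t):
    """Soft thresholding: shrink coefficients toward zero by t."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

n = 1024
grid = np.linspace(0.0, 1.0, n)
clean = np.sin(2 * np.pi * 4 * grid)         # smooth test signal
noisy = clean + 0.3 * rng.standard_normal(n)

a, d = haar_forward(noisy)
sigma_hat = np.median(np.abs(d)) / 0.6745    # robust noise-level estimate
thresh = sigma_hat * np.sqrt(2 * np.log(n))  # universal threshold
denoised = haar_inverse(a, soft(d, thresh))

print(np.mean((noisy - clean) ** 2), np.mean((denoised - clean) ** 2))
```

For a smooth signal the detail band is almost pure noise, so thresholding it cuts the mean squared error roughly in half; multi-level transforms, as in the report, do better still.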
69

New regression methods for measures of central tendency

Aristodemou, Katerina January 2014 (has links)
Measures of central tendency have been widely used for summarising statistical data, with the mean being the most popular summary statistic. However, in real-life applications it is not always the most representative measure of central location, especially when dealing with data which is skewed or contains outliers. Alternative statistics with less bias are the median and the mode. Median and quantile regression have been used in different fields to examine the effect of factors at different points of the distribution. Mode estimation, on the other hand, has found many applications in cases where the analysis focuses on obtaining information about the most typical value or pattern. This thesis demonstrates that the mode also plays an important role in the analysis of big data, which is becoming increasingly important in many sectors of the global economy. However, mode regression has not been widely applied, even though there is a clear conceptual benefit, due to the computational and theoretical limitations of the existing estimators. Similarly, despite the popularity of the binary quantile regression model, computationally straightforward estimation techniques do not exist. Driven by the demand for simple, well-founded and easy-to-implement inference tools, this thesis develops a series of new regression methods for mode and binary quantile regression. Chapter 2 deals with mode regression methods from the Bayesian perspective and presents one parametric and two non-parametric methods of inference. Chapter 3 demonstrates a mode-based, fast pattern-identification method for big data and proposes the first fully parametric mode regression method, which effectively uncovers the dependency of typical patterns on a number of covariates. The proposed approach is demonstrated through the analysis of a decade-long dataset on the Body Mass Index and associated factors, taken from the Health Survey for England.
Finally, Chapter 4 presents an alternative binary quantile regression approach, based on the nonlinear least asymmetric weighted squares, which can be implemented using standard statistical packages and guarantees a unique solution.
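The contrast this thesis draws between the mean, median, and mode on skewed data, and the kernel-density estimate that underlies mode estimation, can be illustrated with a short sketch (the mixture, bandwidth, and grid are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(5)

# Skewed sample: a large bulk near zero plus a small high cluster that
# drags the mean (and, less so, the median) away from the typical value.
x = np.concatenate([rng.normal(0.0, 1.0, 800),     # bulk of the data
                    rng.normal(6.0, 0.5, 200)])    # skew-inducing cluster

def kde_mode(x, bandwidth=0.3, grid_size=512):
    """Mode estimate: argmax of a Gaussian kernel density on a grid."""
    grid = np.linspace(x.min(), x.max(), grid_size)
    dens = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / bandwidth) ** 2).sum(axis=1)
    return grid[np.argmax(dens)]

print(round(float(np.mean(x)), 2),
      round(float(np.median(x)), 2),
      round(float(kde_mode(x)), 2))
```

The mode stays with the bulk of the data while the mean is pulled toward the outlying cluster, which is the motivation for mode regression when the "most typical" pattern is of interest.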
70

Branching Gaussian Process Models for Computer Vision

Simek, Kyle January 2016 (has links)
Bayesian methods provide a principled approach to some of the hardest problems in computer vision: low signal-to-noise ratios, ill-posed problems, and problems with missing data. This dissertation applies Bayesian modeling to infer multidimensional continuous manifolds (e.g., curves, surfaces) from image data using Gaussian process priors. Gaussian processes are ideal priors in this setting, providing a stochastic model over continuous functions while permitting efficient inference. We begin by introducing a formal mathematical representation of branched curvilinear structures called a curve tree, and we define a novel family of Gaussian processes over curve trees called branching Gaussian processes. We define two types of branching Gaussian processes and show how to extend them to branching surfaces and hypersurfaces. We then apply Gaussian processes in three computer vision applications. First, we perform 3D reconstruction of moving plants from 2D images. Using a branching Gaussian process prior, we recover high-quality 3D trees while being robust to plant motion and camera calibration error. Second, we perform multi-part segmentation of plant leaves from highly occluded silhouettes using a novel Gaussian process model for stochastic shape. Our method obtains good segmentations despite highly ambiguous shape evidence and minimal training data. Finally, we estimate 2D trees from microscope images of neurons with highly ambiguous branching structure. We first fit a tree to a blurred version of the image where structure is less ambiguous. Then we iteratively deform and expand the tree to fit finer images, using a branching Gaussian process regularizing prior for deformation. Our method infers natural tree topologies despite ambiguous branching and image data containing loops. Our work shows that Gaussian processes can be a powerful building block for modeling complex structure, and they perform well in computer vision problems having significant noise and ambiguity.
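The "stochastic model over continuous functions" that makes Gaussian processes attractive here can be shown by drawing sample curves from a GP prior. This sketch covers only the plain (non-branching) building block; the length-scale, grid, and kernel choice are illustrative, and the branching construction in the dissertation is considerably more involved.

```python
import numpy as np

rng = np.random.default_rng(6)

# Draw three sample curves from a zero-mean GP prior with a
# squared-exponential kernel (unit variance, length-scale 0.1).
t = np.linspace(0.0, 1.0, 100)
K = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / 0.1 ** 2)
L = np.linalg.cholesky(K + 1e-6 * np.eye(len(t)))   # jitter for stability
samples = L @ rng.standard_normal((len(t), 3))      # each column is a curve
print(samples.shape)
```

Each column is a smooth random function; conditioning such a prior on image evidence is what turns it into a reconstruction or regularization tool.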
