211.
Bayesian Techniques for Adaptive Acoustic Surveillance. Morton, Kenneth D. (January 2010)
Automated acoustic sensing systems are required to detect, classify, and localize acoustic signals in real time. Although humans can perform acoustic sensing tasks with ease in a variety of situations, the performance of current automated acoustic sensing algorithms is limited by seemingly benign changes in environmental or operating conditions. In this work, a framework for acoustic surveillance that is capable of accounting for changing environmental and operational conditions is developed and analyzed. The algorithms employed utilize non-stationary and nonparametric Bayesian inference techniques, allowing the resulting framework to adapt to varying background signals and to characterize new signals of interest when additional information is available. The performance of each of the two stages of the framework is compared to existing techniques, and superior performance of the proposed methodology is demonstrated. The algorithms developed operate on time-domain acoustic signals in a nonparametric manner, enabling them to operate on other types of time-series data without application-specific tuning. This is demonstrated as the developed models are successfully applied, without alteration, to landmine signatures from ground penetrating radar data. The nonparametric statistical models developed in this work for the characterization of acoustic signals may ultimately be useful not only in acoustic surveillance but also in other areas of acoustic sensing.
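To make the adaptive-background idea concrete, here is a minimal sketch assuming scikit-learn and toy Gaussian features in place of real acoustic data: a truncated Dirichlet-process mixture (one common nonparametric Bayesian model, not necessarily the dissertation's) serves as a background model, and frames the background explains poorly are flagged as potential signals of interest.

```python
# A minimal sketch (not the dissertation's model): a Dirichlet-process-style
# mixture as a nonparametric background model for acoustic features. Frames
# with low likelihood under the background model are flagged as potential
# signals of interest. Feature choice and threshold are illustrative.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)

# Stand-in "background" acoustic features (e.g., per-frame band energies).
background = rng.normal(0.0, 1.0, size=(2000, 4))

# Truncated Dirichlet-process mixture: unneeded components are pruned, so the
# effective complexity adapts to the background rather than being fixed.
bg_model = BayesianGaussianMixture(
    n_components=20,
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    random_state=0,
).fit(background)

# New frames: mostly background, plus a shifted cluster standing in for a target.
new_frames = np.vstack([
    rng.normal(0.0, 1.0, size=(200, 4)),
    rng.normal(4.0, 1.0, size=(20, 4)),
])

log_lik = bg_model.score_samples(new_frames)
threshold = np.quantile(bg_model.score_samples(background), 0.01)  # ~1% false-alarm point
detections = log_lik < threshold
print(f"flagged {detections.sum()} of {len(new_frames)} frames")
```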

212.
Performance analysis of SNR estimates for AWGN and time-selective fading channels. Peksen, Huseyin (15 May 2009)
In this work, the Cramer-Rao lower bound (CRLB) of the signal-to-noise ratio (SNR) estimate for binary phase shift keying (BPSK) modulated signals in additive white Gaussian noise (AWGN) channels is first derived, with all steps and results of the derivation shown in detail. Two major estimation scenarios are considered: the non-data-aided (NDA) and data-aided (DA) frameworks. The non-data-aided scenario avoids the periodic transmission of known data symbols (pilots), which would limit the system throughput, while the data-aided scenario assumes the transmission of known data symbols or training sequences to estimate the channel parameters. The Cramer-Rao lower bounds for both scenarios are derived. In addition, the modified Cramer-Rao lower bound (MCRLB) is calculated and compared to the true CRLBs. It is shown that in the low-SNR regime the true CRLB is tighter than the MCRLB in the non-data-aided estimation scenario.

Second, the Bayesian Cramer-Rao lower bound (BCRLB) for the SNR estimate is considered for BPSK modulated signals in the presence of time-selective fading channels. Only the data-aided scenario is considered, and the time-selective fading channel is modeled by a polynomial function. A BCRLB on the variance of the SNR estimate is derived and simulation results are presented.
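As a concrete illustration of how such bounds are checked against estimators, the following sketch simulates data-aided SNR estimation for BPSK in real-valued AWGN and compares the empirical variance with the bound var(rho_hat) >= 2*rho*(2 + rho)/N, a textbook delta-method form assumed here for illustration; the thesis derives the exact expressions in detail.

```python
# Minimal Monte Carlo sketch (not the thesis's derivation): data-aided SNR
# estimation for BPSK in real AWGN, compared against the bound
# var(rho_hat) >= 2*rho*(2 + rho)/N, which follows from the CRLBs for the
# amplitude and noise variance via the delta method. The bound's exact form
# is an assumption of this sketch.
import numpy as np

rng = np.random.default_rng(1)
N, trials = 500, 4000
A, sigma2 = 1.0, 0.5
rho = A**2 / sigma2                              # true SNR

est = np.empty(trials)
for t in range(trials):
    s = rng.choice([-1.0, 1.0], size=N)          # known pilot symbols (DA)
    r = A * s + rng.normal(0.0, np.sqrt(sigma2), size=N)
    A_hat = np.mean(r * s)                       # ML amplitude estimate
    sigma2_hat = np.mean((r - A_hat * s) ** 2)   # ML noise-variance estimate
    est[t] = A_hat**2 / sigma2_hat

bound = 2 * rho * (2 + rho) / N
print(f"empirical var: {est.var():.5f}  CRLB (assumed form): {bound:.5f}")
```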

213.
The Method of Manufactured Universes for Testing Uncertainty Quantification Methods. Stripling, Hayes Franklin (December 2010)
The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework calls for a manufactured reality from which "experimental" data are created (possibly with experimental error), an imperfect model (with uncertain inputs) from which simulation results are created (possibly with numerical error), the application of a system for quantifying uncertainties in model predictions, and an assessment of how accurately those uncertainties are quantified. The application presented for this research manufactures a particle-transport "universe," models it using diffusion theory with uncertain material parameters, and applies both Gaussian process and Bayesian MARS algorithms to make quantitative predictions about new "experiments" within the manufactured reality. To further test the responses of these UQ methods, we conduct exercises with "experimental" replicates, "measurement" error, and choices of physical inputs that reduce the accuracy of the diffusion model's approximation of our manufactured laws.

Our first application of MMU was rich in areas for exploration and highly informative. In the case of the Gaussian process code, we found that the fundamental statistical formulation was not appropriate for our functional data, but that the code allows a knowledgeable user to vary parameters within this formulation to tailor its behavior for a specific problem. The Bayesian MARS formulation was a more natural emulator given our manufactured laws, and we used the MMU framework to further develop a calibration method and to characterize the diffusion model discrepancy. Overall, we conclude that an MMU exercise with a properly designed universe (that is, one that is an adequate representation of some real-world problem) will provide the modeler with an added understanding of the interaction between a given UQ method and his/her more complex problem of interest. The modeler can then apply this added understanding to make more informed predictive statements.
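The workflow is easy to miniaturize. Below is a toy manufactured-universe exercise, assuming scikit-learn and a one-dimensional stand-in for the transport/diffusion pair rather than the dissertation's actual physics: a known law plays the role of reality, a deliberately imperfect model stands in for the simulation, and a Gaussian process emulates the discrepancy so we can check whether its uncertainty bands cover the manufactured truth.

```python
# A minimal "manufactured universe" sketch under illustrative assumptions
# (not the dissertation's transport/diffusion setup).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)

def manufactured_law(x):          # "reality" -- known exactly by construction
    return np.sin(3 * x) + 0.5 * x

def imperfect_model(x):           # stand-in for the approximate physics model
    return 0.4 * x                # misses the oscillatory behavior

# "Experiments": reality observed with measurement error.
x_train = rng.uniform(0, 3, size=25)
y_exp = manufactured_law(x_train) + rng.normal(0, 0.05, size=25)

# Emulate the discrepancy between experiments and the imperfect model.
disc = y_exp - imperfect_model(x_train)
gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=1.0) + WhiteKernel(noise_level=0.05**2),
    normalize_y=True,
).fit(x_train.reshape(-1, 1), disc)

# Predict new "experiments" and assess coverage against the known truth.
x_new = np.linspace(0, 3, 50)
mean, std = gp.predict(x_new.reshape(-1, 1), return_std=True)
pred = imperfect_model(x_new) + mean
covered = np.abs(pred - manufactured_law(x_new)) < 2 * std
print(f"2-sigma coverage of the manufactured truth: {covered.mean():.0%}")
```

Because the truth is manufactured, the coverage number is an exact audit of the UQ method, which is precisely the leverage the framework provides.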

214.
Bayesian Analysis of Transposon Mutagenesis Data. DeJesus, Michael A. (May 2012)
Determining which genes are essential for the growth of a bacterial organism is an important question, as the answer is useful for discovering drugs that inhibit critical biological functions of a pathogen. To evaluate essentiality, biologists often use transposon mutagenesis to disrupt genomic regions within an organism, revealing which genes are able to withstand disruption and are therefore not required for growth. The development of next-generation sequencing technology augments transposon mutagenesis by providing high-resolution sequence data that identify the exact locations of transposon insertions in the genome. Although this high-resolution information has already been used to assess essentiality at a genome-wide scale, no formal statistical model capable of quantifying significance has been developed. This thesis presents a formal Bayesian framework for analyzing sequence information obtained from transposon mutagenesis experiments. Our method assesses the statistical significance of gaps in transposon coverage that are indicative of essential regions through a Gumbel distribution, and utilizes a Metropolis-Hastings sampling procedure to obtain posterior estimates of the probability of essentiality for each gene. We apply our method to libraries of M. tuberculosis transposon mutants to identify genes essential for growth in vitro, and show concordance with previous essentiality results based on hybridization. Furthermore, we show how our method is capable of identifying essential domains within genes, by detecting significant sub-regions of open reading frames unable to withstand disruption. We show that several genes involved in PG biosynthesis have essential domains.
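The gap-significance idea can be sketched compactly. The following toy example assumes independent insertions at each TA site and uses a standard extreme-value approximation for the longest run of unhit sites; the thesis's exact Gumbel formulation and its Metropolis-Hastings step are not reproduced here.

```python
# A minimal sketch of the gap-significance idea under stated assumptions
# (not the thesis's full model): if each TA site is hit independently with
# probability theta, the longest run of unhit sites among n sites follows a
# Gumbel-type extreme-value law, so long gaps can be assigned p-values.
import numpy as np

def gap_p_value(k, n, theta):
    """P(longest run of misses >= k) via the standard Poisson/extreme-value
    approximation P(longest run < k) ~ exp(-n * theta * (1-theta)**k)."""
    return 1.0 - np.exp(-n * theta * (1.0 - theta) ** k)

rng = np.random.default_rng(3)
n_sites, theta = 50_000, 0.35          # illustrative insertion density

# Simulate a null genome and check the approximation against it.
hits = rng.random(n_sites) < theta
runs = np.diff(np.flatnonzero(np.concatenate(([True], hits, [True]))))
longest_null_gap = runs.max() - 1
print("longest null gap:", longest_null_gap,
      "p =", round(gap_p_value(longest_null_gap, n_sites, theta), 3))

# A gap far beyond the null expectation is evidence of an essential region.
print("p-value for a 60-site gap:", gap_p_value(60, n_sites, theta))
```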

215.
Using Bayesian Networks for Discovering Temporal-State Transitions in Hemodialysis. Chiu, Chih-Hung (2 August 2000)
In this thesis, we discover knowledge from workflow logs with temporal-state transitions in the form of Bayesian networks. A Bayesian network is a graphical model that encodes probabilistic relationships among variables of interest and easily incorporates new instances to keep its rules up to date. Bayesian networks can predict, communicate, train, and offer more alternatives for making better decisions. We demonstrate the proposed method by representing the causal relationships between medical treatments and transitions of a patient's physiological states during the hemodialysis process. The discovered clinical pathway patterns of hemodialysis can be used to predict possible paths for an admitted patient and to help medical professionals control the hemodialysis machines during treatment. The results can be extended to reciprocal knowledge management in future research.
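As a minimal illustration of learning temporal-state transitions from workflow logs, the sketch below estimates P(next state | current state, treatment) by Bayesian counting with a Dirichlet (add-one) prior; the states, treatments, and log entries are hypothetical stand-ins, and the thesis's full network structure is not reproduced.

```python
# A minimal sketch under illustrative assumptions (not the thesis's network):
# learning a temporal-state transition table from hemodialysis-style workflow
# logs. State and treatment names are hypothetical.
from collections import Counter

STATES = ["stable", "hypotensive", "cramping"]
TREATMENTS = ["none", "reduce_uf_rate", "saline_bolus"]

# Toy log: (current_state, treatment, next_state) triples.
log = [
    ("stable", "none", "stable"),
    ("stable", "none", "hypotensive"),
    ("hypotensive", "saline_bolus", "stable"),
    ("hypotensive", "reduce_uf_rate", "stable"),
    ("hypotensive", "none", "hypotensive"),
    ("cramping", "saline_bolus", "stable"),
]

counts = Counter(log)

def p_next(state, treatment, nxt, alpha=1.0):
    """Posterior-mean transition probability with a symmetric Dirichlet prior."""
    num = counts[(state, treatment, nxt)] + alpha
    den = sum(counts[(state, treatment, s)] for s in STATES) + alpha * len(STATES)
    return num / den

# Predict the most likely next state for an admitted patient under each treatment.
for t in TREATMENTS:
    probs = {n: round(p_next("hypotensive", t, n), 2) for n in STATES}
    print("hypotensive", t, "->", probs)
```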

216.
An efficient Bayesian formulation for production data integration into reservoir models. Leonardo, Vega Velasquez (17 February 2005)
Current techniques for production data integration into reservoir models can be broadly grouped into two categories: deterministic and Bayesian. The deterministic approach relies on imposing parameter smoothness constraints using spatial derivatives to ensure large-scale changes consistent with the low resolution of the production data. The Bayesian approach is based on prior estimates of model statistics such as parameter covariance and data errors, and attempts to generate posterior models consistent with the static and dynamic data. Both approaches have been successful for field-scale applications, although the computational costs associated with the two methods can vary widely; this is particularly the case for the Bayesian approach, which utilizes a prior covariance matrix that can be large and full. To date, no systematic study has been carried out to examine the scaling properties and relative merits of the methods. The main purpose of this work is twofold. First, we systematically investigate the scaling of the computational costs of the deterministic and Bayesian approaches for realistic field-scale applications. Our results indicate that the deterministic approach exhibits a linear increase in CPU time with model size, compared to a quadratic increase for the Bayesian approach. Second, we propose a fast and robust adaptation of the Bayesian formulation that preserves the statistical foundation of the Bayesian method while having a scaling property similar to that of the deterministic approach. This can lead to orders-of-magnitude savings in computation time for model sizes greater than 100,000 grid blocks. We demonstrate the power and utility of the proposed method using synthetic examples and a field example from the Goldsmith field, a carbonate reservoir in West Texas. Use of the new efficient Bayesian formulation along with the Randomized Maximum Likelihood method allows straightforward assessment of uncertainty: the former provides computational efficiency and the latter avoids rejection of expensive conditioned realizations.
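The Randomized Maximum Likelihood idea mentioned above can be sketched in a few lines for a linear-Gaussian toy problem; the forward model, sizes, and covariances below are illustrative assumptions, not the paper's reservoir setup.

```python
# A minimal linear-Gaussian sketch of Randomized Maximum Likelihood (RML):
# draw a prior realization m*, perturb the data d*, and solve a MAP problem
# for each pair. In the linear-Gaussian case the samples follow the
# posterior exactly; everything here is an illustrative stand-in.
import numpy as np

rng = np.random.default_rng(4)
n_param, n_data = 30, 10

# Prior N(m_prior, C) with smooth (exponential) covariance; data noise sigma.
idx = np.arange(n_param)
C = np.exp(-np.abs(idx[:, None] - idx[None, :]) / 5.0)
m_prior = np.zeros(n_param)
G = rng.normal(size=(n_data, n_param))           # stand-in forward model
sigma = 0.1
m_true = rng.multivariate_normal(m_prior, C)
d_obs = G @ m_true + rng.normal(0, sigma, n_data)

# Gain matrix for the linear-Gaussian MAP solution (shared by all samples).
K = C @ G.T @ np.linalg.inv(G @ C @ G.T + sigma**2 * np.eye(n_data))

def rml_sample():
    m_star = rng.multivariate_normal(m_prior, C)   # prior realization
    d_star = d_obs + rng.normal(0, sigma, n_data)  # perturbed observations
    return m_star + K @ (d_star - G @ m_star)      # MAP for this pair

samples = np.array([rml_sample() for _ in range(500)])
print("posterior mean (first 5):", samples.mean(axis=0)[:5].round(2))
print("posterior std  (first 5):", samples.std(axis=0)[:5].round(2))
```

Because no sample is rejected, each conditioned realization costs one optimization, which is the efficiency property the abstract highlights.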

217.
Nonlinear Bayesian filtering with applications to estimation and navigation. Lee, Deok-Jin (29 August 2005)
In principle, general approaches to optimal nonlinear filtering can be described in a unified way from the recursive Bayesian approach. The central idea of this recursive Bayesian estimation is to determine the probability density function of the state vector of the nonlinear system conditioned on the available measurements. However, the optimal exact solution to this Bayesian filtering problem is intractable since it requires an infinite dimensional process, so approximate solutions are required for practical nonlinear filtering applications. Recently, efficient and accurate approximate nonlinear filters have been proposed as alternatives to the extended Kalman filter for recursive nonlinear estimation of the states and parameters of dynamical systems. First, as sampling-based nonlinear filters, the sigma point filters, the unscented Kalman filter and the divided difference filter are investigated. Second, a direct numerical nonlinear filter is introduced in which the state conditional probability density is calculated by applying fast numerical solvers to the Fokker-Planck equation in continuous-discrete system models. As a simulation-based nonlinear filter, a universally effective algorithm, called the sequential Monte Carlo filter, that recursively utilizes a set of weighted samples to approximate the distributions of the state variables or parameters, is investigated for dealing with nonlinear and non-Gaussian systems. Recent particle filtering algorithms, which have been developed independently in various engineering fields, are investigated in a unified way. Furthermore, a new type of particle filter is proposed by integrating the divided difference filter with a particle filtering framework, leading to the divided difference particle filter. Sub-optimality of the approximate nonlinear filters due to unknown system uncertainties can be compensated by using an adaptive filtering method that estimates both the state and the system error statistics. For accurate identification of the time-varying parameters of dynamic systems, new adaptive nonlinear filters that integrate the presented nonlinear filtering algorithms with noise estimation algorithms are derived.

For qualitative and quantitative performance analysis among the proposed nonlinear filters, systematic methods for measuring the nonlinearity, bias, and optimality of the proposed filters are introduced. The proposed optimal and sub-optimal nonlinear filtering algorithms are investigated with applications to spacecraft orbit estimation and autonomous navigation. Simulation results indicate that the advantages of the proposed nonlinear filters make them attractive alternatives to the extended Kalman filter.
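For readers unfamiliar with sequential Monte Carlo filtering, the bootstrap particle filter below is a minimal illustration on a common scalar benchmark; it is not the dissertation's divided difference particle filter, just the generic propagate-weight-resample loop.

```python
# A minimal bootstrap sequential Monte Carlo (particle) filter for a scalar
# nonlinear/non-Gaussian benchmark model -- illustrative only.
import numpy as np

rng = np.random.default_rng(5)
T, N = 50, 1000                      # time steps, particles
q, r = 1.0, 1.0                      # process / measurement noise std

def f(x, k):                         # nonlinear state transition
    return 0.5 * x + 25 * x / (1 + x**2) + 8 * np.cos(1.2 * k)

def h(x):                            # nonlinear measurement
    return x**2 / 20.0

# Simulate a truth trajectory and measurements.
x, xs, ys = 0.0, [], []
for k in range(T):
    x = f(x, k) + rng.normal(0, q)
    xs.append(x)
    ys.append(h(x) + rng.normal(0, r))

# Bootstrap filter: propagate, weight by likelihood, resample.
particles = rng.normal(0, 2, N)
estimates = []
for k in range(T):
    particles = f(particles, k) + rng.normal(0, q, N)      # proposal = prior
    w = np.exp(-0.5 * ((ys[k] - h(particles)) / r) ** 2)   # likelihood weights
    w /= w.sum()
    estimates.append(np.sum(w * particles))
    particles = particles[rng.choice(N, N, p=w)]           # multinomial resampling

rmse = np.sqrt(np.mean((np.array(estimates) - np.array(xs)) ** 2))
print(f"filter RMSE over {T} steps: {rmse:.2f}")
```

Variants such as the divided difference particle filter replace the prior proposal above with a better-informed one, which is the integration the dissertation proposes.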

218.
Productivity prediction model based on Bayesian analysis and productivity console. Yun, Seok Jun (29 August 2005)
Software project management is one of the most critical activities in modern software development projects. Without realistic and objective management, the software development process cannot be managed effectively. There are three general problems in project management: effort estimation is not accurate, actual status is difficult to understand, and projects are often geographically dispersed. Estimating software development effort is one of the most challenging problems in project management. Various attempts have been made to solve the problem; so far, however, it remains a complex one, and the error rate of a renowned effort estimation model can be higher than 30% of the actual productivity. Inaccurate estimation therefore results in poor planning and prevents effective control of time and budgets. In this research, we have built a productivity prediction model which uses productivity data from an ongoing project to reevaluate the initial productivity estimate and provide managers with a better productivity estimate for project management. The actual status of a software project is not easy to understand due to problems inherent in software project attributes: the attributes are dispersed across various CASE (Computer-Aided Software Engineering) tools and are difficult to measure because they are not hard material like building blocks. In this research, we have also created a productivity console which incorporates an expert system to measure project attributes objectively and provides graphical charts to visualize project status. The productivity console uses project attributes gathered in the KB (Knowledge Base) of PAMPA II (Project Attributes Monitoring and Prediction Associate), which works with CASE tools and collects project attributes from the databases of those tools. The productivity console and PAMPA II work on a network, so geographically dispersed projects can be managed via the Internet without difficulty.
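The reestimation step can be illustrated with a conjugate normal-normal update, assuming for simplicity that monthly productivity observations have a known spread; the numbers and the model are illustrative, not those of PAMPA II.

```python
# A minimal sketch of Bayesian reestimation of productivity (illustrative
# assumptions throughout): treat the initial productivity estimate as a
# normal prior and update it with observations from the ongoing project via
# the conjugate normal-normal rule.
import numpy as np

# Prior from the initial estimate (e.g., LOC per person-month): mean, std.
mu0, tau0 = 300.0, 80.0

# Observed monthly productivity so far; spread assumed known here.
obs = np.array([210.0, 240.0, 225.0, 250.0])
sigma = 40.0

# Conjugate update: precision-weighted combination of prior and data.
n = len(obs)
post_prec = 1 / tau0**2 + n / sigma**2
mu_post = (mu0 / tau0**2 + obs.sum() / sigma**2) / post_prec
tau_post = np.sqrt(1 / post_prec)

print(f"prior:     {mu0:.0f} +/- {tau0:.0f}")
print(f"posterior: {mu_post:.0f} +/- {tau_post:.0f}")
# The posterior mean moves from the optimistic initial estimate toward the
# productivity actually observed, shrinking the planning uncertainty.
```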

219.
Integration and quantification of uncertainty of volumetric and material balance analyses using a Bayesian framework. Ogele, Chile (1 November 2005)
Estimating original hydrocarbons in place (OHIP) in a reservoir is fundamentally important to estimating reserves and potential profitability. Quantifying the uncertainties in OHIP estimates can improve reservoir development and investment decision-making for individual reservoirs and can lead to improved portfolio performance. Two traditional methods for estimating OHIP are volumetric and material balance methods. Probabilistic estimates of OHIP are commonly generated prior to significant production from a reservoir by combining volumetric analysis with Monte Carlo methods. Material balance is routinely used to analyze reservoir performance and estimate OHIP; although material balance has uncertainties due to errors in pressure and other parameters, probabilistic estimates are seldom done.

In this thesis I use a Bayesian formulation to integrate volumetric and material balance analyses and to quantify uncertainty in the combined OHIP estimates. Specifically, I apply Bayes' rule to the Havlena and Odeh material balance equation to estimate original oil in place, N, and relative gas-cap size, m, for a gas-cap drive oil reservoir. The analysis considers uncertainty and correlation in the volumetric estimates of N and m (reflected in the prior probability distribution), as well as uncertainty in the pressure data (reflected in the likelihood distribution). Approximation of the covariance of the posterior distribution allows quantification of uncertainty in the estimates of N and m resulting from the combined volumetric and material balance analyses.

Several example applications illustrate the value of this integrated approach. Material balance data reduce the uncertainty in the volumetric estimate, and the volumetric data reduce the considerable non-uniqueness of the material balance solution, resulting in more accurate OHIP estimates than from either analysis separately. One advantage over reservoir simulation is that, with the smaller number of parameters in this approach, we can easily sample the entire posterior distribution, resulting in more complete quantification of uncertainty. The approach can also detect underestimation of uncertainty in either the volumetric data or the material balance data, indicated by insufficient overlap of the prior and likelihood distributions. When this occurs, the volumetric and material balance analyses should be revisited and the uncertainties of each reevaluated.
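A toy version of this integration, assuming a simplified Havlena-Odeh relation F = N(Eo + m*Eg) with no water influx and wholly illustrative numbers, shows why the two-parameter posterior is cheap to characterize exhaustively.

```python
# A minimal sketch of combining a volumetric prior with material balance
# data under a simplified Havlena-Odeh form F = N*(Eo + m*Eg) for a gas-cap
# drive reservoir (all numbers illustrative). With only two parameters,
# a grid gives the full posterior directly.
import numpy as np

rng = np.random.default_rng(6)

# "True" reservoir and synthetic material-balance data at several pressures.
N_true, m_true = 100.0, 0.5            # N in MMSTB, m dimensionless
Eo = np.linspace(0.02, 0.20, 8)        # oil expansion term (RB/STB)
Eg = np.linspace(0.05, 0.50, 8)        # gas-cap expansion term (RB/STB)
sigma_F = 1.5                          # error in withdrawal term F (MMRB)
F = N_true * (Eo + m_true * Eg) + rng.normal(0, sigma_F, Eo.size)

# Volumetric prior: correlated Gaussian on (N, m).
mu = np.array([90.0, 0.6])
cov = np.array([[20.0**2, -0.4 * 20.0 * 0.15],
                [-0.4 * 20.0 * 0.15, 0.15**2]])
cov_inv = np.linalg.inv(cov)

Ns = np.linspace(40, 160, 200)
ms = np.linspace(0.1, 1.1, 200)
Ng, mg = np.meshgrid(Ns, ms, indexing="ij")

# Log-posterior = log-prior + log-likelihood on the grid.
d = np.stack([Ng - mu[0], mg - mu[1]], axis=-1)
log_prior = -0.5 * np.einsum("...i,ij,...j", d, cov_inv, d)
resid = F - Ng[..., None] * (Eo + mg[..., None] * Eg)
log_like = -0.5 * np.sum((resid / sigma_F) ** 2, axis=-1)
post = np.exp(log_prior + log_like - (log_prior + log_like).max())
post /= post.sum()

print(f"posterior mean N: {np.sum(post * Ng):.1f}  m: {np.sum(post * mg):.2f}")
```

Plotting the prior and likelihood on the same grid would also expose the insufficient-overlap diagnostic the abstract describes.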

220.
Construction Gene Relation Network Using Text Mining and Bayesian Network. Chen, Shu-fen (11 September 2007)
In an organism, genes do not work independently; their interactions determine how functional tasks are carried out. Observing these interactions helps us understand the relations between genes and how diseases are caused. Several methods have been proposed to observe these interactions and construct gene relation networks. Existing algorithms can be classified into two types: one uses the literature to extract relations between genes, while the other constructs the network without describing the relations between genes. In this thesis, we propose a hybrid method based on these two approaches. A Bayesian network is applied to microarray gene expression data to construct the gene network, and text mining is used to extract gene relations from a document database. The proposed algorithm integrates the gene network and the gene relations into a gene relation network. Experimental results show that related genes are connected in the network and that the relations are marked on the links between related genes.
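A compact sketch of the hybrid idea follows, with toy expression data and a toy "literature"; gene names, sentences, and thresholds are hypothetical, and the thesis's Bayesian structure learning is replaced by a simple correlation screen for brevity.

```python
# A minimal sketch of the hybrid idea under illustrative assumptions:
# expression data suggest which genes are linked, and text mining of
# co-mentions attaches a relation label to each link.
import re
from itertools import combinations

import numpy as np

genes = ["geneA", "geneB", "geneC"]

# Toy expression matrix (samples x genes): correlated columns get edges.
expr = np.array([[1.0, 1.1, 0.2],
                 [2.0, 2.1, 0.1],
                 [3.0, 2.9, 0.3],
                 [4.0, 4.2, 0.2]])
corr = np.corrcoef(expr, rowvar=False)

# Toy "literature": sentences mentioning gene pairs with a relation verb.
abstracts = [
    "geneA activates geneB in the stress response pathway.",
    "geneB is repressed by geneC under low oxygen.",
]
VERBS = ["activates", "represses", "is repressed by", "inhibits"]

def find_relation(g1, g2):
    """Return a relation label if some sentence co-mentions both genes."""
    for sentence in abstracts:
        if g1 in sentence and g2 in sentence:
            for v in VERBS:
                if re.search(rf"\b{re.escape(v)}\b", sentence):
                    return v
            return "co-mentioned"
    return None

# Integrate: keep strongly correlated pairs, label them from the text.
for i, j in combinations(range(len(genes)), 2):
    if abs(corr[i, j]) > 0.8:
        label = find_relation(genes[i], genes[j]) or "unlabeled"
        print(f"{genes[i]} -- {genes[j]}  (|r|={abs(corr[i, j]):.2f}, {label})")
```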