
Strongly coupled Bayesian models for interacting object and scene classification processes

Ehtiati, Tina. January 2007 (has links)
In this thesis, we present a strongly coupled data fusion architecture within a Bayesian framework for modeling the bi-directional influences between the scene and object classification mechanisms. A number of psychophysical studies provide experimental evidence that the object and scene perception mechanisms are not functionally separate in the human visual system. Object recognition facilitates recognition of the scene background, and knowledge of the scene context in turn facilitates recognition of the individual objects in the scene. This evidence of a bi-directional exchange between the two processes has motivated us to build a computational model in which object and scene classification proceed in an interdependent manner, with no hierarchical relationship imposed between the two processes. We propose a strongly coupled data fusion model for implementing the feedback relationship between the scene and object classification processes, and present novel schemes for modifying the Bayesian solutions to the scene and object classification tasks that allow data fusion between the two modules based on constraining either the priors or the likelihoods. We have implemented and tested the two proposed models using a database of natural images created for this purpose. Receiver Operating Characteristic (ROC) curves depicting the scene classification performance of the likelihood coupling and prior coupling models show that scene classification performance improves significantly in both models as a result of the strong coupling of the scene and object modules. The ROC curves also show that the likelihood coupling model achieves a higher detection rate than the prior coupling model. We have also computed the average rise times of the models' outputs as a measure of the speed of the two models; the results show that the likelihood coupling model's outputs have a shorter rise time.
Based on these experimental findings, one can conclude that imposing constraints on the likelihood models provides better solutions to the scene classification problem than imposing constraints on the prior models. We have also proposed an attentional feature modulation scheme, which consists of tuning the input image responses to a bank of Gabor filters based on the scene class probabilities estimated by the model and the energy profiles of the Gabor filters for different scene categories. Experimental results combining the attentional feature tuning scheme with the likelihood coupling and prior coupling methods show a significant improvement in the scene classification performance of both models.
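The abstract's contrast between constraining the priors and constraining the likelihoods can be sketched with toy categorical distributions. This is a hedged illustration only: the specific coupling forms (a convex mixture for the prior, a multiplicative constraint for the likelihood) and all numbers are invented, not the thesis's actual models.

```python
# Toy sketch of "prior coupling" vs "likelihood coupling" between a
# scene module and an object module, over two scene classes.

def normalize(p):
    s = sum(p)
    return [v / s for v in p]

def prior_coupling(scene_prior, scene_lik, object_evidence, w=0.5):
    # Object evidence constrains the scene PRIOR (here via a convex
    # mixture with weight w) before Bayes' rule is applied.
    coupled_prior = [(1 - w) * p + w * o
                     for p, o in zip(scene_prior, object_evidence)]
    return normalize([p * l for p, l in zip(coupled_prior, scene_lik)])

def likelihood_coupling(scene_prior, scene_lik, object_evidence):
    # Object evidence constrains the scene LIKELIHOOD multiplicatively.
    coupled_lik = normalize([l * o
                             for l, o in zip(scene_lik, object_evidence)])
    return normalize([p * l for p, l in zip(scene_prior, coupled_lik)])

scene_prior = [0.5, 0.5]        # e.g. {indoor, outdoor}
scene_lik = [0.3, 0.7]          # from scene features alone
object_evidence = [0.8, 0.2]    # e.g. an "indoor" object was detected

post_prior = prior_coupling(scene_prior, scene_lik, object_evidence)
post_lik = likelihood_coupling(scene_prior, scene_lik, object_evidence)
print(post_prior, post_lik)
```

Even in this toy setting the two schemes yield different posteriors, which is why the thesis can compare them empirically.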

Bayesian analysis of cosmological models.

Moodley, Darell. January 2010 (has links)
In this thesis, we utilise the framework of Bayesian statistics to discriminate between models of the cosmological mass function. We first review the cosmological model and the formation and distribution of galaxy clusters before formulating a statistic within the Bayesian framework, namely the Bayesian razor, that allows model testing of probability distributions. The Bayesian razor is used to discriminate between three popular mass functions, namely the Press-Schechter, Sheth-Tormen and normalisable Tinker models. With a small number of particles in the simulation, we find that the simpler model is preferred due to the Occam's razor effect, but as the size of the simulation increases the more complex model, if taken to be the true model, is preferred. We establish criteria on the simulation size required to decisively favour a given model, and investigate how this size depends on the cluster threshold mass and on the prior probability distributions. Finally, we outline how our method can be extended to consider more realistic N-body simulations or be applied to observational data. / Thesis (M.Sc.)-University of KwaZulu-Natal, Westville, 2010.
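The Occam's-razor behaviour the abstract describes can be reproduced in miniature with a Bayes factor for two toy models, a fair coin versus a coin with unknown bias; this is an illustrative stand-in, not the thesis's mass-function comparison.

```python
import math

# Bayes-factor ("Bayesian razor") sketch: simple model M0 (p = 0.5)
# versus complex model M1 (unknown p, uniform Beta(1,1) prior).

def log_evidence_m0(k, n):
    # Probability of the observed sequence under a fair coin.
    return n * math.log(0.5)

def log_evidence_m1(k, n):
    # Marginal likelihood of the sequence under a uniform prior on p:
    # integral of p^k (1-p)^(n-k) dp = 1 / ((n + 1) * C(n, k)).
    return -math.log(n + 1) - math.log(math.comb(n, k))

def log_bayes_factor(k, n):
    # log BF_10 > 0 favours the complex model M1.
    return log_evidence_m1(k, n) - log_evidence_m0(k, n)

# True p = 0.6 (so M1 is the "true" model); with few observations the
# simpler model wins (Occam effect), with many the complex model wins.
for n in (10, 100, 1000):
    k = round(0.6 * n)
    print(n, round(log_bayes_factor(k, n), 2))
```

The sign flip as n grows mirrors the abstract's finding that the preferred model depends on the size of the simulation.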

Complying with norms : a neurocomputational exploration

Colombo, Matteo January 2012 (has links)
The subject matter of this thesis can be summarized by a triplet of questions and answers. Showing what these questions and answers mean is, in essence, the goal of my project. The triplet goes like this: Q: How can we make progress in our understanding of social norms and norm compliance? A: Adopting a neurocomputational framework is one effective way to make progress in our understanding of social norms and norm compliance. Q: What could the neurocomputational mechanism of social norm compliance be? A: The mechanism of norm compliance probably consists of Bayesian reinforcement learning algorithms implemented by activity in certain neural populations. Q: What could information about this mechanism tell us about social norms and social norm compliance? A: Information about this mechanism tells us that: a1: Social norms are uncertainty-minimizing devices. a2: Social norm compliance is one trick that agents employ to interact coadaptively and smoothly in their social environment. Most of the existing treatments of norms and norm compliance (e.g. Bicchieri 2006; Binmore 1993; Elster 1989; Gintis 2010; Lewis 1969; Pettit 1990; Sugden 1986; Ullmann-Margalit 1977) consist in what Cristina Bicchieri (2006) refers to as "rational reconstructions." A rational reconstruction of the concept of social norm "specifies in which sense one may say that norms are rational, or compliance with a norm is rational" (Ibid., pp. 10-11). What sets my project apart from these types of treatments is that it aims, first and foremost, at providing a description of some core aspects of the mechanism of norm compliance. The single most original idea put forth in my project is to bring an alternative explanatory framework to bear on social norm compliance. This is the framework of computational cognitive neuroscience. The chapters of this thesis describe some ways in which central issues concerning social norms can be fruitfully addressed within a neurocomputational framework.
In order to qualify and articulate the triplet above, my strategy consists firstly in laying down the beginnings of a model of the mechanism of norm compliance behaviour, and then zooming in on specific aspects of the model. Such a model, the chapters of this thesis argue, explains apparently important features of the psychology and neuroscience of norm compliance, and helps us to understand the nature of the social norms we live by.

Development and Calibration of Reaction Models for Multilayered Nanocomposites

Vohra, Manav January 2015 (has links)
This dissertation focuses on the development and calibration of reaction models for multilayered nanocomposites. The nanocomposites comprise sputter-deposited alternating layers of distinct metallic elements. Specifically, we focus on the equimolar Ni-Al and Zr-Al multilayered systems. Computational models are developed to capture the transient reaction phenomena as well as understand the dependence of reaction properties on the microstructure, composition and geometry of the multilayers. Together with the available experimental data, simulations are used to calibrate the models and enhance the accuracy of their predictions.

Recent modeling efforts for the Ni-Al system have investigated the nature of self-propagating reactions in the multilayers. Model fidelity was enhanced by incorporating melting effects due to aluminum [Besnoin et al. (2002)]. Salloum and Knio formulated a reduced model to mitigate computational costs associated with multi-dimensional reaction simulations [Salloum and Knio (2010a)]. However, existing formulations relied on a single Arrhenius correlation for diffusivity, estimated for the self-propagating reactions, and cannot be used to quantify mixing rates at lower temperatures within reasonable accuracy [Fritz (2011)]. We thus develop a thermal model for a multilayer stack comprising a reactive Ni-Al bilayer (nanocalorimeter) and exploit temperature evolution measurements to calibrate the diffusion parameters associated with solid-state mixing (720 K - 860 K) in the bilayer.

The equimolar Zr-Al multilayered system, when reacted aerobically, is shown to exhibit slow aerobic oxidation of zirconium (in the intermetallic), sustained for about 2-10 seconds after completion of the formation reaction. In a collaborative effort, we aim to exploit the sustained heat release for bio-agent defeat applications. A simplified computational model is developed to capture the extended reaction regime characterized by oxidation of Zr-Al multilayers. Simulations provide insight into the growth of the zirconia layer during the oxidation process. It is observed that the growth of zirconia is predominantly governed by the surface reaction; however, once the layer thickens, the growth is controlled by the diffusion of oxygen in zirconia.

A computational model is also developed for formation reactions in Zr-Al multilayers. We estimate Arrhenius diffusivity correlations for a low-temperature mixing regime characterized by homogeneous ignition in the multilayers, and a high-temperature mixing regime characterized by self-propagating reactions in the multilayers. Experimental measurements of temperature and reaction velocity are used for this purpose. Diffusivity estimates for the two regimes are first inferred using regression analysis, and full posterior distributions are then estimated for the diffusion parameters using Bayesian statistical approaches. A tight bound on the posteriors is observed in the ignition regime, whereas estimates for the self-propagating regime exhibit large levels of uncertainty. We further discuss a framework for optimal design of experiments to assess and optimize the utility of a set of experimental measurements for application to reaction models. / Dissertation
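The Arrhenius diffusivity calibration step can be sketched as a simple regression of ln D on 1/T over the solid-state mixing window the abstract mentions. The constants and noise level below are invented for illustration, not the thesis's Ni-Al values.

```python
import math
import random

# Calibrate D(T) = D0 * exp(-Ea / (R * T)) from noisy synthetic
# measurements via ordinary least squares on ln D versus 1/T.

R = 8.314                       # gas constant, J/(mol K)
D0_true, Ea_true = 1e-6, 1.2e5  # illustrative "true" parameters

random.seed(0)
temps = [720 + 20 * i for i in range(8)]      # 720 K - 860 K window
lnD = [math.log(D0_true) - Ea_true / (R * T) + random.gauss(0, 0.05)
       for T in temps]

# OLS on x = 1/T, y = ln D: slope = -Ea/R, intercept = ln D0.
xs = [1.0 / T for T in temps]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(lnD) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, lnD))
         / sum((x - xbar) ** 2 for x in xs))
Ea_hat = -slope * R
D0_hat = math.exp(ybar - slope * xbar)
print(Ea_hat, D0_hat)
```

A Bayesian treatment, as in the dissertation, would place priors on D0 and Ea and sample their posterior instead of taking the point estimate.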

Searching Genome-wide Disease Association Through SNP Data

Guo, Xuan 11 August 2015 (has links)
Taking advantage of high-throughput Single Nucleotide Polymorphism (SNP) genotyping technology, Genome-Wide Association Studies (GWASs) are regarded as holding promise for unravelling complex relationships between genotype and phenotype. GWASs aim to identify genetic variants associated with disease by assaying and analyzing hundreds of thousands of SNPs. Traditional single-locus and two-locus methods have been standardized and have led to many interesting findings. Recently, a substantial number of GWASs indicate that, for most disorders, joint genetic effects (epistatic interactions) across the whole genome are broadly present in complex traits. At present, identifying high-order epistatic interactions from GWASs is computationally and methodologically challenging. My dissertation research focuses on the problem of searching for genome-wide associations in three frequently encountered scenarios: one case group and one control group, multiple case and control groups, and Linkage Disequilibrium (LD) block structure. For the first scenario, we present a simple and fast method, named DCHE, based on dynamic clustering. For the second, we design two methods, a Bayesian inference based method and a heuristic method, to detect genome-wide multi-locus epistatic interactions for multiple diseases. For the last scenario, we propose a block-based Bayesian approach to model LD and conditional disease association simultaneously. Experimental results on both synthetic and real GWAS datasets show that the proposed methods improve the detection accuracy of disease-specific associations and lessen the computational cost compared with current popular methods.

Machine learning approach to reconstructing signalling pathways and interaction networks in biology

Dondelinger, Frank January 2013 (has links)
In this doctoral thesis, I present my research into applying machine learning techniques for reconstructing species interaction networks in ecology, reconstructing molecular signalling pathways and gene regulatory networks in systems biology, and inferring parameters in ordinary differential equation (ODE) models of signalling pathways. Together, the methods I have developed for these applications demonstrate the usefulness of machine learning for reconstructing networks and inferring network parameters from data. The thesis consists of three parts. The first part is a detailed comparison of applying static Bayesian networks, relevance vector machines, and linear regression with L1 regularisation (LASSO) to the problem of reconstructing species interaction networks from species absence/presence data in ecology (Faisal et al., 2010). I describe how I generated data from a stochastic population model to test the different methods and how the simulation study led us to introduce spatial autocorrelation as an important covariate. I also show how we used the results of the simulation study to apply the methods to presence/absence data of bird species from the European Bird Atlas. The second part of the thesis describes a time-varying, non-homogeneous dynamic Bayesian network model for reconstructing signalling pathways and gene regulatory networks, based on Lèbre et al. (2010). I show how my work has extended this model to incorporate different types of hierarchical Bayesian information sharing priors and different coupling strategies among nodes in the network. The introduction of these priors reduces the inference uncertainty by putting a penalty on the number of structure changes among network segments separated by inferred changepoints (Dondelinger et al., 2010; Husmeier et al., 2010; Dondelinger et al., 2012b).
Using both synthetic and real data, I demonstrate that using information sharing priors leads to a better reconstruction accuracy of the underlying gene regulatory networks, and I compare the different priors and coupling strategies. I show the results of applying the model to gene expression datasets from Drosophila melanogaster and Arabidopsis thaliana, as well as to a synthetic biology gene expression dataset from Saccharomyces cerevisiae. In each case, the underlying network is time-varying; for Drosophila melanogaster, as a consequence of measuring gene expression during different developmental stages; for Arabidopsis thaliana, as a consequence of measuring gene expression for circadian clock genes under different conditions; and for the synthetic biology dataset, as a consequence of changing the growth environment. I show that in addition to inferring sensible network structures, the model also successfully predicts the locations of changepoints. The third and final part of this thesis is concerned with parameter inference in ODE models of biological systems. This problem is of interest to systems biology researchers, as kinetic reaction parameters can often not be measured, or can only be estimated imprecisely from experimental data. Due to the cost of numerically solving the ODE system after each parameter adaptation, this is a computationally challenging problem. Gradient matching techniques circumvent this problem by directly fitting the derivatives of the ODE to the slope of an interpolant. I present an inference procedure for a model using nonparametric Bayesian statistics with Gaussian processes, based on Calderhead et al. (2008). I show that the new inference procedure improves on the original formulation in Calderhead et al. (2008) and I present the result of applying it to ODE models of predator-prey interactions, a circadian clock gene, a signal transduction pathway, and the JAK/STAT pathway.
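The gradient-matching idea in the third part, fitting the ODE right-hand side to an interpolant's slope instead of repeatedly solving the ODE, can be sketched with a finite-difference interpolant standing in for the Gaussian process. The model and data below are illustrative only.

```python
import math

# Gradient matching for the ODE dx/dt = -theta * x: estimate theta by
# matching the right-hand side to the estimated slope of the data,
# avoiding any numerical ODE solve.

theta_true = 0.5
ts = [0.1 * i for i in range(21)]
xs = [math.exp(-theta_true * t) for t in ts]   # noise-free trajectory

# Central-difference slope estimates at interior points play the role
# of the interpolant's derivative.
slopes = [(xs[i + 1] - xs[i - 1]) / (ts[i + 1] - ts[i - 1])
          for i in range(1, len(ts) - 1)]
vals = xs[1:-1]

# Least-squares match: minimise sum over i of (slope_i + theta * x_i)^2,
# which has the closed-form solution below.
theta_hat = (-sum(s * x for s, x in zip(slopes, vals))
             / sum(x * x for x in vals))
print(theta_hat)
```

In the Gaussian-process formulation the interpolant and its derivative come with calibrated uncertainty, which is what makes the fully Bayesian treatment possible.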

A comparison of Bayesian variable selection approaches for linear models

Rahman, Husneara 03 May 2014 (has links)
Bayesian variable selection approaches are powerful in discriminating among models regardless of whether the models under investigation are hierarchical. Although Bayesian approaches require complex computation, Markov chain Monte Carlo (MCMC) methods, such as the Gibbs sampler and the Metropolis-Hastings algorithm, make the computations easier. In this study we investigated the effectiveness of Bayesian variable selection approaches in comparison to non-Bayesian, or classical, approaches. For this purpose, we compared the performance of Bayesian versus non-Bayesian variable selection approaches for linear models. Among the Bayesian approaches, we studied the Conditional Predictive Ordinate (CPO) and the Bayes factor. Among the non-Bayesian, or classical, approaches, we implemented adjusted R-square, the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC) for model selection. We performed a simulation study to examine how the Bayesian and non-Bayesian approaches perform in selecting variables. We also applied these methods to real data and compared their performances. We observed that for linear models, Bayesian variable selection approaches perform as consistently as the non-Bayesian approaches. / Bayesian inference -- Bayesian inference for normally distributed likelihood -- Model adequacy -- Simulation approach -- Application to wage data. / Department of Mathematical Sciences
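The classical criteria in the comparison can be computed directly from a Gaussian linear model's residual sum of squares; a minimal sketch with invented numbers:

```python
import math

# AIC and BIC for a Gaussian linear model, from the residual sum of
# squares (RSS). Lower is better for both criteria.

def aic_bic(rss, n, k):
    # k = number of estimated parameters; the profiled Gaussian
    # log-likelihood gives -2 logL = n * log(rss / n) + const.
    loglik_term = n * math.log(rss / n)
    return loglik_term + 2 * k, loglik_term + k * math.log(n)

# Two nested models on the same n = 50 points: adding a useless
# predictor barely lowers the RSS, so both criteria prefer model 1.
n = 50
rss1, k1 = 12.0, 2      # y ~ x1
rss2, k2 = 11.9, 3      # y ~ x1 + x2 (x2 nearly irrelevant)
aic1, bic1 = aic_bic(rss1, n, k1)
aic2, bic2 = aic_bic(rss2, n, k2)
print(round(aic1, 2), round(bic1, 2), round(aic2, 2), round(bic2, 2))
```

BIC's log(n) penalty grows with sample size, which is why it tends to select more parsimonious models than AIC in larger datasets.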

Technical Due Diligence Assessment and Bayesian Belief Networks Methodology for Wind Power Projects

Das, Bibhash January 2013 (has links)
A Technical Due Diligence (TDD) investigation is an important step in the process of obtaining financing, or in mergers and acquisitions, for a wind power project. The investigation, the scope of which varies depending on the stage and nature of the project, involves reviewing important documentation relating to different aspects of the project, assessing potential risks in terms of the quality of the information available, and suggesting mitigation or other risk management measures where required. A TDD assessment can greatly benefit from increased objectivity in terms of the reviewed aspects, as it enables a sharper focus on the important risk elements and also provides a better appreciation of the investigated parameters. This master's thesis is an attempt to introduce more objectivity into the technical due diligence process followed at the host company; a points-based scoring system was devised to quantify the answers to the review questions. The different aspects under investigation have a complex interrelationship, and the resulting risks can be viewed as an outcome of a causal framework. To identify this causal framework, the concept of Bayesian Belief Networks has been assessed. The resulting Bayesian networks can be considered to provide a holistic framework for risk analysis within the TDD assessment process. The importance of accurate likelihood information for accurate Bayesian analysis has been identified; statistical data sets for the chosen framework need to be generated to provide the correct setting for Bayesian analysis in future studies. The objectivity of the TDD process can be further enhanced by taking into consideration the capability of the investing body to handle the identified risks, and by benchmarking risky aspects against industry standards or historical precedent.
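How a Bayesian belief network could encode such a causal risk framework can be sketched with a toy conditional probability table; the structure, node names and probabilities below are invented for illustration, not the thesis's actual network.

```python
# Two reviewed aspects feed a single "project risk" node via a
# conditional probability table (CPT), queried by exhaustive
# enumeration over the parents' states.

P_DOC_GAP = 0.3        # P(documentation review finds deficiencies)
P_PERMIT_GAP = 0.2     # P(permitting review finds deficiencies)

# P(high risk | doc gap, permit gap) as a CPT keyed on parent states.
CPT_RISK = {
    (True, True): 0.9,
    (True, False): 0.6,
    (False, True): 0.5,
    (False, False): 0.1,
}

def p_high_risk():
    # Marginal P(high risk) = sum over parent states of
    # P(parents) * P(high risk | parents).
    total = 0.0
    for doc in (True, False):
        for permit in (True, False):
            p_doc = P_DOC_GAP if doc else 1 - P_DOC_GAP
            p_permit = P_PERMIT_GAP if permit else 1 - P_PERMIT_GAP
            total += p_doc * p_permit * CPT_RISK[(doc, permit)]
    return total

print(round(p_high_risk(), 3))
```

In a real TDD network the CPT entries are exactly the likelihood information the abstract says must be estimated from statistical data.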

Some problems in Bayesian group decisions

Yen, Peng-Fang January 1992 (has links)
One employs the mathematical analysis of decision making when the state of nature is uncertain but further information about it can be obtained by experimentation. Bayesian decision theory concerns practical problems of decision making under conditions of uncertainty and also requires the use of statistical and mathematical methods. In this thesis, some basic risk sharing and group decision concepts are provided. Risk is the expected value of the loss function of a Bayesian estimator. Group decisions consider situations in which the individuals need to agree both on utilities for consequences and on conditional probability assessments for different experimental outcomes. / Department of Mathematical Sciences
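The notion of risk as the expected loss of a Bayesian estimator can be illustrated with the standard Beta-Bernoulli example: under squared-error loss the Bayes estimator is the posterior mean, and its posterior risk is the posterior variance. The numbers are illustrative.

```python
# Beta-Bernoulli model: Beta(a, b) prior on a coin's bias p,
# k heads observed in n tosses.

a, b = 1.0, 1.0          # uniform Beta(1,1) prior
k, n = 7, 10             # observed data

# Conjugate update: posterior is Beta(a + k, b + n - k).
a_post, b_post = a + k, b + (n - k)

# Under squared-error loss, the Bayes estimator is the posterior mean
# and its posterior risk is the posterior variance.
bayes_estimate = a_post / (a_post + b_post)
posterior_risk = (a_post * b_post) / \
    ((a_post + b_post) ** 2 * (a_post + b_post + 1))
print(bayes_estimate, posterior_risk)
```

Other loss functions change the estimator: absolute-error loss gives the posterior median, and 0-1 loss gives the posterior mode.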

Bayesian Contact Tracing for Communicable Respiratory Diseases

Shalaby, Ayman 02 January 2014 (has links)
Purpose: The purpose of our work is to develop a system for automatic contact tracing with the goal of identifying individuals who are most likely infected, even if we do not have direct diagnostic information on their health status. Control of the spread of respiratory pathogens (e.g. novel influenza viruses) in the population using vaccination is a challenging problem that requires quick identification of the infectious agent followed by large-scale production and administration of a vaccine, which takes a significant amount of time. A complementary approach to controlling transmission is contact tracing and quarantining, which are currently applied to sexually transmitted diseases (STDs). For STDs, identifying the contacts that might have led to disease transmission is relatively easy; however, for respiratory pathogens, the contacts that can lead to transmission include a huge number of face-to-face daily social interactions that are impossible to trace manually. Method: We developed a Bayesian network model to process context-aware proximity sensor information together with (possibly incomplete) diagnosis information to track the spread of disease in a population. Our model tracks real-time proximity contacts and can provide public health agencies with the probability of infection for each individual in the model. For testing our algorithm, we used a real-world mobile sensor dataset of 80 individuals, and we simulated an outbreak. Result: We ran several experiments where different sub-populations were "infected" and "diagnosed." By using the contact information, our model was able to automatically identify individuals in the population who were likely to be infected even though they were not directly "diagnosed" with an illness. Conclusion: Automatic contact tracing for respiratory pathogens is a powerful idea; however, we have identified several implementation challenges.
The first challenge is scalability: we note that a contact tracing system with a hundred thousand individuals requires a Bayesian model with a billion nodes. Bayesian inference on models of this scale is an open problem and an active area of research. The second challenge is privacy protection: although the test data were collected in an academic setting, deploying any system will require appropriate safeguards for user privacy. Nonetheless, our work illustrates the potential for broader use of contact tracing for modelling and controlling disease transmission.
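The core update, raising an undiagnosed person's infection probability through their contacts, can be sketched as a single-pass approximation; this is a hedged illustration, not the thesis's full Bayesian network inference, and the transmission probability and contact pattern are invented.

```python
# Each contact with a possibly-infected person transmits independently
# with probability P_TRANSMIT; a person escapes infection only if no
# prior infection and no contact transmits.

P_TRANSMIT = 0.1                 # assumed per-contact transmission prob.

def infection_prob(prior, contact_probs):
    # P(infected) = 1 - P(no prior infection) * prod P(contact i does
    # not transmit), where each contact transmits with probability
    # P_TRANSMIT * P(contact i is infected).
    p_escape = 1.0 - prior
    for p_contact in contact_probs:
        p_escape *= 1.0 - P_TRANSMIT * p_contact
    return 1.0 - p_escape

# A is diagnosed (probability 1.0); B met A three times; C met B once.
p_b = infection_prob(0.01, [1.0, 1.0, 1.0])
p_c = infection_prob(0.01, [p_b])
print(round(p_b, 3), round(p_c, 3))
```

Full Bayesian network inference would also propagate evidence backwards (a diagnosed contact raises the probability that earlier contacts were infected), which is where the scalability challenge the abstract mentions comes from.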
