Generative Modelling and Probabilistic Inference of Growth Patterns of Individual Microbes

Nagarajan, Shashi January 2022
The fundamental question of how cells maintain their characteristic size remains open. Cell size measurements from microscopic time-lapse imaging of microfluidic single-cell cultivations have posed serious challenges to classical cell growth models and support the development of newer, more nuanced models that better explain empirical findings. Yet current models are limited either to specific cell types or to cell growth under specific microenvironmental conditions. Together with the fact that tools for robust analysis of such time-lapse images are not yet widely available, this presents an opportunity to advance the discourse on cell growth and size homeostasis through generative probabilistic modelling, and through analysis of how well different statistical estimation and inference techniques recover the parameters of such models. In this thesis, I present a novel model framework for simulating microfluidic single-cell cultivations with 36 different simulation modalities, each integrating dominant cell growth theories with generative modelling techniques. I also present a comparative analysis of how different frequentist and Bayesian probabilistic inference techniques, such as nuisance-variable elimination and variational inference, perform in a case study estimating a single model of a microfluidic cell cultivation.
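One of the dominant cell growth theories such a framework would integrate is the "adder" principle, under which a cell divides after adding a roughly fixed volume regardless of its birth size. As a minimal generative sketch (illustrative only, not the thesis's model framework; parameter names and noise model are invented for this example):

```python
import random

def simulate_adder(birth_size=1.0, added_target=1.0, noise_sd=0.05,
                   n_divisions=200, rng=None):
    """Toy generative model of single-cell size dynamics under the
    'adder' principle: each cycle the cell adds a (noisy) fixed volume
    before dividing symmetrically, regardless of its birth size."""
    rng = rng or random.Random(0)
    sizes = [birth_size]
    s = birth_size
    for _ in range(n_divisions):
        added = max(1e-6, added_target + rng.gauss(0.0, noise_sd))
        s = (s + added) / 2.0  # symmetric division of the mother cell
        sizes.append(s)
    return sizes

sizes = simulate_adder(birth_size=3.0)
# Birth sizes relax toward the added-volume target, which is how the
# adder rule produces size homeostasis even from an atypical start.
```

Simulators of this shape are what probabilistic inference is then run against: given observed birth sizes, recover parameters such as the added-volume target and its noise.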

Scalable Inference in Latent Gaussian Process Models

Wenzel, Florian 05 February 2020
Latent Gaussian process (GP) models help scientists to uncover hidden structure in data, express domain knowledge and form predictions about the future. These models have been successfully applied in many domains including robotics, geology, genetics and medicine.
A GP defines a distribution over functions and can be used as a flexible building block to develop expressive probabilistic models. The main computational challenge in these models is inference about the unobserved latent random variables, that is, computing the posterior distribution given the data. Currently, most interesting latent GP models have limited applicability to big data. This thesis develops a new efficient inference approach for latent GP models. Our new inference framework, which we call augmented variational inference, is based on the idea of considering an augmented version of the intractable GP model that renders the model conditionally conjugate. We show that inference in the augmented model is more efficient and, unlike in previous approaches, all updates can be computed in closed form. The ideas behind our inference framework facilitate novel latent GP models that lead to new results in language modeling, genetic association studies and uncertainty quantification in classification tasks.
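For the Gaussian-likelihood case, the posterior is already available in closed form; non-Gaussian likelihoods lose this conjugacy, which is what the augmentation idea restores. A minimal sketch of the conjugate baseline (a generic illustration, not the thesis's augmented algorithm):

```python
import numpy as np

def rbf(x1, x2, lengthscale=0.5):
    """Squared-exponential (RBF) kernel matrix between two point sets."""
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise_var=1e-4):
    """Closed-form GP regression posterior mean: the conditionally
    conjugate case in which every update is available analytically."""
    K = rbf(x_train, x_train) + noise_var * np.eye(len(x_train))
    return rbf(x_test, x_train) @ np.linalg.solve(K, y_train)

x = np.arange(5.0)
y = np.sin(x)
mean_at_train = gp_posterior_mean(x, y, x)
# With small observation noise the posterior mean interpolates the data.
```

Replacing the Gaussian likelihood with, say, a logistic one makes the posterior intractable; augmentation schemes reintroduce latent variables so that, conditionally, updates of this closed-form flavor become possible again.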

Exact Bayesian Inference in Graphical Models: Tree-structured Network Inference and Segmentation

Schwaller, Loïc 09 September 2016
In this dissertation we investigate the problem of network inference.
The statistical framework tailored to this task is that of graphical models, in which the (in)dependence relationships satisfied by a multivariate distribution are represented through a graph. We consider the problem from a Bayesian perspective and focus on a subset of graphs making structure inference possible in an exact and efficient manner, namely spanning trees. Indeed, the integration of a function defined on spanning trees can be performed with cubic complexity with respect to the number of variables under some factorisation assumption on the edges, in spite of the super-exponential cardinality of this set. A careful choice of prior distributions on both graphs and distribution parameters allows us to use this result for network inference in tree-structured graphical models, for which we provide a complete and formal framework. We also consider the situation in which observations are organised in a multivariate time series. We assume that the underlying graph describing the dependence structure of the distribution is affected by an unknown number of abrupt changes throughout time. Our goal is then to retrieve the number and locations of these change-points, therefore dealing with a segmentation problem. Using spanning trees and assuming that segments are independent from one another, we show that this can be achieved with polynomial complexity with respect to both the number of variables and the length of the series.
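The cubic-time integration over spanning trees rests on Kirchhoff's matrix-tree theorem: the sum over all spanning trees of the product of edge weights equals the determinant of any cofactor of the weighted graph Laplacian. A small generic illustration (not the dissertation's code):

```python
import numpy as np

def spanning_tree_sum(W):
    """Sum over all spanning trees of the product of their edge
    weights, computed via the matrix-tree theorem as the determinant
    of a cofactor of the weighted graph Laplacian. Cost is O(n^3),
    despite the super-exponential number of spanning trees."""
    L = np.diag(W.sum(axis=1)) - W   # weighted Laplacian
    return np.linalg.det(L[1:, 1:])  # delete any one row/column

# Sanity check: the complete graph on 4 vertices with unit weights has
# 4^(4-2) = 16 spanning trees (Cayley's formula).
W = np.ones((4, 4)) - np.eye(4)
n_trees = spanning_tree_sum(W)  # → 16.0 (up to floating-point error)
```

With edge weights chosen as per-edge marginal likelihood factors, the same determinant yields the normalising constant of a posterior over tree structures, which is what makes exact Bayesian tree inference tractable.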

Discovering Acoustic Units from Speech: a Bayesian Approach

Ondel, Lucas Antoine Francois Unknown Date
From an early age, children have an innate ability to infer linguistic knowledge from spoken language, long before they learn to read and write. Modern speech recognition systems, by contrast, need considerable amounts of transcribed speech data to achieve low error rates. The recently established field of unsupervised speech learning is devoted to transferring these human abilities to machine learning. Within this field, our work focuses on the problem of discovering a set of acoustic units from a language for which only untranscribed audio recordings are available. In particular, we explore the potential of Bayesian inference to address this problem. First, we revisit the use of a state-of-the-art non-parametric Bayesian model for the task of acoustic unit discovery, for which we derive a fast and efficient variational Bayesian inference algorithm. Our approach relies on the stick-breaking construction of the Dirichlet process, which allows the model to be expressed as a phone loop based on a hidden Markov model. With this model and a suitable mean-field approximation of the variational posterior, inference is realised through an efficient iterative algorithm similar to the well-known Expectation-Maximization (EM) scheme. Experiments show that this approach yields better clustering than the original model while being orders of magnitude faster. The second contribution of the thesis addresses the problem of defining a meaningful prior distribution over potential acoustic units. To this end, we introduce the Generalized Subspace Model, a theoretical framework for defining probability distributions on low-dimensional manifolds in a high-dimensional parameter space. Using this tool, we learn a phonetic subspace, a continuum of phone embeddings, from several languages with transcribed recordings.
This phonetic subspace is then used to constrain our system so that the discovered acoustic units resemble phones from other languages. Experimental results show that this approach significantly improves both the clustering quality and the segmentation accuracy of the acoustic unit discovery system.
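The stick-breaking construction of the Dirichlet process mentioned above can be sketched in a few lines: break a unit-length stick with Beta(1, alpha) fractions and use the pieces as mixture weights over components (here, acoustic units). A truncated draw, as used in variational approximations (parameter values are illustrative):

```python
import random

def stick_breaking_weights(alpha=1.0, truncation=50, rng=None):
    """Truncated stick-breaking draw from a Dirichlet process: each
    step breaks off a Beta(1, alpha) fraction of the remaining stick;
    the pieces are the mixture weights over components."""
    rng = rng or random.Random(1)
    weights, remaining = [], 1.0
    for _ in range(truncation):
        v = rng.betavariate(1.0, alpha)
        weights.append(v * remaining)
        remaining *= 1.0 - v
    return weights

w = stick_breaking_weights()
# The weights are positive and sum to (almost) one; truncating at a
# finite number of sticks is what makes variational inference over the
# nominally infinite mixture practical.
```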

Bayesian Identification of Nonlinear Structural Systems: Innovations to Address Practical Uncertainty

Alana K Lund 26 April 2021
The ability to rapidly assess the condition of a structure in a manner which enables accurate prediction of its remaining capacity has long been viewed as a crucial step in allowing communities to make safe and efficient use of their public infrastructure. This objective has become even more relevant in recent years as both the interdependency and the state of deterioration of infrastructure systems throughout the world have increased. Current practice for structural condition assessment emphasizes visual inspection, in which trained professionals routinely survey a structure to estimate its remaining capacity. Though these methods can monitor gross structural changes, their ability to rapidly and cost-effectively assess the detailed condition of the structure with respect to its future behavior is limited.

Vibration-based monitoring techniques offer a promising alternative. As opposed to visually observing the surface of the structure, these methods judge its condition and infer its future performance by generating and updating models calibrated to its dynamic behavior. Bayesian inference approaches are particularly well suited to this model updating problem, as they can identify the structure from sparse observations while simultaneously assessing the uncertainty in the identified parameters. However, a lack of consensus on efficient methods for applying them to full-scale structural systems has led to a diverse set of Bayesian approaches, from which no clear method can be selected for full-scale implementation. The objective of this work is therefore to assess and enhance the techniques currently used for structural identification and to make strides toward unified strategies for robustly implementing them on full-scale structures. This is accomplished by addressing several key research questions regarding the ability of these methods to overcome issues of identifiability, sensitivity to uncertain experimental conditions, and scalability. These questions are investigated by applying novel adaptations of several prominent Bayesian identification strategies to small-scale experimental systems equipped with nonlinear devices. Through these illustrative examples I explore the robustness and practicality of these algorithms, while also considering their extensibility to higher-dimensional systems. Addressing these core concerns underlying full-scale structural identification will enable the practical application of Bayesian inference techniques and thereby enhance the ability of communities to detect and respond to the condition of their infrastructure.
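At the core of the Bayesian identification strategies being compared is sampling from a posterior over model parameters given sparse, noisy observations. A minimal random-walk Metropolis sketch on a toy scalar identification problem (purely illustrative; the thesis's algorithms and structural models are far more elaborate):

```python
import math
import random

def metropolis(log_post, x0, n_steps=5000, step=0.5, rng=None):
    """Random-walk Metropolis sampler: the basic MCMC primitive on
    which more elaborate Bayesian identification schemes build."""
    rng = rng or random.Random(0)
    samples, x, lp = [], x0, log_post(x0)
    for _ in range(n_steps):
        cand = x + rng.gauss(0.0, step)
        lp_cand = log_post(cand)
        if math.log(rng.random()) < lp_cand - lp:  # accept/reject
            x, lp = cand, lp_cand
        samples.append(x)
    return samples

# Toy problem: infer a scalar parameter theta from noisy observations
# y_i ~ N(theta, 1), with a flat prior.
data = [2.1, 1.8, 2.4, 2.0, 1.9]
samples = metropolis(lambda t: -0.5 * sum((y - t) ** 2 for y in data),
                     x0=0.0)
posterior_mean = sum(samples[1000:]) / len(samples[1000:])
# posterior_mean is close to the sample mean of the data (2.04), and
# the spread of the samples quantifies parameter uncertainty.
```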

Investigating The Relationship Between Adverse Events And Infrastructure Development In An Active War Theater Using Soft Computing Techniques

Cakit, Erman 01 January 2013
The military has recently recognized the importance of taking sociocultural factors into consideration. Human Social Culture Behavior (HSCB) modeling has therefore received much attention in current and future operational requirements, with the aim of understanding the effects of social and cultural factors on human behavior. Different modeling approaches have been applied to the data used in this field, and so far none has been widely accepted. HSCB modeling needs the capability to represent complex, ill-defined, and imprecise concepts, and soft computing can deal with such concepts. There is currently no study on the use of any computational methodology for representing the relationship between adverse events and infrastructure development investments in an active war theater. This study investigates the relationship between adverse events and infrastructure development projects in an active war theater using soft computing techniques, including fuzzy inference systems (FIS), artificial neural networks (ANNs), and adaptive neuro-fuzzy inference systems (ANFIS), which benefit directly from their accuracy in prediction applications. Fourteen development and economic improvement project types were selected based on allocated budget values and the number of projects in different time periods; urban and rural population density and the total number of adverse events in the previous month were selected as independent variables. Four outputs reflecting adverse events, namely the numbers of people killed, wounded, and hijacked, and the total number of adverse events, were estimated. For each model, the data were grouped into training (years 2004 through 2009) and testing (year 2010) sets. Ninety-six different models were developed and investigated for Afghanistan, and the country was divided into seven regions for analysis purposes.
The performance of each model was compared with all other models using calculated mean absolute error (MAE) values and the prediction accuracy within a ±1 error range (the difference between actual and predicted values). Furthermore, sensitivity analysis was performed to determine the effects of input values on the dependent variables and to rank the top ten input parameters in order of importance. According to the results obtained, ANNs, FIS, and ANFIS are all useful modeling techniques for predicting the number of adverse events based on historical development or economic project data. When model accuracy was assessed with MAE, the ANN models generally had better predictive accuracy than the FIS and ANFIS models, as demonstrated by the experimental results. Prediction accuracy within the ±1 error range was around 90%. The sensitivity analysis shows that the importance of economic development projects varies with region, population density, and the occurrence of adverse events in Afghanistan. For the purpose of allocating resources and developing regions, the results can be summarized as follows: the relationship between adverse events and infrastructure development in an active war theater was examined, with emphasis on predicting the occurrence of events and assessing the potential impact of regional infrastructure development efforts on reducing the number of such events.
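The two evaluation criteria used in the comparison are straightforward to state precisely. A generic sketch with made-up numbers (the actual data are not reproduced here):

```python
def mae(actual, predicted):
    """Mean absolute error between actual and predicted event counts."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def within_one_accuracy(actual, predicted):
    """Share of predictions falling within a +/-1 error range of the
    actual value, the second criterion used to compare the models."""
    hits = sum(1 for a, p in zip(actual, predicted) if abs(a - p) <= 1)
    return hits / len(actual)

# Hypothetical actual vs predicted adverse-event counts:
actual = [3, 0, 1, 5, 2]
predicted = [2.5, 0.0, 2.5, 4.0, 2.0]
err = mae(actual, predicted)                  # → 0.6
acc = within_one_accuracy(actual, predicted)  # → 0.8
```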

Exact likelihood inference for multiple exponential populations under joint censoring

Su, Feng 04 1900
The joint censoring scheme is of practical significance when conducting comparative life-tests of products from different units within the same facility. In this thesis, we derive the exact distributions of the maximum likelihood estimators (MLEs) of the unknown parameters when joint censoring of some form is present among the multiple samples, and then discuss the construction of exact confidence intervals for the parameters.

We develop inferential methods based on four different joint censoring schemes. The first is a jointly Type-II censored sample arising from k independent exponential populations. The second is a jointly progressively Type-II censored sample, while the last two cases correspond to jointly Type-I hybrid censored and jointly Type-II hybrid censored samples. For each of these cases, we derive the conditional MLEs of the k exponential mean parameters, along with their conditional moment generating functions and exact densities, which we then use to develop exact confidence intervals for the k population parameters. Furthermore, approximate confidence intervals based on the asymptotic normality of the MLEs, parametric bootstrap intervals, and credible confidence regions from a Bayesian viewpoint are all discussed. An empirical evaluation of all these confidence interval methods is also made in terms of coverage probabilities and average widths. Finally, we present examples to illustrate all the methods of inference developed here for different joint censoring scenarios. / Doctor of Science (PhD)
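For the simplest ingredient of such schemes, the MLE of an exponential mean under ordinary Type-II censoring has a well-known closed form based on the total time on test. A single-sample sketch (the thesis treats k jointly censored samples and exact distributional results, not just point estimates; the numbers below are invented):

```python
def exp_mean_mle_type2(failure_times, n):
    """MLE of the exponential mean under Type-II censoring: a life
    test on n units is stopped at the r-th failure, and each of the
    n - r surviving units contributes the stopping time (the largest
    observed failure time) to the total time on test."""
    r = len(failure_times)
    x_r = max(failure_times)
    total_time_on_test = sum(failure_times) + (n - r) * x_r
    return total_time_on_test / r

# 10 units on test, stopped after the 4th failure:
theta_hat = exp_mean_mle_type2([0.2, 0.5, 0.9, 1.1], n=10)
# theta_hat = (2.7 + 6 * 1.1) / 4 = 2.325
```

Because the total time on test is a sum of exponential spacings, its exact distribution is available, which is the kind of structure the thesis exploits to build exact (rather than asymptotic) confidence intervals.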

Revealing human sensitivity to a latent temporal structure of changes

Marković, Dimitrije, Reiter, Andrea M. F., Kiebel, Stefan J. 22 May 2024
Precisely timed behavior and accurate time perception play a critical role in our everyday lives, as our wellbeing and even survival can depend on well-timed decisions. Although the temporal structure of the world around us is essential for human decision making, we know surprisingly little about how the representation of the temporal structure of our everyday environment impacts decision making. How does the representation of temporal structure affect our ability to generate well-timed decisions? Here we address this question using a well-established dynamic probabilistic learning task. Using computational modeling, we found that human subjects' beliefs about temporal structure are reflected in their choices to either exploit their current knowledge or explore novel options. The model-based analysis reveals large within-group and within-subject heterogeneity. To explain these results, we propose a normative model of how temporal structure is used in decision making, based on the semi-Markov formalism in the active inference framework. We discuss potential key applications of the presented approach to the fields of cognitive phenotyping and computational psychiatry.
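The exploit-or-explore choices analyzed above can be illustrated with a much simpler primitive than the paper's semi-Markov active-inference model: posterior-guided arm selection on a two-armed Bernoulli bandit (Thompson sampling). This is entirely a stand-in for intuition, not the authors' model:

```python
import random

def thompson_bandit(reward_probs, n_trials=2000, rng=None):
    """Thompson sampling on a two-armed Bernoulli bandit: each trial,
    sample a success rate for each arm from its Beta posterior and
    pull the arm with the larger sample. Exploration is driven by
    posterior uncertainty, exploitation by posterior means."""
    rng = rng or random.Random(0)
    wins = [1, 1]    # Beta(1, 1) prior pseudo-successes per arm
    losses = [1, 1]  # Beta(1, 1) prior pseudo-failures per arm
    pulls = [0, 0]
    for _ in range(n_trials):
        arm = max(range(2), key=lambda k: rng.betavariate(wins[k], losses[k]))
        if rng.random() < reward_probs[arm]:
            wins[arm] += 1
        else:
            losses[arm] += 1
        pulls[arm] += 1
    return pulls

pulls = thompson_bandit([0.3, 0.7])
# The better arm (index 1) ends up pulled far more often; beliefs
# about *when* reward contingencies change, the paper's focus, would
# modulate how quickly such an agent re-explores.
```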

Treatment heterogeneity and potential outcomes in linear mixed effects models

Richardson, Troy E. January 1900
Doctor of Philosophy / Department of Statistics / Gary L. Gadbury / Studies commonly focus on estimating a mean treatment effect in a population. However, in some applications the variability of treatment effects across individual units may help to characterize the overall effect of a treatment across the population. Consider a set of treatments, {T, C}, where T denotes some treatment that might be applied to an experimental unit and C denotes a control. For each of N experimental units, the pair {r_Ti, r_Ci}, i = 1, 2, ..., N, represents the potential response of the i-th experimental unit if the treatment were applied and its response if the control were applied, respectively. The causal effect of T compared to C is the difference between the two potential responses, r_Ti - r_Ci. Much work has been done to elucidate the statistical properties of a causal effect under a particular set of assumptions. Gadbury and others have reported on this for some simple designs, focusing primarily on finite-population randomization-based inference. When designs become more complicated, the randomization-based approach becomes increasingly difficult. Since linear mixed effects models are particularly useful for modeling data from complex designs, their role in modeling treatment heterogeneity is investigated. It is shown that an individual treatment effect can be conceptualized as a linear combination of fixed treatment effects and random effects. The random effects are assumed to have variance components specified in a mixed effects "potential outcomes" model in which both potential outcomes, r_T and r_C, are variables. The variance of the individual causal effect is used to quantify treatment heterogeneity. Post treatment assignment, however, only one of the two potential outcomes is observable for a unit.
It is then shown that the variance component for treatment heterogeneity becomes non-estimable in an analysis of observed data. Furthermore, estimable variance components in the observed data model are demonstrated to arise from linear combinations of the non-estimable variance components in the potential outcomes model. Mixed effects models are considered in the context of a particular design in an effort to illuminate the loss of information incurred when moving from a potential outcomes framework to an observed data analysis.
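The non-estimability result can be made concrete numerically: two potential-outcomes worlds with identical marginal distributions for r_T and r_C, and hence identical observed-data distributions, can carry very different treatment-heterogeneity variances. A toy simulation (a hedged illustration, not the chapter's mixed-model derivation):

```python
import random
import statistics

rng = random.Random(0)
n = 10000
base = [rng.gauss(0.0, 1.0) for _ in range(n)]
indep = [rng.gauss(0.0, 1.0) for _ in range(n)]

# World A: perfectly coupled potentials -> constant individual effect.
rT_a = [x + 1.0 for x in base]
rC_a = list(base)
# World B: independent potentials -> heterogeneous individual effects.
rT_b = [x + 1.0 for x in indep]
rC_b = list(base)

var_a = statistics.variance([t - c for t, c in zip(rT_a, rC_a)])
var_b = statistics.variance([t - c for t, c in zip(rT_b, rC_b)])
# Marginally, r_T and r_C look identical in both worlds, so observed
# data (one potential per unit) cannot tell them apart; yet var_a is
# ~0 while var_b is ~2, since Var(r_T - r_C) depends on the
# unobservable covariance between the two potentials.
```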

Bayesian inference and wavelet methods in image processing

Silwal, Sharad Deep January 1900
Master of Science / Department of Statistics / Diego M. Maldonado / Haiyan Wang / This report addresses some mathematical and statistical techniques of image processing and their computational implementation. Fundamental theories are presented, applied, and illustrated with examples. To make the report as self-contained as possible, key terminologies are defined, and some classical results and theorems are stated, for the most part, without proof. Several algorithms and techniques of image processing are described and substantiated through experimentation in MATLAB. Several ways of estimating original images from noisy image data, and their corresponding risks, are discussed. Two image processing concepts selected to illustrate computational implementation are Bayes classification and wavelet denoising. The discussion of the latter introduces a specialized area of mathematics, namely wavelets. A self-contained theory for wavelets is built by first reviewing basic concepts of Fourier analysis and then introducing multiresolution analysis and wavelets. For a better understanding of Fourier analysis techniques in image processing, original solutions to some problems in Fourier analysis are worked out. Finally, implementation of the above-mentioned concepts is illustrated with examples and MATLAB codes.
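The wavelet-denoising pipeline discussed in the report (transform, shrink detail coefficients, invert) can be sketched with a single-level Haar transform on a 1-D signal. This is a generic Python illustration, not the report's MATLAB code:

```python
import math

SQ = 1.0 / math.sqrt(2.0)  # orthonormal Haar scaling factor

def haar_forward(signal):
    """One level of the orthonormal Haar transform: pairwise scaled
    sums (approximation) and differences (detail)."""
    approx = [SQ * (signal[i] + signal[i + 1]) for i in range(0, len(signal), 2)]
    detail = [SQ * (signal[i] - signal[i + 1]) for i in range(0, len(signal), 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Exact inverse of haar_forward."""
    out = []
    for a, d in zip(approx, detail):
        out.extend([SQ * (a + d), SQ * (a - d)])
    return out

def soft_threshold(coeffs, t):
    """Shrink small (noise-dominated) coefficients toward zero; the
    core operation of wavelet denoising."""
    return [math.copysign(max(abs(c) - t, 0.0), c) for c in coeffs]

signal = [1.0, 2.0, 3.0, 4.0]
approx, detail = haar_forward(signal)
reconstructed = haar_inverse(approx, detail)  # recovers the signal
denoised = haar_inverse(approx, soft_threshold(detail, 1.0))
```

Thresholding only the detail coefficients suppresses high-frequency noise while leaving the coarse structure (the approximation coefficients) intact; multi-level schemes simply recurse on the approximation.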
