1. Sequential conditional probability ratio tests in the multi-parameter exponential family (Liu, Shin Ta, 1976)
Thesis--Wisconsin. Vita. Includes bibliographical references (leaves 106-111).
2. From genomes to post-processing of Bayesian inference of phylogeny (Ali, Raja Hashim, 2016)
Life is extremely complex and amazingly diverse; it has taken billions of years of evolution to attain the level of complexity we observe in nature today, ranging from single-celled prokaryotes to multi-cellular human beings. With the availability of molecular sequence data, algorithms for inferring homology and gene families have emerged, and similarity in gene content between two genes has been the major signal used for homology inference. Recently there has been a significant rise in the number of species with fully sequenced genomes, which provides an opportunity to infer homologs with greater accuracy and in a more informed way. Phylogenetic analysis explains the relationships between the member genes of a gene family in a simple, graphical and plausible way using a tree representation. Bayesian phylogenetic inference is a probabilistic method used to infer gene phylogenies and posteriors of other evolutionary parameters. Markov chain Monte Carlo (MCMC), in particular the Metropolis-Hastings sampling scheme, is the most commonly employed algorithm for determining the evolutionary history of genes. Many software packages can process the results of an MCMC run and explore the parameter posterior, but there is a need for interactive software that can analyse both discrete and real-valued parameters and that offers convergence assessment and burn-in estimation diagnostics specifically designed for Bayesian phylogenetic inference.

In this thesis, a synteny-aware approach for gene homology inference, called GenFamClust (GFC), is proposed that uses gene content and gene order conservation to infer homology. The feature distinguishing GFC from earlier homology inference methods is that local synteny is combined with gene similarity to infer homologs, without inferring homologous regions. GFC was validated for accuracy on a simulated dataset. Gene families were computed by applying clustering algorithms to homologs inferred by GFC, and compared for accuracy, dependence and similarity with gene families inferred by other popular gene family inference methods on a eukaryotic dataset. Gene families in fungi obtained from GFC were evaluated against pillars from the Yeast Gene Order Browser. Genome-wide gene families for some eukaryotic species are computed using this approach.

The other topic of this thesis is the processing of MCMC traces for Bayesian phylogenetic inference. We introduce new software, VMCMC, which simplifies post-processing of MCMC traces. VMCMC can be used both as a GUI-based application and as a convenient command-line tool. It supports interactive exploration, is suitable for automated pipelines, and can handle both real-valued and discrete parameters observed in an MCMC trace. We propose and implement joint burn-in estimators that are specifically applicable to Bayesian phylogenetic inference, and compare them for similarity with other popular convergence diagnostics. We show that Bayesian phylogenetic inference and VMCMC can be applied to infer valuable evolutionary information for a biological case: the evolutionary history of the FERM domain.
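The joint burn-in estimators of the thesis are not detailed in this abstract; as a rough illustration of the kind of trace diagnostic such post-processing tools apply, the Python sketch below runs a simplified Geweke-style check on a real-valued MCMC trace (naive variance of the mean, no spectral correction). The function names, window fractions, and threshold are assumptions made for the example, not VMCMC's actual interface.

```python
# Minimal sketch (not VMCMC itself): a Geweke-style diagnostic that compares the
# mean of an early window of an MCMC trace with the mean of a late window.
# A large |z| suggests the chain has not converged within the chosen burn-in.
import numpy as np

def geweke_z(trace, first=0.1, last=0.5):
    """Return a Geweke-style z-score for a 1-D array of MCMC samples."""
    trace = np.asarray(trace, dtype=float)
    n = len(trace)
    a = trace[: int(first * n)]           # early segment
    b = trace[int((1.0 - last) * n):]     # late segment
    # Naive variance of the mean; a full implementation would use spectral
    # density estimates to account for autocorrelation.
    return (a.mean() - b.mean()) / np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))

def estimate_burnin(trace, step=100, threshold=1.96):
    """Smallest burn-in (in samples) after which the Geweke z-score looks acceptable."""
    for burn in range(0, len(trace) // 2, step):
        if abs(geweke_z(trace[burn:])) < threshold:
            return burn
    return None  # no acceptable burn-in found

# Example on a synthetic trace that starts far from its stationary distribution.
rng = np.random.default_rng(0)
trace = np.concatenate([rng.normal(5.0, 1.0, 500), rng.normal(0.0, 1.0, 4500)])
print(estimate_burnin(trace))
```

In practice such a check would be applied per parameter and combined with effective-sample-size estimates before posterior summaries are trusted.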
3. Comparison of Sampling-Based Algorithms for Multisensor Distributed Target Tracking (Nguyen, Trang, 2003)
Nonlinear filtering is central to estimation, since most real-world problems are nonlinear. Considerable recent progress in nonlinear filtering theory has been made in the area of sampling-based methods, including both random (Monte Carlo) and deterministic (quasi-Monte Carlo) sampling and their combination. This work considers the problem of tracking a maneuvering target in a multisensor environment. A novel scheme for distributed tracking is employed that utilizes a nonlinear target model and estimates from local (sensor-based) estimators. The resulting estimation problem is highly nonlinear and thus quite challenging. To evaluate the performance of the architecture considered, advanced sampling-based nonlinear filters are implemented: the particle filter (PF), the unscented Kalman filter (UKF), and the unscented particle filter (UPF). Results from extensive Monte Carlo simulations using different configurations of these algorithms are obtained to compare their effectiveness for the distributed target tracking problem.
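As a hedged illustration of the simplest of these filters, the sketch below implements a bootstrap (sequential importance resampling) particle filter in Python/NumPy on a standard toy scalar model; it is not the distributed multisensor architecture studied in the thesis, and the model, noise levels, and particle count are assumptions made for the example.

```python
# Minimal bootstrap particle filter (SIR) on a toy 1-D nonlinear benchmark model.
import numpy as np

rng = np.random.default_rng(1)

def transition(x, t):
    # Toy nonlinear state transition (a standard benchmark form).
    return 0.5 * x + 25.0 * x / (1.0 + x**2) + 8.0 * np.cos(1.2 * t)

def observe(x):
    return x**2 / 20.0

# Simulate a short trajectory and noisy measurements.
T, q, r = 50, 1.0, 1.0
x_true, ys = np.zeros(T), np.zeros(T)
x = 0.0
for t in range(T):
    x = transition(x, t) + rng.normal(0.0, np.sqrt(q))
    x_true[t] = x
    ys[t] = observe(x) + rng.normal(0.0, np.sqrt(r))

# Bootstrap particle filter: propagate, weight by the likelihood, resample.
N = 1000
particles = rng.normal(0.0, 1.0, N)
estimates = np.zeros(T)
for t in range(T):
    particles = transition(particles, t) + rng.normal(0.0, np.sqrt(q), N)
    weights = np.exp(-0.5 * (ys[t] - observe(particles))**2 / r) + 1e-300  # guard against zero weights
    weights /= weights.sum()
    estimates[t] = np.dot(weights, particles)
    particles = rng.choice(particles, size=N, p=weights)   # multinomial resampling

print(np.sqrt(np.mean((estimates - x_true)**2)))  # RMSE of the filtered estimate
```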
4. Efficient inference algorithms for network activities (Tran, Long Quoc, 2015)
The real social network and its associated communities are often hidden beneath the declared friend or group lists in social networks. We usually observe the manifestation of these hidden networks and communities in the form of recurrent, time-stamped individual activities in the social network. Inferring relationships between users/nodes or groups of users/nodes is further complicated when activities are interval-censored, that is, when one observes only the number of activities that occurred in certain time windows. The same phenomenon occurs in online advertising, where advertisers offer a set of advertisement impressions and observe a set of conversions (i.e., product/service adoptions). In this case, the advertisers wish to know which advertisements best appeal to customers and, most importantly, their conversion rates.
Inspired by these challenges, we investigate inference algorithms that efficiently recover user relationships in both cases: time-stamped data and interval-censored data. For time-stamped data, we propose a novel algorithm called NetCodec, which relies on a Hawkes process that models the intertwined relationship between group participation and between-user influence. Using Bayesian variational principles and optimization techniques, NetCodec infers both group participation and user influence simultaneously, with per-iteration complexity O((N+I)G), where N is the number of events, I is the number of users, and G is the number of groups. For interval-censored data, we propose a Monte Carlo EM inference algorithm in which we iteratively impute the time-stamped events using a Poisson process whose intensity function approximates the underlying intensity function. We show that the proposed simulation-based approach delivers better inference performance than baseline methods.
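For readers unfamiliar with the underlying model, the following Python sketch shows a univariate Hawkes process with an exponential kernel: its conditional intensity and a simple Ogata-thinning simulator. It is generic background for the self-exciting processes mentioned above, not NetCodec itself, and the parameters mu, alpha, and beta are illustrative.

```python
# Univariate Hawkes process with exponential kernel: intensity and Ogata thinning.
import numpy as np

def intensity(t, events, mu=0.2, alpha=0.5, beta=1.0):
    """Conditional intensity lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i))."""
    past = events[events < t]
    return mu + alpha * np.exp(-beta * (t - past)).sum()

def simulate_hawkes(T, mu=0.2, alpha=0.5, beta=1.0, seed=0):
    """Simulate event times on [0, T] by Ogata's thinning algorithm."""
    rng = np.random.default_rng(seed)
    events, t = [], 0.0
    while t < T:
        # Conservative upper bound on the intensity until the next candidate point.
        lam_bar = intensity(t, np.array(events), mu, alpha, beta) + alpha
        t += rng.exponential(1.0 / lam_bar)
        if t < T and rng.uniform() * lam_bar <= intensity(t, np.array(events), mu, alpha, beta):
            events.append(t)
    return np.array(events)

events = simulate_hawkes(T=100.0)
print(len(events), intensity(100.0, events))
```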
In the advertisement problem, we propose a click-to-conversion delay model that uses Hawkes processes to model the advertisement impressions and thinned Poisson processes to model the click-to-conversion mechanism. We then derive an efficient maximum likelihood estimator that utilizes the minorization-maximization framework. We verify the model against real-life online advertisement logs, in comparison with recent conversion-rate estimation methods.
To facilitate reproducible research, we also developed an open-source software package covering the Hawkes process models proposed in the above works and in prior work. We provide efficient parallel (multi-core) implementations of the inference algorithms based on the Bayesian variational inference framework. To further speed up these inference algorithms, we explore distributed techniques for convex optimization in the distributed-data setting. We formulate this problem as a consensus-constrained optimization problem and solve it with the alternating direction method of multipliers (ADMM). It turns out that using a bipartite graph as the communication topology yields the fastest convergence.
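The sketch below illustrates the consensus-constrained formulation on a toy distributed least-squares problem, with each worker holding its own data block and a simple averaging (star) consensus step; the data, penalty parameter, and topology are assumptions made for the example (the thesis reports that a bipartite communication topology converges fastest).

```python
# Consensus ADMM sketch: minimize sum_i ||A_i x_i - b_i||^2 subject to x_i = z for all i.
import numpy as np

rng = np.random.default_rng(2)
d, workers = 5, 4
x_star = rng.normal(size=d)
As = [rng.normal(size=(50, d)) for _ in range(workers)]
data = [(A, A @ x_star + 0.1 * rng.normal(size=50)) for A in As]

rho = 1.0
xs = [np.zeros(d) for _ in range(workers)]
us = [np.zeros(d) for _ in range(workers)]
z = np.zeros(d)

for _ in range(100):
    # Local updates: each worker solves a small regularized least-squares problem.
    for i, (A, b) in enumerate(data):
        xs[i] = np.linalg.solve(2 * A.T @ A + rho * np.eye(d),
                                2 * A.T @ b + rho * (z - us[i]))
    # Consensus update: average of local variables plus scaled duals.
    z = np.mean([x + u for x, u in zip(xs, us)], axis=0)
    # Dual updates.
    us = [u + x - z for x, u in zip(xs, us)]

print(np.linalg.norm(z - x_star))  # consensus iterate approaches the true coefficients
```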
5. Contagious knowledge: a study in the epistemology of testimony (Wanderer, Jeremy, 2002)
Knowledge is contagious, at least in the sense that the testimony of others can, on occasion, be a source of knowledge. Theories of the epistemology of testimony attempt to account for this, and one can discern two broad themes emerging from the currently burgeoning literature. The first is an inferentialist conception, according to which the justification for testimonial-based beliefs is a form of inductive reasoning, involving appeal to the general reliability of testimony established either as a result of past experience or through a priori reasoning. The second is a transmission conception, according to which the original, non-testimonial justification for the belief is transmitted to the recipient through the act of learning from testimony. In the first part of the thesis, I argue that both conceptions are inadequate. The inferentialist conception fails to distinguish, as I argue it must, between the epistemology of testimony and other instances of learning from others. The transmission conception ignores the central role that the notion of a perspective plays in epistemic practices. Further, both conceptions fail to take seriously the rich epistemic resources provided by an adequate account of the distinct, experiential state that one enters into as a result of understanding an act of testimony. In the second part of the thesis, I provide just such a rich conception of testimonial experience. Firstly, I defend an account of the epistemic role of perceptual experiential states. Secondly, I defend a parallel between perceptual and testimonial experiential states that allows for a similarity in epistemic role. Thirdly, I develop an account of the act of understanding others that is congenial to the notion of testimonial experience. The 'contagion' metaphor is particularly appropriate in light of the conception that emerges, allowing as it does for an epistemically direct account of acquiring knowledge through testimony.
6. Inferential reasoning during the psychodiagnostic assessment: attribution, hypothesis-testing strategies, and final inferences as a function of theoretical orientation, level of experience, and temporal order (Goodin Waxman, Tina, 1991)
Psychological problem representation, a complex task, is underpinned by clinicians' inferential processes, processes which are not immune to bias and logical error. Problem representation influenced by faulty reasoning can have deleterious effects on treatment planning and, consequently, on treatment outcome. Using a think-aloud procedure, 32 clinical and counselling psychologists examined a case file of the initial interview. The following three variables thought to contribute to clinicians' conceptualization of clients' problems were investigated: (1) theoretical orientation, (2) level of experience, and (3) temporal order of client information. The reasoning processes of psychodynamically-oriented and behaviourally-oriented clinicians were studied. The existence of expertise in psychotherapy was examined, and hypothesis generation was compared between novices and experts. The impact of the temporal order of client information on the inferential processes of clinicians was also investigated. Subjects processed one of two versions of the case file. Levels of theoretical orientation, experience, and temporal order were compared in order to determine (1) whether clinicians posit significantly more dispositional hypotheses than contextual hypotheses, (2) whether initial hypotheses of a dispositional or contextual nature are confirmed or disconfirmed, and (3) whether initial hypotheses of a dispositional or contextual nature are retained or rejected. Based on absolute numbers, psychodynamicists posited more contextual inferences than behaviourists. Novices confirmed more dispositional inferences than experts. Clinicians rarely disconfirmed or rejected their initial hypotheses. Behaviourists retained more dispositional inferences than psychodynamicists. The order of the case material significantly affected the types of hypotheses generated and the hypothesis-testing strategies utilized. No significant differences, however, were found in the proportion of dispositional inferences.
7. Modeling multiple granularities of spatial objects (Ramalingam, Chitra, 2002)
Thesis (M.S.) in Spatial Information Science and Engineering--University of Maine, 2002. Includes vita. Includes bibliographical references (leaves 118-124).
8. Modeling Multiple Granularities of Spatial Objects (Ramalingam, Chitra, 2002)
No description available.
9. Adaptive inference and its application to protein modeling (Jiang, Bo, 2008)
Thesis (M.S.E.C.E.)--University of Massachusetts Amherst, 2008. Includes bibliographical references (p. 42-43).
10. Finding common support and assessing matching methods for causal inference (Mahmood, Sharif)
Doctor of Philosophy, Department of Statistics, Michael J. Higgins.
This dissertation presents an approach to assess and validate causal inference tools for estimating the causal effect of a treatment. Finding treatment effects in observational studies is complicated by the need to control for confounders. Common approaches to controlling for them include using prognostically important covariates to form groups of similar units containing both treatment and control units, or modeling responses through interpolation. This dissertation proposes a series of new, computationally efficient methods to improve the analysis of observational studies.
Treatment effects are reliably estimated only for a subpopulation in which a common support assumption holds, that is, one in which the treatment and control covariate spaces overlap. Given a distance metric measuring dissimilarity between units, graph theory is used to find common support: an adjacency graph is constructed in which edges are drawn between similar treated and control units, and regions of common support are determined by finding the largest connected components (LCC) of this graph. The results show that LCC improves on existing methods by efficiently constructing regions that preserve clustering in the data while ensuring interpretability of the region through the distance metric.
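As a hedged illustration of this construction, the Python sketch below connects each treated unit to the control units within a caliper distance and keeps the largest connected component of the resulting bipartite adjacency graph; the Euclidean metric, caliper value, and simulated covariates are assumptions for the example rather than the dissertation's exact procedure.

```python
# Common-support sketch: caliper-based adjacency graph and its largest connected component.
import numpy as np
from collections import defaultdict, deque

rng = np.random.default_rng(3)
X_treat = rng.normal(0.5, 1.0, size=(40, 2))   # covariates of treated units
X_ctrl = rng.normal(0.0, 1.0, size=(60, 2))    # covariates of control units
caliper = 0.5                                  # maximum allowed dissimilarity

# Bipartite adjacency graph: an edge joins a treated and a control unit that are similar.
adj = defaultdict(list)
for i, xt in enumerate(X_treat):
    for j, xc in enumerate(X_ctrl):
        if np.linalg.norm(xt - xc) <= caliper:  # Euclidean distance as the (assumed) metric
            adj[("t", i)].append(("c", j))
            adj[("c", j)].append(("t", i))

# Breadth-first search to enumerate connected components; keep the largest one.
seen, components = set(), []
for node in list(adj):
    if node in seen:
        continue
    queue, comp = deque([node]), set()
    seen.add(node)
    while queue:
        u = queue.popleft()
        comp.add(u)
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    components.append(comp)

largest = max(components, key=len) if components else set()
print(len(largest), "units in the region of common support")
```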
This approach is extended to propose a new matching method called largest caliper matching (LCM). LCM is a version of cardinality matching, a type of matching used to maximize the number of units in an observational study under a covariate balance constraint between treatment groups. While traditional cardinality matching is NP-hard, LCM can be completed in polynomial time. The performance of LCM and of five other popular matching methods is assessed through a series of Monte Carlo simulations, measured by the bias, empirical standard deviation, and mean squared error of the estimates under different treatment prevalences and different covariate distributions. The resulting matched samples improve estimation of the population treatment effect in a wide range of settings and suggest cases in which certain matching algorithms perform better than others. Finally, this dissertation presents an application of LCC and matching methods to a study of the effectiveness of right heart catheterization (RHC) and finds that clinical outcomes are significantly worse for patients who undergo RHC.
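For contrast, the following sketch shows a simplified greedy nearest-neighbor caliper matching routine; it is not the dissertation's LCM algorithm (which maximizes the number of matched units), but it illustrates the kind of caliper-constrained pairing that such matching methods perform.

```python
# Greedy nearest-neighbor caliper matching sketch (illustrative, not LCM).
import numpy as np

def greedy_caliper_match(X_treat, X_ctrl, caliper):
    """Pair each treated unit with the closest unused control unit within the caliper."""
    used, pairs = set(), []
    for i, xt in enumerate(X_treat):
        dists = np.linalg.norm(X_ctrl - xt, axis=1)
        for j in np.argsort(dists):
            if dists[j] > caliper:
                break                      # no admissible control left for this unit
            if j not in used:
                used.add(j)
                pairs.append((i, int(j)))
                break
    return pairs

rng = np.random.default_rng(4)
X_treat = rng.normal(0.5, 1.0, size=(40, 2))
X_ctrl = rng.normal(0.0, 1.0, size=(60, 2))
pairs = greedy_caliper_match(X_treat, X_ctrl, caliper=0.5)
print(f"{len(pairs)} of {len(X_treat)} treated units matched")
```

Because the greedy pass depends on the order of treated units, it can leave matchable units unmatched, which is exactly the shortfall that cardinality-style methods such as LCM are designed to avoid.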