481. Bayesian ridge estimation of age-period-cohort models. Xu, Minle, 02 October 2014
Age-period-cohort models offer a useful framework for studying trends of time-specific phenomena in various areas. Yet the exact linear relationship among age, period, and cohort (Cohort = Period - Age) induces a singular design matrix and gives rise to the identification problem of the age-period-cohort model. Over the last few decades, multiple methods have been proposed to cope with the identification issue, e.g., the intrinsic estimator (IE), which may be viewed as a limiting form of ridge regression. This study views the ridge estimator from a Bayesian perspective by introducing a prior distribution for the ridge parameter(s). Data used in this study describe the incidence rate of cervical cancer among Ontario women from 1960 to 1994. Results indicate that a Bayesian ridge model with a common prior for the ridge parameter yields estimates of age, period, and cohort effects similar to those based on the intrinsic estimator and to those based on a ridge estimator. The performance of Bayesian models with distinct priors for the ridge parameters of age, period, and cohort effects is more sensitive to the choice of prior distributions. In sum, a Bayesian ridge model is an alternative way to deal with the identification problem of the age-period-cohort model. Future studies should further investigate the influence of different prior choices on Bayesian ridge models.
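As a hedged illustration (not the estimator developed in the thesis), the sketch below builds a dummy-coded age-period-cohort design matrix, which is rank-deficient because of the identity Cohort = Period - Age, and computes a ridge estimate. The ridge solution is the posterior mode under a zero-mean Gaussian prior whose precision is the ridge parameter lam, which is the quantity that would receive a prior in a Bayesian ridge model; all data below are synthetic placeholders.

```python
import numpy as np

# Hypothetical illustration: dummy-coded APC design and a ridge estimate.
ages, periods = 5, 5
n_cohorts = ages + periods - 1
rng = np.random.default_rng(0)
y = rng.normal(size=ages * periods)          # placeholder log incidence rates

rows = []
for a in range(ages):
    for p in range(periods):
        c = p - a + (ages - 1)               # cohort index from Cohort = Period - Age
        x = np.zeros(1 + ages + periods + n_cohorts)
        x[0] = 1.0                           # intercept
        x[1 + a] = 1.0                       # age effect
        x[1 + ages + p] = 1.0                # period effect
        x[1 + ages + periods + c] = 1.0      # cohort effect
        rows.append(x)
X = np.array(rows)   # rank-deficient: full dummy coding plus the APC identity

# Ridge estimate = posterior mode under a zero-mean Gaussian prior on the
# effects with precision lam; lam is what would receive a prior in the
# Bayesian ridge model.
lam = 1.0
beta = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
```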
482. Predicting influenza hospitalizations. Ramakrishnan, Anurekha, 15 October 2014
Seasonal influenza epidemics are a major public health concern, causing three to five million cases of severe illness and about 250,000 to 500,000 deaths worldwide. Given the unpredictability of these epidemics, hospitals and health authorities are often left unprepared to handle the sudden surge in demand. Hence, early detection of disease activity is fundamental to reducing the burden on the healthcare system, providing the most effective care for infected patients, and optimizing the timing of control efforts. Early detection requires reliable forecasting methods that make efficient use of surveillance data. We developed a dynamic Bayesian estimator to predict weekly hospitalizations due to influenza-related illnesses in the state of Texas. The model's predictions of peak hospitalizations are accurate both in the number of hospitalizations and in the time at which the peak occurs: for 1- to 8-week predictions, the predicted number of hospitalizations was within 8% of the actual value, and the predicted time of occurrence was within a week of the actual peak.
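The abstract does not spell out the estimator, so the following is only a generic sketch of sequential Bayesian updating for a weekly count series: a local-level (random-walk) state observed on the log scale, updated with a scalar Kalman filter. The noise variances and the synthetic data are assumptions made purely for illustration.

```python
import numpy as np

def kalman_step(m, P, y, q=0.05, r=0.1):
    """One predict/update cycle for a random-walk state observed with noise."""
    m_pred, P_pred = m, P + q            # predict: local-level (random-walk) model
    K = P_pred / (P_pred + r)            # Kalman gain
    m_new = m_pred + K * (y - m_pred)    # update with the new weekly observation
    P_new = (1 - K) * P_pred
    return m_new, P_new

# Example: filter noisy log weekly hospitalization counts (synthetic data).
rng = np.random.default_rng(1)
true_log = np.log(50) + np.cumsum(rng.normal(0, 0.05, 30))
obs_log = true_log + rng.normal(0, 0.1, 30)

m, P = obs_log[0], 1.0
for y in obs_log[1:]:
    m, P = kalman_step(m, P, y)
print("filtered estimate of current weekly hospitalizations:", round(np.exp(m)))
```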
483. Adaptive Bayesian P-splines models for fitting time-activity curves and estimating associated clinical parameters in Positron Emission Tomography and Pharmacokinetic study. Jullion, Astrid, 01 July 2008
In clinical experiments, the evolution of a product's concentration in tissue over time is often under study. Different products and tissues may be considered. For instance, one could analyse the evolution of drug concentration in plasma over time, by performing successive blood sampling on the subjects participating in the clinical study. One could also observe the evolution of radioactivity uptake in different regions of the brain during a PET (Positron Emission Tomography) scan. The overall objective of this thesis is the modelling of such evolutions, which will be called, generically, pharmacokinetic curves (PK curves).
Some clinical measures of interest are derived from PK curves. For instance, when analysing the evolution of drug concentration in plasma, PK parameters such as the area under the curve (AUC), the maximal concentration (Cmax), and the time at which it occurs (tmax) are usually reported. In a PET study, one could measure receptor occupancy (RO) in some regions of the brain, i.e. the percentage of specific receptors to which the drug is bound. Such clinical measures may be badly estimated if the PK curves are noisy. Our objective is to provide statistical tools to obtain better estimates of the clinical measures of interest from appropriately smoothed PK curves.
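For concreteness, here is a minimal sketch of how AUC, Cmax, and tmax can be read off a sampled (or fitted) concentration-time curve using trapezoidal integration; the curve shape and units below are invented purely for illustration.

```python
import numpy as np

def pk_summary(t, conc):
    """AUC (trapezoidal rule), Cmax, and tmax from a concentration-time curve."""
    auc = np.sum((conc[1:] + conc[:-1]) * np.diff(t)) / 2.0   # trapezoidal AUC
    i = int(np.argmax(conc))
    return auc, conc[i], t[i]                                 # AUC, Cmax, tmax

# Example: a one-compartment oral-absorption shape used only as a stand-in.
t = np.linspace(0, 24, 97)                          # hours
conc = 10 * (np.exp(-0.1 * t) - np.exp(-1.0 * t))   # arbitrary units
auc, cmax, tmax = pk_summary(t, conc)
print(f"AUC = {auc:.1f}, Cmax = {cmax:.2f} at tmax = {tmax:.2f} h")
```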
A large body of literature addresses the problem of fitting PK curves with parametric models, usually relying on a compartmental approach to describe the kinetics of the product under study. The use of parametric models to fit PK curves can lead to problems in some specific cases. Firstly, the estimation procedures rely on algorithms whose convergence can be hard to attain with sparse and/or noisy data. Secondly, it may be difficult to choose an adequate underlying compartmental model, especially when a new drug is under study and its kinetics are not well known.
The method that we advocate to fit such PK curves is based on Bayesian penalized splines (P-splines): it provides good results both in fitting PK curves and in estimating the clinical measures. It avoids the difficult choice of a compartmental model and is more robust than parametric models to a small sample size or a low signal-to-noise ratio. Working in a Bayesian context provides several advantages: prior information can be injected, models can easily be generalized and extended to hierarchical settings, and the uncertainty of associated clinical parameters is straightforwardly derived from credible intervals obtained by MCMC methods. These are major advantages over traditional frequentist approaches.
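The following is a minimal sketch of the P-spline idea only: a B-spline basis combined with a second-order difference penalty, fitted by penalized least squares. In the Bayesian version described above, the smoothing parameter receives a prior and estimation proceeds by MCMC; here it is held fixed, and the basis size, penalty order, and synthetic time-activity data are assumptions for illustration.

```python
import numpy as np
from scipy.interpolate import BSpline

def bspline_basis(x, n_basis=20, degree=3):
    """Evaluate a clamped cubic B-spline basis on x (equally spaced knots)."""
    lo, hi = x.min(), x.max()
    inner = np.linspace(lo, hi, n_basis - degree + 1)
    knots = np.r_[[lo] * degree, inner, [hi] * degree]
    B = np.empty((x.size, n_basis))
    for j in range(n_basis):
        coefs = np.zeros(n_basis)
        coefs[j] = 1.0
        B[:, j] = BSpline(knots, coefs, degree)(x)   # j-th basis function
    return B

def pspline_fit(x, y, lam=10.0, n_basis=20):
    """Penalized least-squares P-spline fit; lam would get a prior in MCMC."""
    B = bspline_basis(x, n_basis)
    D = np.diff(np.eye(n_basis), n=2, axis=0)        # second-order differences
    coef = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)
    return B @ coef                                  # smoothed curve

# Synthetic time-activity data (arbitrary units), for illustration only.
rng = np.random.default_rng(2)
t = np.linspace(0, 60, 40)                           # minutes
true = 5 * t * np.exp(-t / 15)
y = true + rng.normal(0, 0.8, t.size)
smooth = pspline_fit(t, y)
```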
484. "Explaining-Away" Effects in Rule-Learning: Evidence for Generative Probabilistic Inference in Infants and Adults. Dawson, Colin Reimer, January 2011
The human desire to explain the world is the driving force behind our species' rich history of scientific and technological advancement. The ability of successive generations to build cumulatively on the scientific progress made by their ancestors rests on the ability of individual minds to rapidly assimilate the explanatory models developed by those who came before. But is this explanatory, model-based way of thinking limited to deliberate, conscious cognition, with the larger, unconscious portion of the workings of the mind dependent on simpler mechanisms of association and prediction, or is explanation a more fundamental drive? In this dissertation I explore theoretical, empirical, and computational attempts to shed some light on this question. I first present a number of theoretical advantages that model-based learning has over its associative counterparts. I focus particularly on the inferential phenomenon of explaining away, which is difficult to account for in a model-free system of learning. Next I review some recent empirical literature which helps to establish just what mechanisms of learning are available to human infants and adults, including a number of findings that suggest that there is more to learning than mere prediction. Among these are a number of experiments suggesting that explaining away occurs in a variety of cognitive domains. Having set the stage, I report a new set of experiments, one with infants and two with adults, along with a related computational model, which provide further evidence for unconscious explaining away, and hence for some form of model-based inference, in the domain of abstract, relational pattern learning. In particular, I find that when learners are presented with a novel environment of tone sequences, the structure of their initial experience with that environment, and implicitly the model of the environment which best accounts for that experience, influences what kinds of abstract structure can easily be learned later. If indeed learners are able to construct explanatory models of particular domains of experience which are then used to learn the details of each domain, it may undermine claims by some philosophers and cognitive scientists that asymmetries in learning across domains constitute evidence for an innately modular organization of the mind.
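As a self-contained numerical illustration of explaining away (unrelated to the specific stimuli used in the experiments), the sketch below sets up two independent binary causes with a common effect under a noisy-OR link: once the effect is observed, learning that the second cause is present lowers the posterior probability of the first. All probabilities are invented for the example.

```python
from itertools import product

# Two independent binary causes A, B and a common effect E (noisy-OR link).
pA, pB = 0.3, 0.3

def p_effect(a, b):
    """P(E=1 | A=a, B=b) with a small leak probability."""
    return 1 - (1 - 0.8 * a) * (1 - 0.8 * b) * (1 - 0.05)

def posterior_A(evidence):
    """P(A=1 | evidence), where evidence is a dict over {'B', 'E'}."""
    num = den = 0.0
    for a, b in product([0, 1], repeat=2):
        if 'B' in evidence and b != evidence['B']:
            continue
        pE = p_effect(a, b)
        pe = pE if evidence['E'] == 1 else 1 - pE
        w = (pA if a else 1 - pA) * (pB if b else 1 - pB) * pe
        den += w
        if a == 1:
            num += w
    return num / den

print("P(A=1 | E=1)      =", round(posterior_A({'E': 1}), 3))
print("P(A=1 | E=1, B=1) =", round(posterior_A({'E': 1, 'B': 1}), 3))  # lower: B explains E away
```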
485. Probabilistic Control: Implications for the Development of Upper Limb Neuroprosthetics. Anderson, Chad, January 2007
Functional electrical stimulation (FES) involves artificial activation of paralyzed muscles via implanted electrodes. FES has been successfully used to improve the ability of tetraplegics to perform upper limb movements important for daily activities. The variety of movements that can be generated by FES is, however, limited to a few movements such as hand grasp and release. Ideally, a user of an FES system would have effortless command over all of the degrees of freedom associated with upper limb movement. One reason that a broader range of movements has not been implemented is the substantial challenge of identifying the patterns of muscle stimulation needed to elicit additional movements. The first part of this dissertation addresses this challenge by using a probabilistic algorithm to estimate the patterns of muscle activity associated with a wide range of upper limb movements.

A neuroprosthetic involves the control of an external device via brain activity. Neuroprosthetics have been successfully used to improve the ability of tetraplegics to perform tasks important for interfacing with the world around them. The variety of mechanisms they can control is, however, limited to a few devices such as special computer typing programs. Because motor areas of the cerebral cortex are known to represent and regulate voluntary arm movements, it might be possible to sense this activity with electrodes and decipher this information as a moment-by-moment representation of arm trajectory. Indeed, several methods for decoding neural activity have been described, but these approaches are encumbered by technical difficulties. The second part of this dissertation addresses this challenge by using similar probabilistic methods to extract arm trajectory information from electroencephalography (EEG) electrodes that are already chronically deployed and widely used in human subjects.

Ultimately, the two approaches developed as part of this dissertation might serve as a flexible controller for interfacing brain activity with functional electrical stimulation systems, realizing a brain-controlled upper-limb neuroprosthetic system capable of eliciting natural movements. Such a system would effectively bypass the injured region of the spinal cord and reanimate the arm, greatly increasing movement capability and independence in paralyzed individuals.
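The abstract does not specify the probabilistic decoder, so the sketch below is only a generic example of the kind of mapping involved: Bayesian linear regression from multichannel neural features to a kinematic variable, with a predictive variance for each decoded sample. Feature dimensions, priors, and data are all synthetic assumptions, not the decoder developed in the dissertation.

```python
import numpy as np

# Generic sketch: Bayesian linear regression mapping neural features (e.g.,
# band power in several channels) to a kinematic variable such as hand velocity.
rng = np.random.default_rng(3)
n, d = 500, 8                                   # time samples, feature channels
X = rng.normal(size=(n, d))                     # stand-in neural features
w_true = rng.normal(size=d)
v = X @ w_true + rng.normal(0, 0.5, n)          # stand-in hand velocity

alpha, sigma2 = 1.0, 0.25                       # prior precision, noise variance
A = X.T @ X / sigma2 + alpha * np.eye(d)        # posterior precision of weights
w_mean = np.linalg.solve(A, X.T @ v / sigma2)   # posterior mean of decoder weights
w_cov = np.linalg.inv(A)

x_new = rng.normal(size=d)                      # decode one new time sample
v_hat = x_new @ w_mean
v_var = sigma2 + x_new @ w_cov @ x_new          # predictive variance
print(f"decoded velocity: {v_hat:.2f} +/- {np.sqrt(v_var):.2f}")
```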
486. Assessing and Optimizing Pinhole SPECT Imaging Systems for Detection Tasks. Gross, Kevin Anthony, January 2006
The subject of this dissertation is the assessment and optimization of image quality for multiple-pinhole, multiple-camera SPECT systems. These systems collect gamma-ray photons emitted from an object using pinhole apertures. Conventional measures of image quality, such as the signal-to-noise ratio or the modulation transfer function, do not predict how well a system's images can be used to perform a relevant task. This dissertation takes the stance that the ultimate measure of image quality is how well images produced by a system can be used to perform a task. Furthermore, we recognize that image quality is inherently a statistical concept that must be assessed as the average task performance across a large ensemble of images.

The tasks considered in this dissertation are detection tasks. Namely, we consider detecting a known three-dimensional signal embedded in a three-dimensional stochastic object using the Bayesian ideal observer. Out of all possible observers (human or otherwise), the ideal observer sets the absolute upper bound for detection task performance by using all possible information in the image data. By employing a stochastic object model we can account for the effects of object variability, which has a large effect on observer performance.

An imaging system whose hardware has been optimized for ideal observer detection task performance is an imaging system that maximally transfers detection-task-relevant information to the image data. The theory and simulation of image quality, detection tasks, and gamma-ray imaging are presented. Assessments of ideal observer detection task performance are used to optimize imaging hardware for SPECT systems as well as to rank different imaging system designs.
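For intuition only, the sketch below works out the simplest special case: a signal known exactly in correlated Gaussian noise, where the ideal observer reduces to a prewhitened matched filter and its performance is summarized by the detectability index d' (equivalently, the area under the ROC curve). The dissertation's stochastic object models are far more general and typically require the ideal observer to be estimated numerically; the signal, covariance, and dimensions below are assumptions for illustration.

```python
import numpy as np
from math import erf

# Signal-known-exactly detection in correlated Gaussian noise.
rng = np.random.default_rng(4)
m = 64                                                  # pixels in a toy 1-D image
s = np.exp(-0.5 * ((np.arange(m) - 32) / 3) ** 2)       # known signal profile

# Toy stationary noise covariance (object variability + detector noise).
idx = np.abs(np.subtract.outer(np.arange(m), np.arange(m)))
K = 0.5 * 0.8 ** idx + 0.1 * np.eye(m)

Kinv_s = np.linalg.solve(K, s)                          # prewhitened matched filter
d_prime = np.sqrt(s @ Kinv_s)                           # ideal-observer detectability
auc = 0.5 * (1 + erf(d_prime / 2))                      # area under the ROC curve
print(f"d' = {d_prime:.2f}, AUC = {auc:.3f}")
```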
487. Preclassic Excavations at Punta de Chimino, Peten, Guatemala: Investigating Social Emplacement on an Early Maya Landscape. Bachand, Bruce Robert, January 2006
Two excavation seasons in Punta de Chimino's E-Group Acropolis provide a record of monument construction, refurbishment, desecration, and abandonment. This evidence is used to explore the material dimensions of social emplacement--any act, event, practice, or behavior that affects the way a community and its descendants relate to a particular locality over time. The attributes and treatment of monuments are taken to signify cultural and political dispositions. An extensive overview of Preclassic and Protoclassic Maya archaeology situates Punta de Chimino's monumental remains in different historical settings. Bayesian analysis of the stratified sequence of radiocarbon and luminescence dates is used to accurately pinpoint the timing of specific cultural events. Stratigraphy and radiometry allow refinement of the Punta de Chimino ceramic sequence. In the end, varied lines of material evidence are garnered to infer changing social orientations toward Punta de Chimino's ceremonial precinct and the ancient Mesoamerican world at large.
488. Essays in Industrial Organization. Hawkins, Jenny Rae, January 2011
This dissertation consists of three essays evaluating topics in industrial organization. The first essay investigates a market structure or property regime in which a final good exists only by assembling multiple, monopoly-supplied components. In such dynamic settings, any sunk cost results in hold-up, also known as the tragedy of the anticommons. I design a model showing conditions under which two factors that reduce sunk cost, refunds and complementarities, mitigate hold-up. If the first component purchased has positive stand-alone value, or if the first seller offers a full refund, hold-up is mitigated. My results suggest several policies that can mitigate inefficient outcomes in assembly problems, including legal requirements on full refunds, regulation of the purchasing order of components, and prohibition of price discrimination. The second essay applies Bayesian statistics to single-firm event studies used in securities litigation and antitrust investigations. Inference based on Bayesian analysis does not require the assumption of normality that potentially invalidates standard inference in classical single-firm event studies. I investigate ten events, five from actual securities litigation cases. Various Bayesian models, including a replication of the frequentist approach, are examined. A flexible Bayesian model, replacing parametric likelihood functions with the empirical distribution function, is also explored. This approach suggests an alternative, valid method for inference with easy implementation and interpretation. The third essay, motivated in the context of pharmaceutical advertising, analyzes demand rotations caused by an exogenously determined advertising parameter under Cournot oligopoly competition. We find that firms and consumers prefer extreme levels of advertising, but their preferences for which extreme do not necessarily align. However, these differences can be alleviated with few or many firms in the market, or with cheap or expensive technologies. Therefore, advertising levels, regulated or not, might not serve consumers' best interests unless certain market attributes hold.
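To make the event-study setting concrete, here is a stylized single-firm example on synthetic returns: a market model is fit over an estimation window, and the event-day abnormal return is compared against the empirical distribution of the estimation-window residuals, echoing the idea of replacing a parametric likelihood with the empirical distribution function. It is not the model from the essay, and all returns and window lengths are invented.

```python
import numpy as np

# Stylized single-firm event study on synthetic, heavy-tailed returns.
rng = np.random.default_rng(5)
T = 250
mkt = rng.normal(0.0004, 0.01, T)                           # market returns
firm = 0.0002 + 1.2 * mkt + rng.standard_t(4, T) * 0.012    # firm returns

X = np.column_stack([np.ones(T), mkt])                      # market-model design
alpha, beta = np.linalg.lstsq(X, firm, rcond=None)[0]
resid = firm - (alpha + beta * mkt)                         # estimation-window residuals

event_mkt, event_firm = -0.005, -0.060                      # hypothetical event-day returns
ar = event_firm - (alpha + beta * event_mkt)                # abnormal return
tail_prob = np.mean(resid <= ar)                            # empirical left-tail probability
print(f"abnormal return {ar:.3%}, empirical tail probability {tail_prob:.3f}")
```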
489. Statistical classification of magnetic resonance imaging data. Acosta Mena, Dionisio M., January 2001
No description available.
490. Nonlinear Transformations and Filtering Theory for Space Operations. Weisman, Ryan Michael, 14 March 2013
Decisions for asset allocation and protection are predicated upon accurate knowledge of the current operating environment as well as a correct characterization of how that environment evolves over time. The desired kinematic and kinetic states of the objects in question cannot be measured directly in most cases and instead are inferred or estimated from available measurements using a filtering process. Often, nonlinear transformations between the measurement domain and the desired state domain distort the state-domain probability density function, yielding a form that does not necessarily resemble the form assumed in the filtering algorithm. The distortion effect must be understood in greater detail and appropriately accounted for so that even if sensors, state estimation algorithms, and state propagation algorithms operate in different domains, they can all be effectively utilized without information loss due to domain transformations.
This research presents an analytical investigation into how nonlinear transformations of stochastic, but characterizable, processes affect state and uncertainty estimation, with direct application to space object surveillance and spacecraft attitude determination. Analysis is performed with attention to the construction of the state-domain probability density function, since state uncertainty and correlation are derived from the statistical moments of the probability density function. Analytical characterization of the effect nonlinear transformations impart on the structure of state probability density functions has direct application to conventional nonlinear filtering and propagation algorithms in three areas: (1) understanding how smoothing algorithms used to estimate indirectly observed states impact state uncertainty, (2) justification or refutation of the assumed state uncertainty distribution for more realistic uncertainty quantification, and (3) analytic automation of the initial state estimate and covariance in lieu of user tuning.
A nonlinear filtering algorithm based upon Bayes’ Theorem is presented to account for the impact nonlinear domain transformations impart on probability density functions during the measurement update and propagation phases. The algorithm is able to accommodate different combinations of sensors for state estimation and can also be used to hypothesize system parameters or unknown states from available measurements, because the information is appropriately accounted for.
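As a small, hedged illustration of the distortion being described (not the algorithm developed in the thesis), the sketch below pushes a Gaussian range/bearing measurement density into Cartesian coordinates and compares Monte Carlo moments with a first-order, Jacobian-based (EKF-style) propagation; the mismatch in the means shows how the nonlinearity bends the density. All numbers are arbitrary.

```python
import numpy as np

# Propagate a range/bearing Gaussian into Cartesian coordinates.
rng = np.random.default_rng(6)
r_mean, th_mean = 1000.0, np.pi / 4          # range [m], bearing [rad]
r_sig, th_sig = 5.0, 0.1

r = rng.normal(r_mean, r_sig, 100_000)
th = rng.normal(th_mean, th_sig, 100_000)
xy = np.column_stack([r * np.cos(th), r * np.sin(th)])

mc_mean, mc_cov = xy.mean(axis=0), np.cov(xy.T)   # Monte Carlo moments

# Linearized (EKF-style) propagation through the Jacobian at the mean.
J = np.array([[np.cos(th_mean), -r_mean * np.sin(th_mean)],
              [np.sin(th_mean),  r_mean * np.cos(th_mean)]])
lin_mean = np.array([r_mean * np.cos(th_mean), r_mean * np.sin(th_mean)])
lin_cov = J @ np.diag([r_sig**2, th_sig**2]) @ J.T

# The means disagree: the nonlinear transformation bends and biases the density.
print("Monte Carlo mean:", mc_mean, " linearized mean:", lin_mean)
```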