1. Bayesian Modeling Strategies for Complex Data Structures, with Applications to Neuroscience and Medicine
Lu, Feihan (January 2018)
Bayesian statistical procedures use probabilistic models and probability distributions to summarize data, estimate unknown quantities of interest, and predict future observations. The procedures borrow strength across observations in a dataset through prior distributions and/or hierarchical model specifications. Posterior sampling techniques can handle issues, e.g., missing data imputation and extraction of parameters (and their functional forms), that would otherwise be difficult to address with conventional methods. In this dissertation, we propose Bayesian modeling strategies to address various challenges arising in the fields of neuroscience and medicine. Specifically, we propose a sparse Bayesian hierarchical Vector Autoregressive (VAR) model to map human brain connectivity using multi-subject, multi-session functional magnetic resonance imaging (fMRI) data. We apply the same model to patient diary databases, focusing on patient-level prediction of medical conditions using posterior predictive samples. We also propose a Bayesian model with an augmented Markov Chain Monte Carlo (MCMC) algorithm on repeat Electrical Stimulation Mappings (ESM) to evaluate the variability of localization in brain sites responsible for language function. We close by using Bayesian disproportionality analyses on spontaneous reporting system (SRS) databases for post-market drug safety surveillance, illustrating the caution required in real-world analysis and decision making.
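The core autoregressive structure behind a VAR model of brain connectivity can be sketched as follows. This is a minimal illustration only: it simulates a VAR(1) process with a sparse connectivity matrix and recovers it by least squares, whereas the dissertation's model instead places sparsity-inducing priors on the coefficients within a subject/session hierarchy and samples from the posterior. All names and dimensions here are illustrative.

```python
import numpy as np

# VAR(1) sketch: x_t = A x_{t-1} + noise, where A encodes directed
# connectivity between regions (a zero entry means no direct influence).
rng = np.random.default_rng(0)
p = 3                                  # number of brain regions (illustrative)
A_true = np.array([[0.5, 0.0, 0.0],    # sparse connectivity matrix
                   [0.3, 0.4, 0.0],
                   [0.0, 0.0, 0.6]])
T = 2000                               # number of time points
x = np.zeros((T, p))
for t in range(1, T):
    x[t] = A_true @ x[t - 1] + 0.1 * rng.standard_normal(p)

# Least-squares estimate of A from lagged pairs. A Bayesian treatment
# would replace this point estimate with posterior samples of A under
# a sparsity-inducing prior.
X_past, X_next = x[:-1], x[1:]
A_hat = np.linalg.lstsq(X_past, X_next, rcond=None)[0].T
```

With enough time points, `A_hat` recovers the sparse structure of `A_true`; the Bayesian hierarchical version additionally shares strength across subjects and sessions when estimating it.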
2. Advances in Deep Generative Modeling With Applications to Image Generation and Neuroscience
Loaiza Ganem, Gabriel (January 2019)
Deep generative modeling is an increasingly popular area of machine learning that takes advantage of recent developments in neural networks to estimate the distribution of observed data. In this dissertation we introduce three advances in this area. The first, Maximum Entropy Flow Networks, enables maximum entropy modeling by combining normalizing flows with the augmented Lagrangian optimization method. The second is the continuous Bernoulli, a new [0,1]-supported distribution which we introduce to fix the pervasive error in variational autoencoders of using a Bernoulli likelihood for non-binary data. The last, Deep Random Splines, is a novel distribution over functions, where samples are obtained by sampling Gaussian noise and transforming it through a neural network to obtain the parameters of a spline. We apply these to model texture images, natural images, and neural population data, respectively, and observe significant improvements over current state-of-the-art alternatives.
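The continuous Bernoulli named in the abstract admits a short sketch. It keeps the Bernoulli-shaped kernel lam**x * (1-lam)**(1-x) but treats x as continuous on [0, 1], which requires a normalizing constant C(lam) to make it a proper density; the formula below follows the published definition, and the check at the end is illustrative.

```python
import numpy as np

def cb_norm_const(lam):
    """Normalizing constant C(lam); equals 2 at lam = 0.5 by continuity."""
    if abs(lam - 0.5) < 1e-6:
        return 2.0
    return 2.0 * np.arctanh(1.0 - 2.0 * lam) / (1.0 - 2.0 * lam)

def cb_pdf(x, lam):
    """Continuous Bernoulli density at x in [0, 1]."""
    return cb_norm_const(lam) * lam**x * (1.0 - lam)**(1.0 - x)

# Unlike the discrete Bernoulli pmf often misused as a likelihood for
# non-binary pixel intensities, this is a proper density on [0, 1]:
# it integrates to one (checked here with a midpoint sum).
n = 10_000
mids = (np.arange(n) + 0.5) / n
area = np.sum(cb_pdf(mids, 0.3)) / n
```

In a VAE decoder, maximizing this density (rather than the binary cross-entropy derived from the discrete Bernoulli) accounts for the normalizing constant's dependence on lam, which the abstract identifies as the source of the pervasive error.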