Bayesian Generalized Nonlinear Models (BGNLM) offer a flexible alternative to generalized linear models (GLMs) while still providing better interpretability than machine learning techniques such as neural networks. In BGNLM, Bayesian variable selection and model averaging are applied in an extended GLM setting. Models are fitted to data using MCMC within a genetic framework, in an algorithm called GMJMCMC (genetically modified mode jumping MCMC). In this thesis, we present a new implementation of the algorithm as a package in the programming language R. We also present a novel algorithm called S-IRLS-SGD for estimating the maximum likelihood estimate (MLE) of a GLM from subsamples of the data. Finally, we present theory for combining the novel algorithm with GMJMCMC, MJMCMC, and plain MCMC, together with a number of experiments demonstrating the performance of the contributed algorithm.
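The abstract does not detail S-IRLS-SGD itself; as a rough illustration of the underlying idea of estimating a GLM's MLE from subsampled data, the R sketch below runs stochastic gradient steps on random minibatches of a logistic regression. The data, function name sgd_glm, and tuning parameters are hypothetical and chosen only to demonstrate the subsampling principle, not the thesis's actual S-IRLS-SGD algorithm.

```r
# Minimal sketch (assumed example, not the thesis implementation):
# approximate the MLE of a logistic-regression GLM using stochastic
# gradient steps computed on random subsamples (minibatches).
set.seed(1)

# Simulated data for illustration
n <- 10000; p <- 5
X <- cbind(1, matrix(rnorm(n * (p - 1)), n, p - 1))
beta_true <- c(-1, 0.5, -0.25, 1, 0)
y <- rbinom(n, 1, plogis(X %*% beta_true))

sgd_glm <- function(X, y, batch_size = 100, n_iter = 2000, lr = 0.1) {
  beta <- rep(0, ncol(X))
  for (t in seq_len(n_iter)) {
    idx <- sample(nrow(X), batch_size)          # subsample the data
    Xb <- X[idx, , drop = FALSE]
    mu <- plogis(Xb %*% beta)                   # fitted means on the minibatch
    grad <- crossprod(Xb, y[idx] - mu) / batch_size  # minibatch score
    beta <- beta + (lr / sqrt(t)) * as.vector(grad)  # decaying step size
  }
  beta
}

# Compare the subsampled estimate with the full-data glm() fit
beta_hat <- sgd_glm(X, y)
round(cbind(sgd = beta_hat, glm = coef(glm(y ~ X - 1, family = binomial))), 3)
```

Each iteration touches only `batch_size` rows, which is the appeal of subsampling when the full design matrix is too large for repeated IRLS passes.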
Identifier | oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:su-194715
Date | January 2021 |
Creators | Lachmann, Jon |
Publisher | Stockholms universitet, Statistiska institutionen |
Source Sets | DiVA Archive at Uppsala University
Language | English |
Detected Language | English |
Type | Student thesis, info:eu-repo/semantics/bachelorThesis, text |
Format | application/pdf |
Rights | info:eu-repo/semantics/openAccess |