511 |
Determinants of Economic Growth: A Bayesian Model Averaging. Kudashvili, Nikoloz. January 2013.
MASTER THESIS. Determinants of Economic Growth: A Bayesian Model Averaging. Author: Bc. Nikoloz Kudashvili. Abstract: This paper estimates the determinants of economic growth across 72 countries using Bayesian Model Averaging. Unlike other studies, we include the debt-to-GDP ratio as an explanatory variable among 29 growth determinants. For given values of the other variables, the debt-to-GDP ratio is positively related to the growth rate up to a threshold level. The coefficient on the ratio has a posterior inclusion probability of nearly 0.8, suggesting that the debt-to-GDP ratio is an important long-term growth determinant. We find that the initial level of GDP, life expectancy and equipment investment, together with the debt-to-GDP ratio, have a strong effect on the growth rate of GDP per capita.
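The posterior inclusion probability reported in the abstract can be made concrete with a toy enumeration of candidate regressions. A minimal sketch of BIC-weighted model averaging; the data, variable count and seed are invented for illustration, not taken from the thesis:

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Toy growth data: only x0 truly matters (entirely invented for illustration).
n = 100
X = rng.normal(size=(n, 3))
y = 1.5 * X[:, 0] + rng.normal(scale=0.5, size=n)

def bic(y, X_sub):
    """BIC of an OLS fit with intercept; lower is better."""
    m = len(y)
    A = np.column_stack([np.ones(m), X_sub]) if X_sub.shape[1] else np.ones((m, 1))
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = np.sum((y - A @ beta) ** 2)
    return m * np.log(rss / m) + A.shape[1] * np.log(m)

# Enumerate all 2^3 candidate models; convert BICs to approximate posterior weights.
models = [s for r in range(4) for s in itertools.combinations(range(3), r)]
bics = np.array([bic(y, X[:, list(mdl)]) for mdl in models])
w = np.exp(-0.5 * (bics - bics.min()))
w /= w.sum()

# Posterior inclusion probability of variable j: total weight of models containing j.
pip = [sum(wi for wi, mdl in zip(w, models) if j in mdl) for j in range(3)]
print(pip)
```

Each variable's PIP is the total posterior weight of the models that contain it; BMA studies such as the one above report exactly this quantity, though in practice it is usually computed by MCMC model composition rather than full enumeration.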
|
512 |
Input-output transformations in the awake mouse brain using whole-cell recordings and probabilistic analysis. Puggioni, Paolo. January 2015.
The activity of cortical neurons in awake brains changes dynamically as a function of behavioural and attentional state. The primary motor cortex (M1) plays a central role in regulating complex motor behaviours. Despite growing knowledge of its connectivity and spiking patterns, little is known about the intracellular mechanisms and rhythms underlying motor-command generation. In the last decade, whole-cell recording in awake animals has become a powerful tool for characterising both sub- and supra-threshold activity during behaviour. Seminal in vivo studies have shown that changes in input structure and sub-threshold regime determine spike output during behaviour (input-output transformations). In this thesis I use computational and experimental techniques to better understand (i) how the brain regulates the sub-threshold activity of neurons during movement and (ii) how this is reflected in their input-output transformation properties. In the first part of this work I present a novel probabilistic technique to infer input statistics from in vivo voltage-clamp traces. This approach, based on Bayesian belief networks, outperforms current methods and allows estimation of synaptic input (i) kinetic properties, (ii) frequency, and (iii) weight distribution. I first validate the model on simulated data, then apply it to voltage-clamp recordings of cerebellar interneurons in awake mice. I found that synaptic weight distributions have long tails, which on average do not change during movement. Interestingly, the increase in synaptic current observed during movement is a consequence of an increase in input frequency only. In the second part, I study how the brain regulates the activity of pyramidal neurons in the M1 of awake mice during movement. I performed whole-cell recordings of pyramidal neurons in layer 5B (L5B), which represent one of the main descending output channels from motor cortex.
I found that slow large-amplitude membrane potential fluctuations, typical of quiet periods, were suppressed in all L5B pyramidal neurons during movement, which by itself reduced membrane potential (Vm) variability, input sensitivity and output firing rates. However, a sub-population of L5B neurons (~50%) concurrently experienced an increase in excitatory drive that depolarized the mean Vm, enhanced input sensitivity and elevated firing rates. Thus, movement-related bidirectional modulation of L5B neurons is mediated by two opposing mechanisms: 1) a global reduction in network-driven Vm variability and 2) a coincident, targeted increase in excitatory drive to a subpopulation of L5B neurons.
|
513 |
The effectiveness of hedge fund strategies and managers’ skills during market crises: a fuzzy, non-parametric and Bayesian analysis. 05 November 2012.
Ph.D. / This thesis investigates the persistence of hedge fund managers’ skills, the optimality of the strategies they use to consistently outperform the market during periods of boom and/or recession, and the market risk encountered in doing so. We consider a data set of monthly investment strategy indices published by the Hedge Fund Research group, spanning January 1995 to June 2010. We divide this sample period into four overlapping sub-sample periods that contain different economic market trends. We define a skilled manager as one who can outperform the market consistently during two consecutive sub-sample periods. To investigate the presence of managerial skill among hedge fund managers, we first distinguish between outperformance, selectivity and market-timing skills. We then employ three different econometric models: frequentist, Bayesian and fuzzy regression, to estimate outperformance, selectivity and market-timing skills using both the linear and the quadratic CAPM. Persistence in performance is assessed in three different ways: a contingency table, a chi-square test and a cross-sectional auto-regression technique. The results obtained with the first two probabilistic methods (frequentist and Bayesian) show that fund managers have the skill to outperform the market during periods of positive economic growth (i.e. between sub-sample period 1 and sub-sample period 3). This market outperformance is due to both selectivity skill (during sub-sample periods 2 and 3) and market-timing skill (during sub-sample periods 1 and 2).
These results contradict the EMH and suggest that the market is not always efficient: it is possible to earn abnormal rates of return. However, the results obtained with the uncertainty-based fuzzy credibility method show that, despite the presence of a few fund managers who possess selectivity skill during the bull market period (sub-sample periods 2 and 3) and market-timing skill during the recovery period (sub-sample periods 3 and 4), there is no evidence of overall market outperformance during the entire sample period. The fuzzy credibility results therefore support the EMH, according to which no economic agent can earn a risk-adjusted abnormal rate of return. The difference between the findings obtained with the probabilistic methods (frequentist and Bayesian) and the uncertainty method (fuzzy credibility theory) is primarily due to the way uncertainty is modelled in the hedge fund universe in particular and in financial markets in general. Probability differs fundamentally from uncertainty: probability assumes that the total number of states of the economy is known, whereas uncertainty assumes that it is unknown. Furthermore, the probabilistic methods rely on the assumptions that asset returns are normally distributed and that transaction costs are negligible.
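The contingency-table and chi-square persistence tests mentioned above can be sketched as follows; the winner/loser counts are invented for illustration:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 contingency table classifying fund managers as winners or
# losers in two consecutive sub-sample periods (counts invented).
#                 period-2 winner   period-2 loser
table = np.array([[30, 10],    # period-1 winner
                  [12, 28]])   # period-1 loser

chi2, p, dof, expected = chi2_contingency(table)
# A small p-value rejects independence between the two periods,
# i.e. it suggests that performance persists.
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")

# Cross-product (odds) ratio, another common persistence statistic:
cpr = (table[0, 0] * table[1, 1]) / (table[0, 1] * table[1, 0])
print(f"cross-product ratio = {cpr:.2f}")  # > 1 indicates persistence
```

With these invented counts the test rejects independence, the pattern the thesis reports for the frequentist and Bayesian methods during growth periods.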
|
514 |
Detection of Deviations From Authorized Network Activity Using Dynamic Bayesian Networks. Ewell, Cris Vincent. 01 January 2011.
This research addressed one of the hard problems still plaguing the information security profession: detecting deviations from authorized account activity when those deviations are similar to normal network activity. Specifically, when user and administrator accounts are used for malicious activity, harm can come to the organization. Accurately modeling normal user network activity is hard to accomplish, and detecting misuse is a complex problem.
Much work has been done in the past with intrusion detection systems, but detecting masquerade events with high accuracy and low false alarm rates continues to be an issue. Bayesian networks have been used successfully in the past to reason under uncertainty by combining prior knowledge with observed data. The use of dynamic Bayesian networks, such as the multi-entity Bayesian network, extends this capability and can address complex problems.
The goal of the research was to extend previous work on multi-entity Bayesian networks with discretization methods, improving the detection rate while maintaining an acceptable level of false positives. Preprocessing continuous variables had proven effective in prior research but had not previously been applied to multi-entity Bayesian networks. Five different discretization methods were used in this research. Analysis using receiver operating characteristic curves, confusion matrices, and other comparison methods was completed as part of this research.
The results of the research demonstrated that a multi-entity Bayesian network model based on multiple data sources and the relationships between user attributes could be used to detect unauthorized access to data. The supervised top-down discretization methods performed better in terms of overall classification accuracy. Specifically, the class-attribute interdependence maximization discretization method outperformed the other four discretization methods. When compared to previous masquerade detection methods, it achieved a comparable true positive rate with a lower false positive rate.
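The trade-off reported above, true positive rate against false positive rate, comes directly from a confusion matrix; a minimal sketch with invented session labels and detector outputs:

```python
import numpy as np

# Hypothetical labelled sessions (1 = masquerade) and detector predictions.
y_true = np.array([1, 1, 1, 1, 0, 0, 0, 0, 0, 0])
y_pred = np.array([1, 1, 1, 0, 0, 0, 0, 0, 1, 0])

# Confusion-matrix cells.
tp = np.sum((y_pred == 1) & (y_true == 1))
fp = np.sum((y_pred == 1) & (y_true == 0))
fn = np.sum((y_pred == 0) & (y_true == 1))
tn = np.sum((y_pred == 0) & (y_true == 0))

tpr = tp / (tp + fn)   # detection (true positive) rate
fpr = fp / (fp + tn)   # false alarm (false positive) rate
print(tpr, fpr)
```

Sweeping a detection threshold and recomputing these two rates at each setting traces out the ROC curves used for the comparisons in the study.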
|
515 |
On conjugate families and Jeffreys priors for von Mises-Fisher distributions. Hornik, Kurt; Grün, Bettina. January 2013.
This paper discusses characteristics of standard conjugate priors and their induced posteriors in Bayesian inference for von Mises-Fisher distributions, using either the canonical natural exponential family or the more commonly employed polar coordinate parameterizations. We analyze when standard conjugate priors as well as posteriors are proper, and investigate the Jeffreys prior for the von Mises-Fisher family. Finally, we characterize the proper distributions in the standard conjugate family of the (matrix-valued) von Mises-Fisher distributions on Stiefel manifolds.
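For the vector-valued case on the ordinary sphere (p = 3), the von Mises-Fisher density has a closed-form normalizing constant, which a short Monte Carlo check can verify; a sketch, with an arbitrary concentration value:

```python
import numpy as np

rng = np.random.default_rng(1)

def vmf_pdf(x, mu, kappa):
    """von Mises-Fisher density on the unit sphere S^2 (p = 3), where the
    normalizing constant has the closed form kappa / (4*pi*sinh(kappa))."""
    c = kappa / (4 * np.pi * np.sinh(kappa))
    return c * np.exp(kappa * x @ mu)

# Monte Carlo check that the density integrates to 1 over the sphere:
# sample uniformly on S^2 and average pdf * surface area (4*pi).
z = rng.normal(size=(200_000, 3))
z /= np.linalg.norm(z, axis=1, keepdims=True)
mu = np.array([0.0, 0.0, 1.0])
integral = np.mean(vmf_pdf(z, mu, kappa=2.0)) * 4 * np.pi
print(integral)  # close to 1
```

The exponential form `exp(kappa * x @ mu)` is what makes the family a natural exponential family, the structure the paper's conjugacy analysis builds on.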
|
516 |
Bowhead whale localization and environmental characterization in the Chukchi Sea using nonlinear Bayesian inversion. Warner, Graham Andrew. 09 September 2016.
This thesis develops and applies nonlinear Bayesian inversion methods for the localization of bowhead whales and environmental characterization, with quantitative uncertainty estimation, based on acoustic measurements from a set of asynchronous single-channel recorders in the Chukchi Sea. Warping analysis is applied to estimate modal-dispersion data from airgun sources and whale calls. Whale locations, the water-column sound-speed profile (SSP) and seabed geoacoustic properties are estimated using reversible-jump Markov-chain Monte Carlo sampling in trans-dimensional inversions that account for uncertainty in the number of SSP nodes and subbottom layers. The estimated SSP and seafloor sound speed are in excellent agreement with independent estimates, and whale localization uncertainties decrease substantially when data from multiple whale calls are jointly inverted. Bowhead whales are also localized using a fixed-dimensional inversion of time-difference-of-arrival data derived by cross-correlation for the same recorders. The nonlinear localization uncertainty estimates are found to depend strongly on the source locations and receiver geometry. / Graduate
|
517 |
Sensor Planning for Bayesian Nonparametric Target Modeling. Wei, Hongchuan. January 2016.
Bayesian nonparametric models, such as the Gaussian process and the Dirichlet process, have been extensively applied to target kinematics modeling in applications including environmental monitoring, traffic planning, endangered species tracking, dynamic scene analysis, autonomous robot navigation, and human motion modeling. As these successful applications show, Bayesian nonparametric models are able to adjust their complexity adaptively from data as necessary, and are resistant to overfitting or underfitting. However, most existing work assumes that the sensor measurements used to learn the Bayesian nonparametric target kinematics models are obtained a priori, or that the target kinematics can be measured by the sensor at any given time throughout the task. Little work has been done on controlling a sensor with a bounded field of view to obtain the measurements of mobile targets that are most informative for reducing the uncertainty of the Bayesian nonparametric models. To present a systematic sensor planning approach to learning Bayesian nonparametric models, the Gaussian process target kinematics model is introduced first; it is capable of describing time-invariant spatial phenomena such as ocean currents, temperature distributions and wind velocity fields. The Dirichlet process-Gaussian process target kinematics model is subsequently discussed for modeling mixtures of mobile targets, such as pedestrian motion patterns.

Novel information-theoretic functions are developed for these Bayesian nonparametric target kinematics models to represent the expected utility of measurements as a function of sensor control inputs and random environmental variables. A Gaussian process expected Kullback-Leibler divergence is developed as the expectation of the KL divergence between the current (prior) and posterior Gaussian process target kinematics models with respect to the future measurements.
Then, this approach is extended to develop a new information value function that can be used to estimate target kinematics described by a Dirichlet process-Gaussian process mixture model. A theorem is proposed showing that the novel information-theoretic functions are bounded. Based on this theorem, efficient estimators of the new information-theoretic functions are designed; these are proved to be unbiased, with the variance of the resulting approximation error decreasing linearly as the number of samples increases. The computational complexity of optimizing the novel information-theoretic functions under sensor dynamics constraints is studied and proved to be NP-hard. A cumulative lower bound is then proposed to reduce the computational complexity to polynomial time.

Three sensor planning algorithms are developed according to the assumptions on the target kinematics and the sensor dynamics. For problems where the control space of the sensor is discrete, a greedy algorithm is proposed. The efficiency of the greedy algorithm is demonstrated by a numerical experiment with ocean-current data obtained from moored buoys. A sweep line algorithm is developed for applications where the sensor control space is continuous and unconstrained. Synthetic simulations as well as physical experiments with ground robots and a surveillance camera are conducted to evaluate the performance of the sweep line algorithm. Moreover, a lexicographic algorithm is designed, based on the cumulative lower bound of the novel information-theoretic functions, for the scenario where the sensor dynamics are constrained. Numerical experiments with real data collected from indoor pedestrians by a commercial pan-tilt camera are performed to examine the lexicographic algorithm.
Results from both the numerical simulations and the physical experiments show that the three sensor planning algorithms proposed in this dissertation, based on the novel information-theoretic functions, are superior at learning the target kinematics with little or no prior knowledge. / Dissertation
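When the prior and posterior Gaussian process models are evaluated at a finite set of test points, the KL divergence between them reduces to the KL divergence between two multivariate Gaussians, which has a closed form; a sketch with invented means and covariances:

```python
import numpy as np

def gaussian_kl(mu0, cov0, mu1, cov1):
    """Closed-form KL( N(mu0, cov0) || N(mu1, cov1) )."""
    k = len(mu0)
    inv1 = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(inv1 @ cov0) + diff @ inv1 @ diff - k
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

# Prior GP marginal at two test points, and a posterior after a measurement
# has shifted the mean and shrunk the variance (numbers invented).
mu_prior, cov_prior = np.zeros(2), np.array([[1.0, 0.5], [0.5, 1.0]])
mu_post, cov_post = np.array([0.3, 0.1]), np.array([[0.4, 0.1], [0.1, 0.6]])

print(gaussian_kl(mu_post, cov_post, mu_prior, cov_prior))  # > 0
print(gaussian_kl(mu_prior, cov_prior, mu_prior, cov_prior))  # effectively zero
```

The expected-KL utility described in the abstract is the expectation of this quantity over the distribution of future measurements, which the dissertation approximates with sampling-based estimators.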
|
518 |
Bayesian Confirmation Theory and All of the Sciences: A Unified Approach. Stålenheim, Julia. January 2019.
The aim of this thesis is to show that Bayesian confirmation theory can be used in place of the hypothetico-deductive method for all sciences. Føllesdal argues in his paper "Hermeneutics and the hypothetico-deductive method" that the hypothetico-deductive method is used in all sciences, only more or less visibly; this essay takes his arguments and applies the same reasoning to Bayesian confirmation theory. To do this, a literature study is worked through as an example to which Bayesian confirmation theory is applied. The cases of confirmation and disconfirmation from the hypothetico-deductive method are worked through in terms of Bayes' theorem, and the conclusions are that Bayesian confirmation theory can be used with respect to all sciences, and that it may prove even better, given its higher adaptability.
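The confirmation and disconfirmation cases can be made concrete with Bayes' theorem, P(H|E) = P(E|H)P(H)/P(E); a toy sketch with invented probabilities:

```python
# Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E), with the evidence term
# P(E) = P(E|H)P(H) + P(E|~H)P(~H). All numbers are invented for illustration.

def posterior(prior, likelihood, likelihood_given_not_h):
    evidence = likelihood * prior + likelihood_given_not_h * (1 - prior)
    return likelihood * prior / evidence

prior = 0.3

# Confirmation: E is more likely if H is true, so the posterior rises.
confirmed = posterior(prior, likelihood=0.9, likelihood_given_not_h=0.2)
print(confirmed)  # > 0.3

# Disconfirmation: E is less likely if H is true, so the posterior falls.
disconfirmed = posterior(prior, likelihood=0.1, likelihood_given_not_h=0.8)
print(disconfirmed)  # < 0.3
```

Unlike the all-or-nothing verdicts of the hypothetico-deductive method, the posterior moves by degrees, which is the adaptability the thesis argues for.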
|
519 |
A study of the prediction performance and multivariate extensions of the horseshoe estimator. Yunfan Li. 14 May 2019.
The horseshoe prior has been shown to handle high-dimensional sparse estimation problems successfully. It both adapts to sparsity efficiently and provides nearly unbiased estimates for large signals. In addition, efficient sampling algorithms have been developed and successfully applied to a vast array of high-dimensional sparse estimation problems. In this dissertation, we investigate the prediction performance of the horseshoe prior in sparse regression, and extend the horseshoe prior to two multivariate settings.

We begin with a study of the finite-sample prediction performance of shrinkage regression methods, where the risk can be unbiasedly estimated using Stein's approach. We show that the horseshoe prior achieves an improved prediction risk over global shrinkage rules by using a component-specific local shrinkage term, learned from the data under a heavy-tailed prior, in combination with a global term providing shrinkage towards zero. We demonstrate improved prediction performance in a simulation study and in a pharmacogenomics data set, confirming our theoretical findings.

We then extend the horseshoe prior to handle two high-dimensional multivariate problems. First, we develop a new estimator of the inverse covariance matrix for high-dimensional multivariate normal data. The proposed graphical horseshoe estimator has attractive properties compared to other popular estimators. The most prominent benefit is that when the true inverse covariance matrix is sparse, the graphical horseshoe estimator provides estimates with small information divergence from the sampling model. The posterior mean under the graphical horseshoe prior can also be almost unbiased under certain conditions. In addition to these theoretical results, we provide a full Gibbs sampler for implementation.
The graphical horseshoe estimator compares favorably to existing techniques in simulations and in a human gene network data analysis.

In our second setting, we apply the horseshoe prior to the joint estimation of regression coefficients and the inverse covariance matrix in normal models. The computational challenge in this problem is due to the dimensionality of the parameter space, which routinely exceeds the sample size. We show that the advantages of the horseshoe prior in estimating a mean vector or an inverse covariance matrix separately are also present when addressing both simultaneously. We propose a full Bayesian treatment, with a sampling algorithm that is linear in the number of predictors. Extensive performance comparisons are provided with both frequentist and Bayesian alternatives, and both estimation and prediction performance are verified on a genomic data set.
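The horseshoe prior discussed above places a half-Cauchy prior on each coefficient's local scale; simply drawing from the prior already shows the spike-and-heavy-tail behaviour behind its sparsity adaptation. A sketch, with arbitrary sample size and global scale:

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw from the horseshoe prior: beta_j | lambda_j, tau ~ N(0, (lambda_j * tau)^2),
# with local scales lambda_j ~ half-Cauchy(0, 1) and a fixed global scale tau.
n, tau = 100_000, 0.1
lam = np.abs(rng.standard_cauchy(n))
beta = rng.normal(scale=lam * tau)

# The half-Cauchy local scale yields both a spike at zero (most draws tiny,
# mimicking sparsity) and heavy tails (occasional very large draws, which is
# why large signals escape shrinkage almost unbiased).
frac_near_zero = np.mean(np.abs(beta) < 0.01)
frac_large = np.mean(np.abs(beta) > 10)
print(frac_near_zero, frac_large)
```

A fixed-scale Gaussian prior can produce only one of these two behaviours at a time; the horseshoe's component-specific scales give both at once, which is the mechanism behind the prediction-risk results summarized above.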
|
520 |
Generalizing the number of states in Bayesian belief propagation, as applied to portfolio management. Kruger, Jan Walters. 1996.
A research report submitted to the Faculty of Science, University of the Witwatersrand, Johannesburg, in partial fulfillment of the requirements for the degree of Master of Science. / This research report describes the use of Pearl's algorithm in Bayesian belief networks to induce a belief network from a database. With a solid grounding in probability theory, the Pearl algorithm allows belief updating by propagating the likelihoods of leaf nodes (variables) and the prior probabilities. The Pearl algorithm was originally developed for binary variables, and a generalization to more states is investigated. The data used to test this new method, in a portfolio management context, are the return and various attributes of companies listed on the Johannesburg Stock Exchange (JSE). The results of this model are then compared to those of a linear regression model. The Bayesian method is found to perform better than the linear regression approach. / Andrew Chakane 2018
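The belief update at the heart of Pearl's algorithm, generalized beyond binary variables, is elementwise fusion of a prior (pi) message with a likelihood (lambda) message followed by normalization; a single-link sketch with invented state names and numbers:

```python
import numpy as np

# Single-link belief update in the style of Pearl's algorithm, for a parent
# variable with k = 3 states rather than 2 (all numbers invented).

prior = np.array([0.5, 0.3, 0.2])      # pi message: prior over 3 return classes

# P(leaf-state | parent-state) for an observed leaf node with 2 states.
likelihood = np.array([[0.7, 0.3],     # rows: parent states
                       [0.4, 0.6],
                       [0.1, 0.9]])

observed = 1                           # the leaf is observed in its 2nd state
lam = likelihood[:, observed]          # lambda message propagated to the parent

belief = prior * lam                   # fuse pi and lambda messages
belief /= belief.sum()                 # normalize to a proper distribution
print(belief)
```

Nothing in the update depends on k = 2, which is why the same propagation rule extends to the multi-state networks the report investigates.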
|