  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

On Independent Reference Priors

Lee, Mi Hyun 09 January 2008 (has links)
In Bayesian inference, the choice of prior has long been of great interest. Subjective priors are ideal when sufficient prior information is available; in practice, however, such information often cannot be collected, and objective priors are then a good substitute. In this dissertation, an independent reference prior, a member of the class of objective priors, is examined. It is a reference prior derived by assuming that the parameters are independent. The independent reference prior introduced by Sun and Berger (1998) is extended and generalized. We provide an iterative algorithm to derive the general independent reference prior, and we propose a sufficient condition under which a closed form of the independent reference prior can be derived without going through the iterations of the algorithm. The independent reference prior is then shown to be useful with respect to invariance and the first-order matching property. It is proven that the independent reference prior is invariant under a type of one-to-one transformation of the parameters, and it is shown to be a first-order probability matching prior under a sufficient condition. We derive independent reference priors for various examples and observe that, in most of them, they are both first-order matching priors and reference priors. We also study the independent reference prior in some types of non-regular cases considered by Ghosal (1997). / Ph. D.
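The abstract does not reproduce the dissertation's algorithm, but the basic recipe behind objective priors of this family starts from the Fisher information. As a minimal, hypothetical illustration (a one-parameter case, not the multi-parameter independent reference prior itself), here is the Jeffreys prior for a Bernoulli success probability, obtained as the square root of the Fisher information:

```python
import numpy as np

def fisher_info_bernoulli(p):
    # Fisher information of a single Bernoulli trial: I(p) = 1 / (p * (1 - p))
    return 1.0 / (p * (1.0 - p))

def jeffreys_prior(p):
    # Jeffreys prior is proportional to sqrt(I(p)); returned unnormalized
    return np.sqrt(fisher_info_bernoulli(p))

# Numerically normalize over (0, 1) with a midpoint rule (midpoints avoid the
# integrable singularities at 0 and 1). The exact constant is pi, since this
# prior is the Beta(1/2, 1/2) density.
grid = np.linspace(0.0, 1.0, 200001)
mid = (grid[:-1] + grid[1:]) / 2
Z = np.sum(jeffreys_prior(mid)) * (grid[1] - grid[0])
```

The same square-root-of-information construction is what makes such priors invariant under smooth one-to-one reparameterizations, the property the abstract highlights.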
2

Acquisition and influence of expectations about visual speed

Sotiropoulos, Grigorios January 2016 (has links)
It has long been hypothesized that, due to the inherent ambiguities of visual input and the limitations of the visual system, vision is a form of “unconscious inference” whereby the brain relies on assumptions (also known as expectations) to interpret the external world. This hypothesis has recently been formalized into Bayesian models of perception (the “Bayesian brain”) that represent these expectations as prior probabilities. In this thesis, I focus on a particular kind of expectation that humans are thought to possess – that objects in the world tend to be still or move slowly – known as the “slow speed prior”. Through a combination of experimental and theoretical work, I investigate how the speed prior is acquired and how it impacts motion perception. The first part of my work consists of an experiment in which subjects are exposed to simple "training" stimuli that move more often at high speeds than at low speeds. By subsequently testing the subjects with slow-moving stimuli of high uncertainty (low contrast), I find that their perception gradually changes in a manner consistent with the progressive acquisition of an expectation favouring progressively higher speeds. Subjects thus appear to gradually internalize the speed statistics of the stimulus ensemble over the duration of the experiment. I model these results using an existing Bayesian model of motion perception that incorporates a speed prior peaked at zero, extending the model so that the prior's mean gradually shifts away from zero. Although the first experiment presents evidence for the plasticity of the speed prior, the experimental paradigm and the constraints of the model limit the accuracy and precision with which observers' priors can be reconstructed. To address these limitations, I perform a different experiment in which subjects compare the speeds of moving gratings of different contrasts. The new paradigm allows more precise measurement of the contrast-dependent biases in perceived speed.
Using a less constrained Bayesian model, I extract the priors of subjects and find considerable interindividual variability. Furthermore, noting that the Bayesian model cannot account for certain subtleties in the data, I combine the model with a non-Bayesian, physiologically motivated model of speed tuning of cortical neurons and show that the combination offers an improved description of the data. Using the paradigm of the second experiment, I then explore the role of visual experience on the form of the speed prior. By recruiting avid video gamers (who are routinely exposed to high speeds) and nongamers of both sexes, I study the differences in the prior among groups and find, surprisingly, that subjects’ speed priors depend more on gender than on gaming experience. In a final series of experiments similar to the first, I also test subjects on variations of the trained stimulus configuration – namely different orientations and motion directions. Subjects’ responses suggest that they are able to apply the changed prior to different orientations and, furthermore, that the changed prior persists for at least a week after the end of the experiment. These results provide further support for the plasticity of the speed prior but also suggest that the learned prior may be used only across similar stimulus configurations, whereas in sufficiently different configurations or contexts a “default” prior may be used instead.
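The contrast-dependent bias that both experiments exploit falls out of a simple Gaussian prior-times-likelihood calculation: lower contrast means a noisier measurement, so the posterior is pulled more strongly toward the slow-speed prior. The sketch below is a generic illustration of that mechanism, not the thesis's fitted model; the contrast-to-noise mapping and the parameter values are assumptions chosen for clarity:

```python
def perceived_speed(true_speed, contrast, prior_mean=0.0, prior_var=4.0):
    """Posterior-mean speed estimate under a Gaussian slow-speed prior.

    Assumed (illustrative) relationship: measurement noise variance grows
    as contrast drops. With Gaussian prior and likelihood, the posterior
    mean is a precision-weighted average of measurement and prior mean.
    """
    likelihood_var = 1.0 / contrast
    w = prior_var / (prior_var + likelihood_var)  # weight on the measurement
    return w * true_speed + (1 - w) * prior_mean

# A low-contrast grating is perceived as slower than a high-contrast one,
# even though both move at the same physical speed.
low = perceived_speed(5.0, contrast=0.1)   # noisy -> pulled toward 0
high = perceived_speed(5.0, contrast=0.9)  # reliable -> close to 5
```

Shifting `prior_mean` away from zero, as in the first experiment's extended model, reduces the slow bias — which is exactly the signature of prior adaptation the thesis reports.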
3

Development of Informative Priors in Microarray Studies

Fronczyk, Kassandra M. 19 July 2007 (has links) (PDF)
Microarrays measure the abundance of DNA transcripts for thousands of gene sequences simultaneously, facilitating genomic comparisons across tissue types or disease status. These experiments are used to understand fundamental aspects of growth and development and to explore the underlying genetic causes of many diseases. The data from most microarray studies are found in open-access online databases. Bayesian models are ideal for the analysis of microarray data because of their ability to integrate prior information; however, most current Bayesian analyses use empirical or flat priors. We present a Perl script to build an informative prior by mining online databases for similar microarray experiments. Four prior distributions are investigated: a power prior including information from multiple previous experiments, an informative prior using information from one previous experiment, an empirically estimated prior, and a flat prior. The method is illustrated with a two-sample experiment to determine the preferential regulation of genes by tamoxifen in breast cancer cells.
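The power prior mentioned above raises the historical-data likelihood to a discounting exponent a0 in [0, 1], so a0 = 0 ignores the earlier experiment and a0 = 1 pools it fully. A minimal conjugate Beta-Bernoulli sketch of that idea (illustrative numbers; this is not the Perl script or the gene-expression model from the abstract):

```python
def power_prior_beta(alpha0, beta0, y0, n0, a0):
    # Power prior: initial Beta(alpha0, beta0) prior times the historical
    # Bernoulli likelihood raised to a0. The family stays conjugate, so the
    # historical counts simply enter discounted by a0.
    return alpha0 + a0 * y0, beta0 + a0 * (n0 - y0)

def posterior(alpha, beta, y, n):
    # Standard conjugate update with the current experiment's data
    return alpha + y, beta + n - y

# Hypothetical historical study: 30/100 "successes", trusted at half weight
a, b = power_prior_beta(1.0, 1.0, y0=30, n0=100, a0=0.5)
# Current experiment: 12/40 successes
a_post, b_post = posterior(a, b, y=12, n=40)
```

With a0 = 0.5 the historical study contributes the equivalent of 50 observations rather than 100, which is the sense in which the power prior borrows strength from, without being dominated by, previous experiments.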
4

Objective Bayesian Analysis of Kullback-Leibler Divergence of two Multivariate Normal Distributions with Common Covariance Matrix and Star-shaped Gaussian Graphical Model

Li, Zhonggai 22 July 2008 (has links)
This dissertation consists of four independent but related parts, each in a chapter. The first part is introductory; it provides background and preparation for the later parts. The second part discusses two multivariate normal populations with a common covariance matrix. The goal of this part is to derive objective/non-informative priors for the parameterizations and to use these priors to build constructive random posteriors of the Kullback-Leibler (KL) divergence of the two multivariate normal populations, which is proportional to the distance between the two means, weighted by the common precision matrix. We use the Cholesky decomposition to re-parameterize the precision matrix. The KL divergence is a true distance measure between two multivariate normal populations with a common covariance matrix. Frequentist properties of the Bayesian procedure using these objective priors are studied through analytical and numerical tools. The third part considers the star-shaped Gaussian graphical model, a special case of undirected Gaussian graphical models. It is a multivariate normal distribution in which the variables are grouped into one "global" set and several "local" sets; conditioned on the global set, the local sets are independent of each other. We adopt the Cholesky decomposition to re-parameterize the precision matrix and derive Jeffreys' prior, the reference prior, and invariant priors for the new parameterizations. The frequentist properties of the Bayesian procedure using these objective priors are also studied. The last part discusses objective Bayesian analysis of the partial correlation coefficient and its application to multivariate Gaussian models. / Ph. D.
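For two multivariate normals sharing a covariance matrix, the KL divergence reduces to half the squared Mahalanobis distance between the means, KL = (1/2)(mu1 - mu2)' Sigma^{-1} (mu1 - mu2), which is also why it is symmetric and behaves as a true distance in this special case. A small numerical check of that formula (assuming nothing beyond the formula itself):

```python
import numpy as np

def kl_common_cov(mu1, mu2, sigma):
    # KL(N(mu1, Sigma) || N(mu2, Sigma)) = 0.5 * (mu1-mu2)' Sigma^{-1} (mu1-mu2)
    # (the log-determinant and trace terms of the general formula cancel
    # because both distributions share the same covariance matrix)
    d = np.asarray(mu1) - np.asarray(mu2)
    return 0.5 * d @ np.linalg.solve(sigma, d)

mu1, mu2 = np.array([1.0, 0.0]), np.array([0.0, 0.0])
sigma = np.eye(2)
kl = kl_common_cov(mu1, mu2, sigma)
```

With distinct covariances the divergence is asymmetric; the common-covariance assumption is what buys the distance interpretation used in the dissertation.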
5

Service-Based Approach for Intelligent Agent Frameworks

Mora, Randall P., Hill, Jerry L. 10 1900 (has links)
ITC/USA 2011 Conference Proceedings / The Forty-Seventh Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2011 / Bally's Las Vegas, Las Vegas, Nevada / This paper describes a service-based Intelligent Agent (IA) approach for machine learning and data mining of distributed heterogeneous data streams. We focus on an open architecture framework that enables the programmer/analyst to build an IA suite for mining, examining and evaluating heterogeneous data for semantic representations, while iteratively building the probabilistic model in real-time to improve predictability. The Framework facilitates model development and evaluation while delivering the capability to tune machine learning algorithms and models to deliver increasingly favorable scores prior to production deployment. The IA Framework focuses on open standard interoperability, simplifying integration into existing environments.
6

Information structures and their effects on consumption decisions and prices

Moreno González, Othón M. 06 November 2013 (has links)
This work analyzes the effects that different information structures on the demand side of the market have on consumption decisions and the way prices are determined. We develop three theoretical models to address this issue in a systematic way. First, we focus our attention on the consumers' awareness, or lack thereof, of substitute products in the market and the strategic interaction between firms competing in prices and costly advertising in such an environment. We find that prior information held by consumers can drastically change the advertising equilibrium predictions. In particular, we provide sufficient conditions for the existence of three types of equilibria, in addition to one previously found in the literature, and provide a necessary condition for a fourth type of equilibrium. Additionally, we show that the effect of the resulting advertising strategies on the expected transaction price is qualitatively significant, although ambiguous when compared to the case of a newly formed market. We can establish, however, that the transaction price is increasing in the size of the smaller firm's captive market. In the second chapter, we study the optimal timing to buy a durable good with an embedded option to resell it at some point in the future, as well as its reservation price, where the agent faces Knightian uncertainty about the process generating the market prices. The problem is modeled as a stopping problem with multiple priors in continuous time with infinite horizon. We find that the direction of the change in the buyer's reservation price depends on the particular parametrization of the model. Furthermore, the change in the buying threshold due to an increase in ambiguity is greater as the fraction of the market at which the agent can resell the good decreases, and the value of the embedded option is decreasing in the perceived level of ambiguity. 
Finally, we introduce Knightian uncertainty to a model of price search by letting the consumers be ambiguous regarding the industry's cost of production. We characterize the equilibria of this game for high and low levels of the search cost and show that firms extract abnormal profits for low realizations of the marginal cost. Furthermore, we show that, as the search cost goes to zero, the equilibrium of the game under the low-cost regime does not converge to Bertrand marginal-cost pricing. Instead, firms follow a mixed strategy that includes all prices between the high and low production costs.
7

Threshold cointegration and adaptive shrinkage

Huber, Florian, Zörner, Thomas 06 1900 (has links) (PDF)
This paper considers Bayesian estimation of the threshold vector error correction (TVECM) model in moderate to large dimensions. Using the lagged cointegrating error as a threshold variable gives rise to additional difficulties that are typically solved by relying on large-sample approximations. Relying on Markov chain Monte Carlo methods, we circumvent these issues by avoiding computationally prohibitive estimation strategies like the grid search. Due to the proliferation of parameters, we use novel global-local shrinkage priors in the spirit of Griffin and Brown (2010). We illustrate the merits of our approach in an application to five exchange rates vis-à-vis the US dollar and assess whether a given currency is over- or undervalued. Moreover, we perform a forecasting comparison to investigate whether it pays off to adopt a non-linear modeling approach relative to a set of simpler benchmark models. / Series: Department of Economics Working Paper Series
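Global-local shrinkage priors of the kind invoked here give each coefficient its own local scale on top of a shared global scale, so most coefficients are pulled hard toward zero while a few can escape. The prior-predictive sketch below illustrates the normal-gamma construction of Griffin and Brown (2010) with illustrative hyperparameters; it is not the paper's TVECM implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def normal_gamma_draws(n_coef, n_draws, shape=0.5, tau2=1.0):
    # Normal-gamma prior: local scales psi_j ~ Gamma(shape, 1) and
    # beta_j | psi_j ~ N(0, tau2 * psi_j). A small shape parameter places
    # most local scales near zero, shrinking most coefficients aggressively
    # while the gamma tail lets a few coefficients remain large.
    psi = rng.gamma(shape, 1.0, size=(n_draws, n_coef))
    return rng.normal(0.0, np.sqrt(tau2 * psi))

heavy = normal_gamma_draws(10, 5000, shape=0.1)  # aggressive shrinkage
mild = normal_gamma_draws(10, 5000, shape=2.0)   # closer to a plain normal
```

Comparing the two draws shows the mechanism: the small-shape prior concentrates coefficient mass near zero, which is what keeps a heavily parameterized TVECM estimable without a grid search over sparsity patterns.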
8

Investigação da reconstrução tomográfica utilizando transdutores distribuídos de ultrassom. / Investigation of tomographic reconstruction using distributed ultrasound transducers.

Cardona Cardenas, Diego Armando 17 January 2018 (has links)
Ultrasonography is a tool widely used by medical professionals to diagnose and to monitor different kinds of diseases. This can be explained by its characteristics, such as being non-invasive and free of ionizing radiation. Ultrasound tomography (USCT), one of the classes of ultrasonography, is presented as a promising low-cost alternative in the evaluation of pathologies and tumors in the breast. However, the efficiency of USCT algorithms depends both on their initial parameters and on the characteristics of the objects within the propagation medium (reflectivity, size, contrast). To improve the results of USCT algorithms, it is common to initialize them with a priori anatomical information of the region to be reconstructed (Priors).
Despite improving the results of USCT algorithms for low contrasts, the effects of the Priors on these algorithms are not clear, and in addition, there are no studies on the generation and use of Priors for high contrasts. In this work, quantitative tomographic reconstruction for ultrasound was investigated based on information from the reflection, transmission and scattering of ultrasound waves, in order to reduce the error in USCT algorithms and to generate better Priors for multiple contrasts. For this purpose, simulations were used to study techniques that use reflection to differentiate regions (reflection mask) or to infer object borders within the propagation medium (synthetic transmission aperture (STA)), techniques that assume linear sound transmission to estimate the velocity inside the propagation medium (algebraic reconstruction technique (ART)), and algorithms that use sound diffraction (Distorted Born Iterative Method (DBIM)) to better infer both the edges and the velocity of objects within the medium. It was also analyzed how the DBIM behaves under multiple initializations (Priors). 
As results and conclusions, it was shown that: increasing the contrast in the propagation medium produces the worst DBIM results; given a good initialization of the propagation medium, the DBIM tends to generate good reconstructions regardless of contrast; using strategies that delimit or reduce the number of unknown variables (reflection mask) along with the DBIM enables faster convergence and improves the DBIM's performance; initializing the objects within the propagation medium with areas larger than expected provides better DBIM results than working with smaller areas; qualitative information derived from reflection (STA) is relevant and grows in importance as the contrast increases; an initial delimitation of objects within the medium is possible for certain contrasts via transmission reconstruction (ART). This quantitative information can be improved by running ART together with a variation of the Modified Median Filter proposed here.
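Of the techniques compared, ART is the simplest to sketch: it cycles through the ray equations, projecting the current image estimate onto each one in turn (the Kaczmarz iteration). The toy version below works on a two-pixel "image" with hypothetical ray geometry; it illustrates the method, not the dissertation's implementation:

```python
import numpy as np

def art(A, b, n_sweeps=50, relax=1.0):
    # Kaczmarz-style ART: repeatedly project the estimate onto each ray
    # equation. A[i] holds the path length of ray i through each pixel and
    # b[i] the measured time-of-flight, so x recovers per-pixel slowness
    # (the reciprocal of sound velocity) under straight-ray transmission.
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        for i in range(A.shape[0]):
            ai = A[i]
            denom = ai @ ai
            if denom > 0:
                x += relax * (b[i] - ai @ x) / denom * ai
    return x

# Tiny consistent system: two pixels crossed by three rays
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
x_true = np.array([2.0, 3.0])
x_hat = art(A, A @ x_true)
```

The straight-ray assumption built into ART is precisely what breaks down at high contrasts, which is why the work pairs it with diffraction-aware reconstruction (DBIM) and uses ART mainly to delimit objects and seed the Priors.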
9

Flexible shrinkage in high-dimensional Bayesian spatial autoregressive models

Pfarrhofer, Michael, Piribauer, Philipp January 2019 (has links) (PDF)
Several recent empirical studies, particularly in the regional economic growth literature, emphasize the importance of explicitly accounting for uncertainty surrounding model specification. Standard approaches to deal with the problem of model uncertainty involve the use of Bayesian model-averaging techniques. However, Bayesian model-averaging for spatial autoregressive models suffers from severe drawbacks both in terms of computational time and possible extensions to more flexible econometric frameworks. To alleviate these problems, this paper presents two global-local shrinkage priors in the context of high-dimensional matrix exponential spatial specifications. A simulation study is conducted to evaluate the performance of the shrinkage priors. Results suggest that they perform particularly well in high-dimensional environments, especially when the number of parameters to estimate exceeds the number of observations. Moreover, we use pan-European regional economic growth data to illustrate the performance of the proposed shrinkage priors.
10

Exchange rate dynamics and monetary policy - Evidence from a non-linear DSGE-VAR approach

Huber, Florian, Rabitsch, Katrin 10 1900 (has links) (PDF)
In this paper, we reconsider the question of how monetary policy influences exchange rate dynamics. To this end, a vector autoregressive (VAR) model is combined with a two-country dynamic stochastic general equilibrium (DSGE) model. Instead of focusing exclusively on how monetary policy shocks affect the level of exchange rates, we also analyze how they impact exchange rate volatility. Since exchange rate volatility is not observed, we estimate it alongside the remaining quantities in the model. Our findings can be summarized as follows. Contractionary monetary policy shocks lead to an appreciation of the home currency, with exchange rate responses in the short run typically undershooting their long-run level of appreciation. They also lead to an increase in exchange rate volatility. Historical and forecast error variance decompositions indicate that monetary policy shocks explain an appreciable amount of exchange rate movements and the corresponding volatility. / Series: Department of Economics Working Paper Series
