21

Some Bayesian Methods in the Estimation of Parameters in the Measurement Error Models and Crossover Trial

Wang, Guojun 31 March 2004 (has links)
No description available.
22

Semiparametric Bayesian Approach using Weighted Dirichlet Process Mixture For Finance Statistical Models

Sun, Peng 07 March 2016 (has links)
The Dirichlet process mixture (DPM) has been widely used as a flexible prior in the nonparametric Bayesian literature, and the weighted Dirichlet process mixture (WDPM) can be viewed as an extension of DPM that relaxes model distribution assumptions. However, WDPM requires weight functions to be specified and can incur extra computational burden. In this dissertation, we develop more efficient and flexible WDPM approaches under three research topics. The first is semiparametric cubic spline regression, where we adopt a nonparametric prior for the error terms in order to automatically handle heterogeneity of measurement errors or an unknown mixture distribution; the second provides an innovative way to construct the weight function and illustrates some desirable properties and the computational efficiency of this weight under the semiparametric stochastic volatility (SV) model; and the last develops a WDPM approach for the Generalized AutoRegressive Conditional Heteroskedasticity (GARCH) model (as an alternative to the SV model) and proposes a new model evaluation approach for GARCH that produces easier-to-interpret results than the canonical marginal likelihood approach.

In the first topic, the response variable is modeled as the sum of three parts. One part is a linear function of covariates that enter the model parametrically. The second part is an additive nonparametric model: covariates whose relationships to the response variable are unclear are included nonparametrically using Lancaster and Šalkauskas bases. The third part consists of error terms whose means and variances are assumed to follow nonparametric priors. We therefore call our model dual-semiparametric regression, because nonparametric components appear both in the mean and in the error terms. Instead of assuming that all error terms follow the same prior as in DPM, our WDPM provides multiple candidate priors for each observation to select with certain probabilities. Such a probability (or weight) is modeled through relevant predictive covariates using a Gaussian kernel. We propose several different WDPMs using different weights that depend on distances in the covariates. We provide efficient Markov chain Monte Carlo (MCMC) algorithms and compare our WDPMs to the parametric model and the DPM model in terms of Bayes factors in simulation and empirical studies.

In the second topic, we propose an innovative way to construct the weight function for WDPM and apply it to the SV model. The SV model is adopted for time series data where the constant-variance assumption is violated. One essential issue is specifying the distribution of the conditional return. We assume a WDPM prior for the conditional return and propose a new way to model the weights. Our approach has several advantages, including computational efficiency compared to weights constructed with a Gaussian kernel. We list six properties of the proposed weight function and provide proofs of them. Because of the additional Metropolis-Hastings steps introduced by the WDPM prior, we establish conditions that ensure the uniform geometric ergodicity of the transition kernel in our MCMC. Due to the existence of zero values in asset price data, our SV model is semiparametric: we employ the WDPM prior for non-zero values and a parametric prior for zero values.

In the third topic, we develop the WDPM approach for GARCH-type models and compare different types of weight functions, including the innovative method proposed in the second topic.
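To make the kernel-based weighting concrete, here is a minimal sketch of how selection probabilities over candidate priors could be formed from covariate distances with a Gaussian kernel, as described above; the function name, the `centers` standing in for candidate-prior locations, and the single `bandwidth` parameter are illustrative choices, not the dissertation's notation or algorithm.

```python
import numpy as np

def gaussian_kernel_weights(x, centers, bandwidth=1.0):
    """Selection weights over candidate priors for one observation with covariate x.

    w_j(x) is proportional to exp(-||x - c_j||^2 / (2 h^2)), normalized over the
    J candidate priors; names and defaults here are illustrative only.
    """
    x = np.atleast_1d(x)
    d2 = np.array([np.sum((x - c) ** 2) for c in centers])  # squared covariate distances
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    return w / w.sum()  # probabilities of selecting each candidate prior

# Example: three hypothetical candidate priors located at covariate values 0, 1, 2
print(gaussian_kernel_weights(x=0.8, centers=[np.array([0.0]),
                                              np.array([1.0]),
                                              np.array([2.0])]))
```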
A GARCH model can be viewed as an alternative to SV for analyzing daily stock price data where the constant-variance assumption does not hold. While the response variable of our SV models is the transformed log return (based on a log-square transformation), GARCH models the log return itself. This means that, in principle, we can predict stock returns with GARCH models, whereas this is not feasible with the SV model, because the SV model ignores the sign of the log return and provides predictive densities for the squared log return only. Motivated by this property, we propose a new model evaluation approach called back-testing return (BTR) specifically for GARCH. The BTR approach produces model evaluation results that are easier to interpret than the marginal likelihood, and it is straightforward to draw conclusions about model profitability from it. Since the BTR approach is applicable only to GARCH, we also illustrate how to properly calculate the marginal likelihood to compare GARCH and SV. Based on our MCMC algorithms and model evaluation approaches, we conduct a large number of model fits to compare models in both simulation and empirical studies. / Ph.D.
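For context, the sketch below implements the standard GARCH(1,1) conditional-variance recursion that GARCH-type models build on; it is not the dissertation's Bayesian WDPM-GARCH, and the parameter values and initialization are placeholders.

```python
import numpy as np

def garch11_volatility(returns, omega=1e-6, alpha=0.05, beta=0.9, mu=0.0):
    """Plain GARCH(1,1) recursion: sigma2_t = omega + alpha*eps_{t-1}^2 + beta*sigma2_{t-1}.

    Parameter values are placeholders, not estimates from the dissertation.
    """
    eps = returns - mu                      # demeaned log returns
    sigma2 = np.empty_like(returns)
    sigma2[0] = np.var(returns)             # initialize at the sample variance
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

# One-step-ahead conditional volatility from simulated log returns
r = np.random.normal(scale=0.01, size=500)
print(np.sqrt(garch11_volatility(r)[-1]))
```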
23

Modelos de resposta ao item com função de ligação t - assimétrica.

Pinheiro, Alessandra Noeli Craveiro 20 April 2007 (has links)
Item Response Theory (IRT) is a set of mathematical models representing the probability that an individual gives a correct response to an item (question) as a function of the item parameters and the individual's ability. The purpose of our research is to present the models formulated in IRT under skew-normal distributions and to develop more flexible alternative models. With this goal in mind, we introduce the skew-t distribution (Azzalini and Capitanio, 1999) and obtain results similar to those of Bazán (2005). Some applications using Bayesian methods are also considered.
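A minimal sketch of the kind of item response model discussed above, written in a common two-parameter form with a skew-t link; the notation (item discrimination a_j, difficulty b_j, ability θ_i) is chosen here for illustration and may differ from the dissertation's.

```latex
\[
P(Y_{ij}=1 \mid \theta_i, a_j, b_j)
  \;=\; F_{\mathrm{st}}\!\bigl(a_j\theta_i - b_j;\ \nu, \lambda\bigr),
\]
where $F_{\mathrm{st}}(\cdot;\nu,\lambda)$ is the cumulative distribution function of a
skew-$t$ distribution with degrees of freedom $\nu$ and skewness parameter $\lambda$;
$\lambda = 0$ recovers a symmetric $t$ link, and $\nu \to \infty$ gives a skew-normal
link as in Bazán (2005).
```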
24

Essays in empirical finance

Farouh, Magnim 08 1900 (has links)
This thesis has three chapters in which I study transaction costs, anomalies, and shadow banking activities. In the first chapter (co-authored with René Garcia), a novel way of estimating transaction costs is proposed. Transaction costs have declined over time, but they can increase considerably when funding liquidity becomes scarce, investors' fears spike, or other frictions limit arbitrage. We estimate bid-ask spreads of thousands of firms at a daily frequency and document these large movements for several such episodes over the last 30 years. The transaction cost of three-quarters of the firms is significantly impacted by funding liquidity and increases on average by 24%. While small firms and high-volatility firms have larger transaction costs, the relative increase in transaction costs in crisis times is more pronounced for large firms and low-volatility firms. The gap between the respective transaction costs of these high- and low-quality groups also widens when financial conditions deteriorate, which provides evidence of flight to quality. We build anomaly-based long-short portfolios and estimate their alphas adjusted for rebalancing costs, based on our security-level transaction cost estimates, to show that all strategies are either unprofitable or lose money, except for price per share and industry momentum.

In the second chapter, I study how the popularity of anomalies in peer-reviewed finance journals can influence the returns on these anomalies. I use the tone of the abstract of the publication in which an anomaly is discussed and the impact factor of the journal in which this publication appears to forecast the post-publication return of strategies based on the anomaly. The main finding is the following: when an anomaly is discussed in a positive-tone publication that appears in a journal with an impact factor higher than 3 (Journal of Finance, Journal of Financial Economics, Review of Financial Studies), this anomaly is more likely to attract investors who will arbitrage away the mispricing.

The third chapter (co-authored with Vasia Panousi) proposes a measure of the shadow banking activity of firms operating in the financial industry in the United States. For this purpose we use textual data analysis, extracting information from firms' annual and quarterly reports. We find that shadow banking activity was higher for "Depository Institutions", "Non-depository Institutions" and "Real Estate" before 2008. After 2008, shadow banking activity dropped considerably for all financial companies except "Non-depository Institutions". Our shadow banking index satisfies some economic facts about shadow banking, in particular that contractionary monetary policies contribute to its expansion. We also show with our index that, when the shadow banking activity of the 100 biggest banks increases, the delinquency rates on the loans that these banks extend also increase. The opposite is observed with the traditional banking index: an increase in the traditional banking activity of the 100 biggest banks lowers the delinquency rate.
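As a rough illustration of building an index from filing text, the sketch below counts occurrences of a few shadow-banking-related terms in a report and normalizes by document length; the term list, the normalization, and the crude substring matching are assumptions made here for illustration, not the authors' dictionary or method.

```python
import re

# Illustrative term list; the dissertation's actual dictionary is not specified here.
SHADOW_TERMS = [
    "securitization", "asset-backed commercial paper", "repo",
    "money market fund", "special purpose vehicle", "off-balance-sheet",
]

def shadow_banking_score(filing_text: str) -> float:
    """Crude frequency (per 10,000 words) of shadow-banking terms in one 10-K/10-Q."""
    text = filing_text.lower()
    n_words = max(len(re.findall(r"\w+", text)), 1)
    hits = sum(len(re.findall(re.escape(term), text)) for term in SHADOW_TERMS)
    return 1e4 * hits / n_words

print(shadow_banking_score("The bank expanded its repo and securitization programs."))
```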
25

Addressing Challenges in Graphical Models: MAP estimation, Evidence, Non-Normality, and Subject-Specific Inference

Sagar K N Ksheera (15295831) 17 April 2023 (has links)
Graphs are a natural choice for understanding the associations between variables, and assuming a probabilistic embedding for the graph structure leads to a variety of graphical models that enable us to understand these associations even further. In the realm of high-dimensional data, where the number of associations between interacting variables is far greater than the available number of data points, the goal is to infer a sparse graph. In this thesis, we make contributions in the domain of Bayesian graphical models, where our prior belief on the graph structure, encoded via uncertainty on the model parameters, enables the estimation of sparse graphs.

We begin with the Gaussian Graphical Model (GGM) in Chapter 2, one of the simplest and most famous graphical models, where the joint distribution of the interacting variables is assumed to be Gaussian. In GGMs, the conditional independence among variables is encoded in the inverse of the covariance matrix, also known as the precision matrix. Under a Bayesian framework, we propose a novel prior-penalty dual, the 'graphical horseshoe-like' prior and penalty, to estimate the precision matrix. We also establish the posterior convergence of the precision matrix estimate and the frequentist consistency of the maximum a posteriori (MAP) estimator.

In Chapter 3, we develop a general framework based on local linear approximation for MAP estimation of the precision matrix in GGMs. This framework holds for any graphical prior whose element-wise priors can be written as a Laplace scale mixture. As an application of the framework, we perform MAP estimation of the precision matrix under the graphical horseshoe penalty.

In Chapter 4, we focus on graphical models where the joint distribution of the interacting variables cannot be assumed Gaussian. Motivated by quantile graphical models, which relax the Gaussian likelihood assumption, we draw inspiration from the domain of precision medicine, where personalized inference is crucial to tailor individual-specific treatment plans. With the aim of inferring Directed Acyclic Graphs (DAGs), we propose a novel quantile DAG learning framework in which the DAGs depend on individual-specific covariates, making personalized inference possible. We demonstrate the potential of this framework for precision medicine by applying it to infer protein-protein interaction networks in lung adenocarcinoma and lung squamous cell carcinoma.

Finally, we conclude this thesis in Chapter 5 by developing a novel framework to compute the marginal likelihood in a GGM, addressing a longstanding open problem. Under this framework, we can compute the marginal likelihood for a broad class of priors on the precision matrix, where the element-wise priors on the diagonal entries can be written as gamma or scale mixtures of gamma random variables and those on the off-diagonal terms can be represented as normal or scale mixtures of normal. This result paves new roads for model selection using Bayes factors and tuning of prior hyper-parameters.
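For reference, MAP estimation of a GGM precision matrix as discussed above can be written as a penalized Gaussian log-likelihood problem; the generic form below (with diagonal terms left unpenalized for simplicity) is a standard formulation, not the thesis's exact objective or notation.

```latex
\[
\hat{\Omega}_{\mathrm{MAP}}
  \;=\; \arg\max_{\Omega \succ 0}\;
  \frac{n}{2}\Bigl(\log\det\Omega \;-\; \operatorname{tr}(S\,\Omega)\Bigr)
  \;-\; \sum_{i<j} \mathrm{pen}\!\bigl(\omega_{ij}\bigr),
\]
where $S$ is the sample covariance matrix of $n$ observations, $\Omega=(\omega_{ij})$ is the
precision matrix, and $\mathrm{pen}(\cdot)$ is the negative log-prior on each off-diagonal
entry (e.g.\ a graphical-lasso $\ell_1$ penalty, or a horseshoe-type penalty as in the thesis).
```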
