71

Three essays in asset pricing

Karoui, Mehdi January 2013 (has links)
This thesis consists of three essays that explore alternative approaches to extracting information from option data, and, along somewhat different lines, examine the channels through which liquidity is priced in equity options. The first essay proposes a novel approach to extracting option-implied equity premia, and empirically examines the information content of these risk premia for forecasting the stock market return. Our approach does not require specifying the functional form of the pricing kernel, and does not impose any restrictions on investors' preferences. We only assume the existence of put and call options which complete the market, and show that the implied equity premium can be inferred from expected excess returns on a portfolio of options. An empirical investigation of S&P 500 index options yields the following conclusions: (i) the implied equity premium predicts stock market returns; (ii) the implied equity premium consistently outperforms variables commonly used in the forecasting literature both in- and out-of-sample; (iii) the implied equity premium is positively related to future returns and negatively related to current returns, as theoretically expected. The second essay studies the effect of illiquidity on equity option returns. Illiquidity is well known to be a significant determinant of stock and bond returns. We are the first to report on illiquidity premia in equity option markets using a large cross-section of firms. An increase in option illiquidity decreases the current option price and predicts higher expected delta-hedged option returns. This effect is statistically and economically significant, and it is consistent with existing evidence that market makers in the equity options market hold net long positions. The illiquidity premium is robust across puts and calls, across maturities and moneyness, as well as across different empirical approaches. It is also robust when controlling for various firm-specific variables, including a standard measure of illiquidity of the underlying stock. For long-term options, we find evidence of a liquidity risk factor. In the third essay, we demonstrate that in multifactor asset pricing models, prices of risk for factors that are nonlinear functions of the market return can be readily obtained using data on index returns and index options. We apply this general result to the measurement of the conditional price of coskewness and cokurtosis risk. The price of coskewness risk corresponds to the spread between the physical and the risk-neutral second moments, and the price of cokurtosis risk corresponds to the spread between the physical and the risk-neutral third moments. Estimates of these prices of risk have the expected sign, and they lead to reasonable risk premia. An out-of-sample analysis of factor models with coskewness and cokurtosis risk indicates that the new estimates of the price of risk improve the models' performance. The models also robustly outperform competitors such as the CAPM and the Fama-French model.
/ This thesis consists of three essays that explore new methods for extracting information from option data. We also examine the effects of liquidity in the options market. In the first essay, we propose a new approach for extracting the equity risk premium implied by options. We then examine the ability of this implied risk premium to predict market returns. Our approach does not require knowledge of the stochastic discount factor (pricing kernel) and imposes no restrictions on investors' preferences. We assume the existence of a continuum of call and put options and show that the implied risk premium can be inferred from the expected returns on a portfolio of options. An empirical study of S&P 500 index options yields the following conclusions: (i) the implied risk premium predicts market returns; (ii) the implied risk premium outperforms variables commonly used in the literature; (iii) the implied risk premium is positively related to future returns and negatively related to current returns, as theory predicts. In the second essay, we examine the effects of illiquidity on option returns. It is well known that illiquidity is an important determinant of stock and bond returns. We show that illiquidity premia exist in the market for equity options. An increase in an option's illiquidity lowers its current price and raises the expected return of the delta-hedged strategy. This result is statistically and economically significant and is consistent with the fact that liquidity providers hold a net long position in the options market. The result is robust for both call and put options and across different empirical approaches. It is also robust to controlling for various firm-specific variables and for the illiquidity of the underlying asset. For long-term options, we show the existence of a risk premium tied to a liquidity factor. In the third essay, we show that in multifactor asset pricing models, the prices of risk for factors that are nonlinear functions of the market return can be obtained directly from index options. Applying this general result to coskewness risk shows that the price of this risk equals the difference between the second moment under the physical measure and the second moment under the risk-neutral measure. Similarly, we show that the price of cokurtosis risk equals the difference between the third moment under the physical measure and the third moment under the risk-neutral measure. The signs of the estimated prices of risk are consistent with theory, and the risk premia obtained from them are reasonable. Moreover, using these prices of risk improves the performance of models that incorporate coskewness and cokurtosis risk. These models also outperform the CAPM and the Fama-French model.
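The third essay's mapping from moment spreads to prices of risk can be illustrated with a small numerical sketch. The snippet below is only an illustration under stated assumptions: it builds a risk-neutral density from synthetic Black-Scholes call prices via the Breeden-Litzenberger relation, computes risk-neutral second and third central moments, and compares them with physical moments from a simulated return sample. The strike grid, parameter values, and return sample are placeholders, and the thesis's own estimation of physical and risk-neutral moments may well differ.

```python
import numpy as np
from scipy.stats import norm

# Synthetic ingredients (illustrative assumptions, not data from the thesis).
S0, r, T, sigma_rn = 100.0, 0.02, 1.0 / 12, 0.25   # spot, rate, maturity, risk-neutral vol

def bs_call(K):
    """Black-Scholes call prices, used only to generate a smooth option-price curve."""
    d1 = (np.log(S0 / K) + (r + 0.5 * sigma_rn ** 2) * T) / (sigma_rn * np.sqrt(T))
    d2 = d1 - sigma_rn * np.sqrt(T)
    return S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

# Breeden-Litzenberger: the risk-neutral density is exp(rT) * d^2 C / dK^2.
K = np.linspace(50.0, 160.0, 2001)
dK = K[1] - K[0]
q = np.exp(r * T) * np.gradient(np.gradient(bs_call(K), dK), dK)
q = np.clip(q, 0.0, None)
q /= (q * dK).sum()                                 # renormalise after numerical noise

# Risk-neutral central moments of the simple return R = K / S0 - 1.
R = K / S0 - 1.0
mean_rn = (R * q * dK).sum()
var_rn = ((R - mean_rn) ** 2 * q * dK).sum()
third_rn = ((R - mean_rn) ** 3 * q * dK).sum()

# Physical moments from a (here simulated) sample of index returns.
rng = np.random.default_rng(0)
ret_p = rng.normal(0.005, 0.04, size=5000)          # placeholder for observed monthly returns
var_p = ret_p.var()
third_p = ((ret_p - ret_p.mean()) ** 3).mean()

# Spreads between physical and risk-neutral moments, in the spirit of the third
# essay: second-moment spread <-> coskewness price, third-moment spread <-> cokurtosis price.
print("second-moment spread (proxy for the price of coskewness risk):", var_p - var_rn)
print("third-moment spread  (proxy for the price of cokurtosis risk):", third_p - third_rn)
```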
72

Essays on causality and volatility in econometrics with financial applications

Zhang, Hui Jun January 2013 (has links)
This thesis makes contributions to the statistical analysis of causality and volatility in econometrics. It consists of five essays, theoretical and empirical. In the first one, we study how to characterize and measure multi-horizon second-order causality. The second and third essays propose linear estimation methods for univariate and multivariate weak GARCH models. In the fourth essay, we use multi-horizon causality measures to study the causal relationships between commodity prices and exchange rates with high-frequency data. In the fifth essay, we evaluate the historical evolution of volatility forecast skill. Given the increasingly important role of volatility forecasting in financial studies, a number of authors have proposed to extend the notion of Granger causality to study the dynamic co-behavior of volatilities. In the first essay, we propose a general theory of second-order causality between random vectors at different horizons, allowing for the presence of auxiliary variables, in terms of the predictability of the conditional variance. We establish various properties of the causality structures so defined. Furthermore, we propose nonparametric and parametric measures of second-order causality at a given horizon. We suggest a simulation-based method to evaluate the measures in the context of stationary VAR-MGARCH models. The asymptotic validity of bootstrap confidence intervals is demonstrated. Finally, we apply the proposed measures of second-order causality to study volatility spillover and contagion across financial markets in the U.S., the U.K. and Japan over the period 2000-2010. It is well known that the quasi-maximum likelihood (QML) estimator is consistent and asymptotically normal for (semi-)strong GARCH models. However, when estimating a weak GARCH model, the QML estimator can be inconsistent owing to the misspecification of the conditional variance. Nonlinear least squares (NLS) estimation is consistent and asymptotically normal for weak GARCH models, but requires a complicated nonlinear optimization. In the second essay, we suggest a linear estimation method, which is shown to be consistent and asymptotically normal for weak GARCH models. Simulation results for weak GARCH models indicate that the linear estimation method outperforms both QML and NLS for parameter estimation, and is comparable to NLS and better than QML for out-of-sample forecasts. Similar issues arise when QML and NLS are used for weak multivariate GARCH (MGARCH) models. In the third essay, we propose a linear estimation method for weak MGARCH models. The asymptotic properties of this linear estimator are established. Simulations for weak MGARCH models show that our linear estimation method outperforms both QML and NLS for parameter estimation, and the three methods perform similarly in out-of-sample forecasting experiments. Most importantly, the proposed linear estimation is much less computationally complex than QML and NLS. In the fourth essay, we study the causal relationship between commodity prices and exchange rates. Existing studies using quarterly data and noncausality tests only at horizon 1 do not indicate a clear direction of causality from commodity prices to exchange rates. In contrast, by considering multi-horizon causality measures using high-frequency data (daily and 5-minute) from three typical commodity economies, we find that causality running from commodity prices to exchange rates is stronger than that in the opposite direction up to multiple horizons, after accounting for 'dollar effects'. In the fifth essay, we apply the concept of forecast skill to evaluate the historical evolution of volatility forecasting, using data on the S&P 500 composite index over the period 1983-2009. We find that models of conditional volatility do yield improvements in forecasting, but the historical evolution of volatility forecast skill does not exhibit a clear upward trend.
/ This thesis deals with the statistical analysis of causality and volatility in econometrics. It consists of five essays, both theoretical and empirical. In the first, we study how to characterize and measure second-order causality over multiple horizons. The second and third essays propose linear estimation methods for univariate and multivariate weak GARCH models. In the fourth essay, we use multi-horizon causality measures to study causality between commodity prices and exchange rates in high-frequency data. In the fifth essay, we evaluate the historical evolution of volatility forecast skill. In the first essay, we propose a more general theory of second-order causality between random vectors at different horizons, allowing for the presence of auxiliary variables, in terms of the predictability of the conditional variance. We establish various properties of the causality structures so defined. We further propose nonparametric and parametric measures of second-order causality. We use simulation-based methods to evaluate the measures in the context of VAR-MGARCH models. The asymptotic validity of bootstrap confidence intervals is demonstrated. Finally, we apply the second-order causality measures to study volatility spillovers and contagion across financial markets in the United States, the United Kingdom and Japan over the period 2000-2010. It is well known that the quasi-maximum likelihood estimator (QMLE) is consistent and asymptotically normal for strong or semi-strong GARCH models. However, when estimating a weak GARCH model, the QMLE may fail to be consistent because of misspecification of the first two moments. Nonlinear least squares (NLS) estimation is consistent for weak GARCH models but requires a complicated nonlinear optimization. We propose a linear estimation method that is consistent and asymptotically normal for weak GARCH models. Simulation results show that the linear method outperforms both QMLE and NLS for estimation, and is comparable to NLS and better than QMLE for out-of-sample forecasting. Similar problems arise when QMLE and NLS are used to estimate weak multivariate GARCH (MGARCH) models. In the third essay, we propose a linear estimation method for weak MGARCH models. The asymptotic properties of this linear estimator are established. Simulations show that the three methods perform similarly in out-of-sample forecasting. In the fourth essay, we study the causal relationship between commodity prices and exchange rates. Existing studies based on quarterly data and noncausality tests at horizon 1 have not confirmed the intuitive expectation of a clear direction of causality running from commodity prices to exchange rates. In contrast, by considering multi-horizon causality measures and using high-frequency data from three typical commodity economies, we find that causality running from commodity prices to exchange rates is stronger than that in the opposite direction up to multiple horizons, after accounting for 'dollar effects'. In the fifth essay, we apply the concept of forecast skill to evaluate the historical evolution of volatility forecasts for the S&P 500 index over the period 1983-2009. We find that conditional volatility models do improve volatility forecasts, but there is no clear upward trend in forecast quality.
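To make the second essay's idea of linear estimation more concrete, the sketch below exploits the fact that squared returns from a GARCH(1,1) process admit an ARMA(1,1) representation, so the persistence parameter can be estimated by instrumental-variable least squares using longer lags of the squared series as instruments. The abstract does not spell out the thesis's estimator, so this is only a hedged illustration of one linear route; the simulated parameters and the two-stage least squares implementation are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a GARCH(1,1) series as a stand-in data-generating process
# (parameter values are illustrative, not those used in the thesis).
omega, alpha, beta, n = 0.1, 0.10, 0.80, 20000
z = rng.standard_normal(n)
eps, sig2 = np.empty(n), np.empty(n)
sig2[0] = omega / (1.0 - alpha - beta)
eps[0] = np.sqrt(sig2[0]) * z[0]
for t in range(1, n):
    sig2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sig2[t - 1]
    eps[t] = np.sqrt(sig2[t]) * z[t]

# Squared returns follow an ARMA(1,1):
#   eps_t^2 = omega + (alpha + beta) eps_{t-1}^2 + u_t - beta u_{t-1},
# so lags dated t-2 and earlier are valid instruments for eps_{t-1}^2.
y = eps[3:] ** 2
x = eps[2:-1] ** 2                      # regressor: first lag of squared returns
X = np.column_stack([np.ones_like(x), x])
Z = np.column_stack([np.ones_like(x), eps[1:-2] ** 2, eps[:-3] ** 2])

# Two-stage least squares computed through the instrument cross-products.
ZtX, Zty = Z.T @ X, Z.T @ y
ZtZinv_ZtX = np.linalg.solve(Z.T @ Z, ZtX)
coef = np.linalg.solve(ZtX.T @ ZtZinv_ZtX, ZtZinv_ZtX.T @ Zty)

print("implied omega        :", coef[0])    # intercept of the ARMA(1,1) representation
print("implied alpha + beta :", coef[1])    # persistence; the true value here is 0.90
```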
73

Essays on financial analysis: Capital structure, dynamic dependence and extreme loss modeling

Wang, Xin January 2008 (has links)
This dissertation contains three essays concerning two broad areas, namely optimal capital structure and risky-asset modeling. In the first paper, we study corporate debt values, capital structure, and the term structure of interest rates in a unified framework. We employ numerical techniques to compute the firm's optimal capital structure and the value of its long-term risky debt with an embedded call option, together with yield spreads, when the value of the firm's unleveraged assets and the instantaneous default-free interest rate are risk factors. Debt and leveraged firm value are thus explicitly linked to properties of the firm's unleveraged assets, the term structure of default-free interest rates, taxes, bankruptcy costs, payout rates, and bond covenants. The results clarify the relationship between a firm's capital structure and movements in the term structure, as well as other important aspects of the capital structure decision. In the second chapter, we propose a dynamic copula modeling framework that allows copula association parameters to change with time and macroeconomic variables. We find empirical evidence that nominal interest rate and traded-goods price index differentials between two countries have a significant impact on the co-movement of foreign exchange rates. Our Pearson-type goodness-of-fit test has the power to reject constant and time-varying copula modeling approaches at the 95% confidence level. In the third chapter, a new method is developed for addressing the sample size problem in probabilistic risk assessment. We propose the use of Bayesian power prior distributions to improve extreme value estimation and provide reliable estimates of Value-at-Risk (VaR) and expected shortfall. The Bayesian Markov chain Monte Carlo computational scheme with power prior distributions allows us to properly incorporate historical data and borrow strength and information from related sources for the current study.
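The power-prior idea in the third chapter amounts to down-weighting the likelihood of historical or related data before combining it with the current sample. The snippet below is a minimal sketch of that idea in an extreme-value setting: it fits a generalized Pareto tail to threshold exceedances by maximizing a likelihood in which the historical exceedances are discounted by a weight a0, then reports VaR and expected shortfall from the standard peaks-over-threshold formulas. The data, the threshold choice, the weight a0, and the MAP-style optimization (rather than the full MCMC scheme used in the dissertation) are all illustrative assumptions.

```python
import numpy as np
from scipy.stats import genpareto
from scipy.optimize import minimize

rng = np.random.default_rng(7)

# Illustrative loss samples (placeholders, not the dissertation's data):
# a short "current" sample and a longer "historical" one from related sources.
current = rng.standard_t(df=4, size=500) * 0.01
historical = rng.standard_t(df=4, size=3000) * 0.012

u = np.quantile(current, 0.95)                   # POT threshold on losses
exc_cur = current[current > u] - u
exc_his = historical[historical > u] - u

def neg_weighted_loglik(theta, a0):
    """Negative GPD log-likelihood of current exceedances plus the historical
    exceedances' log-likelihood discounted by the power-prior weight a0."""
    xi, sigma = theta
    if sigma <= 0:
        return np.inf
    ll_cur = genpareto.logpdf(exc_cur, c=xi, scale=sigma).sum()
    ll_his = genpareto.logpdf(exc_his, c=xi, scale=sigma).sum()
    return -(ll_cur + a0 * ll_his)

a0 = 0.3                                          # how much the historical data counts
res = minimize(neg_weighted_loglik, x0=np.array([0.1, exc_cur.std()]),
               args=(a0,), method="Nelder-Mead")
xi, sigma = res.x

# Tail quantities at level p (standard peaks-over-threshold formulas, valid for xi < 1).
p = 0.99
n_obs, n_u = len(current), len(exc_cur)
var_p = u + (sigma / xi) * (((1 - p) / (n_u / n_obs)) ** (-xi) - 1)
es_p = var_p / (1 - xi) + (sigma - xi * u) / (1 - xi)
print(f"xi={xi:.3f} sigma={sigma:.4f}  VaR_{p}={var_p:.4f}  ES_{p}={es_p:.4f}")
```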
74

Strategic behaviors in financial markets and applications of the market discipline mechanism

Nal, Osman January 2008 (has links)
This dissertation provides theoretical and empirical support for the mechanism of market discipline as an alternative channel complementing the supervisory efforts of prudential regulation. The model introduced in the first part is based on a novel risk-return technology that summarizes the lending opportunities of financial intermediaries. The geometric properties and assumptions underlying the risk-return function are also studied. The main result of the model asserts that a banking institution is rewarded for revealing more information about its portfolio structure and penalized when withholding information about its asset portfolio generates ambiguity and uncertainty about its condition. These results therefore encourage bank managers to disclose meaningful bank data in a timely fashion; doing so is, in fact, in the bank's own best interest. The model also asserts that relying solely on capital requirements might be insufficient for establishing the safety and soundness of the banking system. Moreover, policies such as deposit safety nets and "too big to fail" protection severely undermine the effective functioning of market discipline. The dissertation gathers summary statistics on subordinated debt issued by the 100 largest U.S. banks from 1984 until 2007. U.S. banks issued larger amounts of these securities in recent years, in line with greater efforts to enhance market discipline over banks in the U.S. The second part of the dissertation compares the significance of market discipline in Turkey before and after the 2001 financial crisis. Unlike past empirical studies of Turkish banking, estimation results from a 3SLS instrumental-variable regression are reported. The results support the view that market discipline is stronger following the crisis. The interaction between deposit insurance and market discipline is also analyzed for the period from the third quarter of 1997 to the first quarter of 2007. Applications of "too-big-to-fail" protection during and following financial crises are evident in the dataset and regression estimation results. This type of coverage for large banks certainly diminishes the efficiency of the market discipline mechanism in Turkey in the post-crisis period.
75

Essays on time series: Time change and applications to testing, estimation and inference in continuous time models

Vasudev, Rahul January 2007 (has links)
In Chapter 1, I develop a test for the martingale hypothesis using the fact that a continuous martingale is time-deformed Brownian motion, where the deforming process is the quadratic variation of the martingale. Sampling a martingale at equal increases in quadratic variation and taking first differences, we may obtain variables that are independently and identically distributed as normal. I propose tests that involve transforming a sample in this way and testing the martingale hypothesis by measuring the distance between the empirical distribution of the transformed variables and the standard normal distribution with Kolmogorov-Smirnov and Cramer-von Mises statistics. Asymptotics in this setting involve sampling the process more frequently over a given sampling horizon in addition to sampling over longer horizons. Simulations show that the size and power performance of the tests is rather satisfactory. The test is employed on a variety of financial futures to test for a risk premium. In Chapter 2, I develop a method of estimating drift terms in diffusion models that involves transforming the discretely observed sample and performing a least squares regression. This method allows for the estimation of a parametrically specified drift term when the diffusion term is unspecified. The procedure involves sampling data at equal increases in estimated quadratic variation and taking first differences. The transformed variables can be written as the sum of two quantities: a function of the drift parameters and a regression error that forms a martingale difference sequence and is homoscedastic in the limit when the sampling interval is small. Thus the procedure relies on asymptotics of a small sampling interval as well as a large sample. Simulations show that the procedure outperforms ordinary least squares for nonstationary processes. In Chapter 3, I revisit the testing problem studied in Chapter 1 and consider a test based on second moments instead of distributional distance. Simulations show that for mean-reverting processes, the test is more powerful than the tests of Chapter 1. Further, I present a time-change result for discrete martingales, and show that discrete martingales sampled at equal increases in quadratic variation and differenced are homoscedastic up to an error term.
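The Chapter 1 construction (sample at equal increments of quadratic variation, difference, then compare against the standard normal) can be sketched in a few lines. The simulation below is only an illustration on a toy stochastic-volatility martingale; the deforming clock is the realized quadratic variation, the number of business-time sampling points is arbitrary, and only the Kolmogorov-Smirnov variant of the test is shown.

```python
import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(42)

# Simulate a driftless price process with slowly varying volatility
# (a toy continuous-time martingale observed on a fine grid, not the thesis data).
n, dt = 200_000, 1.0 / (252 * 390)
vol = 0.2 * np.exp(0.3 * np.sin(np.linspace(0, 20, n)))
dX = vol * np.sqrt(dt) * rng.standard_normal(n)
X = np.concatenate([[0.0], np.cumsum(dX)])

# Estimated quadratic variation accumulates as the sum of squared increments.
qv = np.concatenate([[0.0], np.cumsum(dX ** 2)])

# Sample the path at equal increases in quadratic variation ...
step = qv[-1] / 500                       # 500 business-time sampling points
targets = np.arange(1, 501) * step
idx = np.minimum(np.searchsorted(qv, targets), n)
sampled = X[idx]

# ... and difference; under the martingale hypothesis the scaled differences
# should look i.i.d. standard normal (time-changed Brownian motion).
z = np.diff(np.concatenate([[0.0], sampled])) / np.sqrt(step)
stat, pval = kstest(z, "norm")
print(f"KS statistic = {stat:.3f}, p-value = {pval:.3f}")
```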
76

The experience of a dual exchange market with a simultaneous unofficial market in El Salvador

Alas Rodriguez, Nelly Carolina January 1987 (has links)
Because of economic and non-economic factors, the situation of the external sector in El Salvador has reached critical levels. Under persistent trade deficits, capital flight, and a continuous reduction in international reserves, the authorities instituted in August 1982, for the first time, a dual exchange rate system: an official market where the value of the dollar remained the same, and a parallel market where the exchange rate was set by the commercial banks. Among the goals of this innovation were a reduction in imports, an increase in foreign exchange receipts, a restoration of balance of payments equilibrium, and the avoidance of a sudden rise in prices. We analyze the overvaluation of the exchange rate, evolution of exports and imports, and the fiscal revenues derived from transactions in the parallel system. An econometric model measures the impact of overvalued exchange rates on imports and exports, and the effects of the parallel market on the price level. (Abstract shortened with permission of author.)
77

A time series analysis of the Japanese yen

Kwon, Jae-Jung January 1988 (has links)
This paper addresses whether the Japan-U.S. exchange rate can be forecast more accurately by a monetary model of exchange rate determination or by a random walk. The evidence of Meese and Rogoff (1983) on the out-of-sample forecasting performance of structural exchange rate models in comparison to the random walk model portrays a disappointing picture of structural models. I re-consider the issue for the Japanese yen over a more recent period. Besides out-of-sample evidence, within-sample evidence is also examined. The recent work of Phillips and Perron is employed to verify that the exchange rate series is well approximated by a random walk model without drift but with time-dependent heteroscedasticity. Having established this benchmark, I construct structural monetary models to see whether better within-sample and/or out-of-sample results can be obtained. It appears that the random walk can be beaten.
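As a hedged illustration of the kind of horse race the paper runs, the sketch below compares rolling out-of-sample forecasts from a driftless random walk with those from a simple regression of exchange rate changes on a lagged deviation from monetary fundamentals. The series are simulated placeholders, and the single "fundamental" is a stand-in for the money, income, and interest rate differentials a full monetary model would use.

```python
import numpy as np

rng = np.random.default_rng(3)

# Placeholder monthly data: log exchange rate s and a monetary-fundamentals term f
# (illustrative simulated series, not the yen data used in the paper).
T = 240
f = np.cumsum(rng.normal(0, 0.02, T))
s = f + np.cumsum(rng.normal(0, 0.03, T))        # rate loosely tied to fundamentals

window = 120
errs_rw, errs_mon = [], []
for t in range(window, T - 1):
    # Random walk without drift: the forecast of s_{t+1} is simply s_t.
    errs_rw.append(s[t + 1] - s[t])
    # Monetary-model regression of the change in s on the lagged deviation from
    # fundamentals, re-estimated on a rolling window.
    ds = np.diff(s[t - window:t + 1])
    gap = (s - f)[t - window:t]
    X = np.column_stack([np.ones_like(gap), gap])
    beta = np.linalg.lstsq(X, ds, rcond=None)[0]
    forecast = s[t] + beta[0] + beta[1] * (s[t] - f[t])
    errs_mon.append(s[t + 1] - forecast)

rmse = lambda e: np.sqrt(np.mean(np.square(e)))
print("RMSE random walk   :", rmse(errs_rw))
print("RMSE monetary model:", rmse(errs_mon))
```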
78

Higher-order spectral based tests of Gaussianity, linearity, and stationarity in stock returns

Chantachaimongkol, Sairung January 1992 (has links)
This paper presents empirical examinations of three important aspects of stock returns, namely Gaussianity, linearity, and stationarity, by applying time series tests based upon higher-order spectra. If a stationary time series is Gaussian, the second-order spectrum contains all the useful information present in the series. If the series is non-Gaussian, the second-order spectrum will not adequately characterize the series, and it becomes necessary to consider higher-order spectral analysis. The tests are applied to daily stock market returns from Taiwan and Korea, weekly stock returns of five Thai companies, weekly stock market returns from Thailand, and, lastly, trade-by-trade stock returns of fifteen U.S. companies. We find that stationarity is rejected for trade-by-trade unaliased returns, and that the aliasing problem should be taken more seriously when daily or weekly data are used in time series applications.
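For readers unfamiliar with higher-order spectra, the following sketch shows the object these tests are built on: a segment-averaged bispectrum estimate, which is flat at zero for a Gaussian series and departs from zero for skewed, non-Gaussian dynamics. It uses simulated Gaussian and exponential innovations rather than the stock return series studied in the paper, and it stops short of the formal Gaussianity and linearity test statistics.

```python
import numpy as np

rng = np.random.default_rng(5)

def avg_bispectrum(x, seg_len=256):
    """Segment-averaged bispectrum estimate B(j, k) ~ E[X_j X_k conj(X_{j+k})].

    For a Gaussian series the bispectrum is zero at all frequency pairs; sizable
    departures from zero signal non-Gaussian (skewed) dynamics."""
    n_seg = len(x) // seg_len
    m = seg_len // 2
    B = np.zeros((m // 2, m // 2), dtype=complex)
    for i in range(n_seg):
        seg = x[i * seg_len:(i + 1) * seg_len]
        seg = seg - seg.mean()
        X = np.fft.fft(seg)
        for j in range(1, m // 2):
            for k in range(1, m // 2):
                B[j, k] += X[j] * X[k] * np.conj(X[j + k])
    return B / (n_seg * seg_len)

gaussian = rng.standard_normal(2048 * 8)
skewed = rng.exponential(1.0, 2048 * 8) - 1.0      # non-Gaussian, zero-mean

for name, series in [("gaussian", gaussian), ("skewed", skewed)]:
    B = avg_bispectrum(series)
    print(f"{name:8s} mean |bispectrum| = {np.abs(B[1:, 1:]).mean():.4f}")
```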
79

INTERINDUSTRY PRICE EFFECTS OF THE WELLHEAD TAX ON CRUDE OIL

GOMEZ-RIVAS, ALBERTO January 1980 (has links)
The work deals with the problem of changes in the relative prices of industrial outputs due to the imposition of a tax. Furthermore, the imposition of a tax induces changes in all of the components of the input-output table. The magnitude of the changes is a function of the degree of forward shifting of the tax under consideration. Although the analysis can be applied to any tax with due consideration to its shifting characteristics, the wellhead tax was selected as an illustrative example for several reasons. The tax has been proposed as an alternative to the existing system of controls and entitlements imposed on the production of crude oil in the United States after the oil embargo of 1974. The wellhead tax is equivalent to the tax on profits when the elasticity of supply of crude oil is zero. The tax on profits is another alternative that has been proposed to replace the system of controls and entitlements. The input-output tables of the U.S. Department of Commerce were analyzed with regard to their sources of data and methods of construction. From this analysis the conclusion was reached that the only possible interpretation of the tables is in "value" terms, that is, dollars for the transactions table and dollars per dollar for the direct requirements table. Three philosophically different methods were developed to analyze the price effects and to recompute the direct requirements table. The three methods yield identical results. The treatment of direct and indirect taxes in the American tables does not agree with accepted concepts of shifting in public finance: in the tables, indirect taxes are fully shifted forward while direct taxes are not shifted. In this work, simulations were run assuming different percentages of forward shifting for the wellhead tax. Comparison of the results with the analysis by Robert E. Hall indicates that his study is equivalent to sixty percent forward shifting of the tax. The analysis of the American input-output tables indicates that studies of tax incidence assuming constant input-output coefficients, such as the study by Henry Aaron, are logically inconsistent for any degree of forward shifting other than zero.
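The price-effect mechanism described here can be illustrated with a small worked example of the Leontief price model, in which the share of the tax that is shifted forward enters the taxed sector's unit cost and propagates through the inverse of (I - A'). The three-sector table, the 30-cent tax, and the shifting percentages below are invented for illustration and are not taken from the Department of Commerce tables analyzed in the dissertation.

```python
import numpy as np

# Toy 3-sector direct-requirements matrix A (dollars of input i per dollar of
# output j) and value added per dollar of output; purely illustrative numbers.
sectors = ["crude oil", "refining", "other"]
A = np.array([[0.05, 0.40, 0.02],
              [0.10, 0.10, 0.08],
              [0.15, 0.20, 0.25]])
v = 1.0 - A.sum(axis=0)                        # value added so that baseline prices equal 1

# Leontief price model: p = A^T p + v + tax_cost  =>  p = (I - A^T)^{-1} (v + tax_cost)
def prices(tax_per_dollar_crude, forward_shift):
    tax_cost = np.zeros(3)
    tax_cost[0] = forward_shift * tax_per_dollar_crude   # only the shifted share enters unit cost
    return np.linalg.solve(np.eye(3) - A.T, v + tax_cost)

p0 = prices(0.0, 0.0)                           # baseline prices are all 1 by construction
for shift in (0.0, 0.6, 1.0):                   # degrees of forward shifting, 0.6 echoing the Hall comparison
    p1 = prices(0.30, shift)                    # a hypothetical 30-cent wellhead tax per dollar of crude
    change = 100 * (p1 / p0 - 1)
    print(f"shift={shift:.1f}  " +
          "  ".join(f"{s}: {c:+.2f}%" for s, c in zip(sectors, change)))
```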
80

THE ECONOMICS OF URBAN STORMWATER MANAGEMENT

MAHER, MICHAEL DAVIS January 1980 (has links)
Urban flooding and water quality problems are due to both natural phenomena and externalities related to urbanization. Development of upstream land increases the rate and amount of stormwater runoff from these areas. The consequences of increased upstream runoff are borne by downstream residents in the form of increased flooding and water pollution. Mitigation efforts should include upstream externality controls, e.g., runoff detention basins, as well as more traditional flood control measures aimed at protecting residents at the site of damage. A static constrained maximization model is developed to determine efficient trade-offs among the cost of abatement by "polluters," the cost of preventive measures taken by victims, and the damages to victims from untreated externalities. It is shown that abatement generates a public good, while prevention generally produces benefits only for the victim undertaking the action. The consequence of this fundamental difference between abatement and prevention for efficient externality policy is explored. Who should pay for abatement and prevention is discussed in terms of efficiency and equity. Because flooding in urbanizing areas is inherently a growing and irreversible externality problem, a dynamic analytical framework is also developed. Long-lived capital, e.g., buildings and streets, severely limits a completed development's ability to alter its runoff flows. The irreversibility of development makes stormwater externalities similar in nature to accumulating pollutants, e.g., mercury. Current emissions of these pollutants are controllable; the stocks accumulated from past emissions are not. Likewise, the flows of runoff from completed developments are (economically) unmodifiable. Only the additions to existing runoff flows that will be generated by subdivisions yet to be constructed are controllable. Optimal and second-best policies for controlling stormwater externalities, stock pollutants, and conventional pollutants are compared to highlight differences and similarities among these externalities. It is shown that for a runoff control policy to be efficient: (1) because of irreversibility, runoff controls should be installed during a subdivision's construction; (2) the degree of a subdivision's detention of runoff should be based on the present value of damages that its runoff, in conjunction with other developments' runoff, will generate in the present and future; and (3) the level of detention should reflect the location and timing of a development. Although runoff flows are irreversible, channelization can expand a stream's carrying capacity and reduce flooding without affecting runoff flows. However, because of economies of scale and the irreversible capital used in channelization, stream channels cannot continually be enlarged to control ever-increasing runoff flows. The problems of the timing and sizing of irreversible upstream and downstream controls are explored in depth. It is shown that (1) it is likely to be cost-effective to expand channel capacity to control some but not all future increases in runoff, (2) some detention is likely to be efficient even in periods of excess channel capacity if in future periods runoff flows will increase to such an extent that irreversible "congestion" of the channel will occur, and (3) the probabilistic nature of rainfall, and hence runoff, has significant policy implications.
An important policy guideline is that watershed-wide stormwater management is necessary, since uncoordinated community-by-community action cannot simultaneously implement runoff controls upstream and flood controls downstream. The Federal Flood Insurance program, other federal flood control policies, and stormwater management in Houston, Texas, are analyzed.
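The static trade-off at the heart of the model (abatement by the upstream "polluter" versus private prevention by downstream victims versus residual damages) can be sketched numerically. The functional forms, parameter values, and two-victim setup below are invented for illustration; the point is only that abatement reduces damages for every victim, while each victim's prevention protects that victim alone.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative functional forms (assumptions, not taken from the dissertation):
# one upstream developer chooses abatement a (e.g., detention capacity);
# two downstream victims choose private prevention p1, p2 (e.g., floodproofing).
def total_cost(x):
    a, p1, p2 = x
    abatement_cost = 2.0 * a ** 2                     # borne by the "polluter"
    prevention_cost = 1.5 * p1 ** 2 + 1.5 * p2 ** 2   # borne by each victim separately
    runoff = max(10.0 - a, 0.0)                       # abatement cuts runoff for everyone (public good)
    damage1 = runoff * np.exp(-0.5 * p1)              # prevention only protects its owner
    damage2 = 2.0 * runoff * np.exp(-0.5 * p2)
    return abatement_cost + prevention_cost + damage1 + damage2

res = minimize(total_cost, x0=np.array([1.0, 1.0, 1.0]),
               bounds=[(0, 10), (0, None), (0, None)], method="L-BFGS-B")
a, p1, p2 = res.x
print(f"efficient abatement a*={a:.2f}, prevention p1*={p1:.2f}, p2*={p2:.2f}")
print("minimized social cost:", round(res.fun, 2))
```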
