1 |
Asymptotic Methods for Pricing European Option in a Market Model With Two Stochastic Volatilities. Canhanga, Betuel. January 2016 (has links)
Modern financial engineering is a part of applied mathematics that studies market models. Each model is characterized by several parameters. Some of them are familiar to a wide audience, for example the price of a risky security or the risk-free interest rate. Other parameters are less known, for example the volatility of the security. This parameter measures the rate at which security prices change and is driven by several factors. For example, during periods of stable economic growth prices change slowly and the volatility is small, whereas during crises the volatility increases significantly. Classical market models, in particular the celebrated Nobel Prize awarded Black–Scholes–Merton model (1973), assume that the volatility remains constant during the lifetime of a financial instrument. Nowadays, in most cases, this assumption cannot adequately describe reality. We consider a model in which both the security price and the volatility are described by random functions of time, or stochastic processes. Moreover, the volatility process is modelled as a sum of two independent stochastic processes. Both of them are mean reverting in the sense that they randomly oscillate around their average values and never escape to either very small or very large values. One changes slowly and describes low-frequency effects, for example seasonal ones; the other changes fast and describes various high-frequency effects. We formulate the model as a system of a special kind of equations called stochastic differential equations. Our system includes three stochastic processes, four independent factors, and depends on two small parameters. We calculate the price of a particular financial instrument called a European call option. This financial contract gives its holder the right (but not the obligation) to buy a predefined number of units of the risky security on a predefined date at a predefined price. To solve this problem, we use the classical result of Feynman (1948) and Kac (1949): the price of the instrument is the solution of another kind of problem, a boundary value problem for a partial differential equation. The resulting equation cannot be solved analytically. Instead, we represent the solution as an expansion in the integer and half-integer powers of the two small parameters mentioned above. We calculate the coefficients of the expansion up to the second order, interpret their financial meaning, perform numerical studies, and validate our results by comparing them to known verified models from the literature. The results of our investigation can be used by both financial institutions and individual investors for optimization of their incomes.
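For concreteness, one schematic way to write such a two-scale mean-reverting stochastic volatility model is given below; the notation (mean-reversion speeds κ, long-run levels θ, small parameters ε and δ) is an illustrative assumption and is not quoted from the thesis itself:
\[
\begin{aligned}
dS_t &= r\,S_t\,dt + \sqrt{V^{(1)}_t}\,S_t\,dW^{(1)}_t + \sqrt{V^{(2)}_t}\,S_t\,dW^{(2)}_t,\\
dV^{(1)}_t &= \frac{\kappa_1}{\varepsilon}\bigl(\theta_1 - V^{(1)}_t\bigr)\,dt + \frac{\sigma_1}{\sqrt{\varepsilon}}\sqrt{V^{(1)}_t}\,dW^{(3)}_t \quad\text{(fast factor)},\\
dV^{(2)}_t &= \delta\,\kappa_2\bigl(\theta_2 - V^{(2)}_t\bigr)\,dt + \sqrt{\delta}\,\sigma_2\sqrt{V^{(2)}_t}\,dW^{(4)}_t \quad\text{(slow factor)},
\end{aligned}
\]
with three stochastic processes \((S, V^{(1)}, V^{(2)})\), four independent Wiener processes \(W^{(1)},\dots,W^{(4)}\), and two small parameters \(\varepsilon\) and \(\delta\); the option price is then sought as an expansion \(P \approx \sum_{i,j\ge 0}\varepsilon^{i/2}\,\delta^{j/2}\,P_{i,j}\) whose coefficients are computed up to second order.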
|
2 |
Online Learning of Non-Stationary Networks, with Application to Financial Data. Hongo, Yasunori. January 2012 (has links)
In this paper, we propose a new learning algorithm for non-stationary Dynamic Bayesian Networks (DBNs). Although a number of effective learning algorithms for non-stationary DBNs have previously been proposed and applied in Signal Processing and Computational Biology, those algorithms are batch learning algorithms that cannot be applied to online time-series data. Therefore, we propose a learning algorithm based on a Particle Filtering approach that can be applied to online time-series data. To evaluate our algorithm, we apply it to a simulated data set and a real-world financial data set. The results on the simulated data set show that our algorithm produces accurate estimates and detects changes. The results on the real-world financial data set reveal several features suggested in previous research, which also indicates the effectiveness of our algorithm. / Thesis
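The thesis's own algorithm is not reproduced here; the following is only a generic sketch of a bootstrap particle filter tracking a drifting parameter online, which is the basic ingredient the abstract refers to. All function and variable names are illustrative assumptions.

```python
import numpy as np

def bootstrap_particle_filter(observations, n_particles=500,
                              transition_std=0.1, obs_std=0.5, seed=0):
    """Track a latent, slowly drifting parameter theta_t from streaming data.

    Generic sketch: particles follow a random-walk transition and are
    reweighted by a Gaussian likelihood of each new observation.
    """
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, n_particles)   # initial prior draws
    estimates = []
    for y in observations:                           # one pass = online update
        # Propagate: random-walk transition for the non-stationary parameter
        particles = particles + rng.normal(0.0, transition_std, n_particles)
        # Reweight by the likelihood of the new observation
        weights = np.exp(-0.5 * ((y - particles) / obs_std) ** 2)
        weights /= weights.sum()
        # Posterior-mean estimate at the current time step
        estimates.append(np.sum(weights * particles))
        # Multinomial resampling to avoid weight degeneracy
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx]
    return np.array(estimates)

# Usage on simulated data with a structural change at t = 100
true_theta = np.concatenate([np.zeros(100), np.full(100, 2.0)])
ys = true_theta + np.random.default_rng(1).normal(0.0, 0.5, 200)
print(bootstrap_particle_filter(ys)[[90, 110, 190]])  # drifts from about 0 toward 2
```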
|
3 |
Analysing potato price volatility in South Africa. Moabelo, Julith Tsebisi. January 2019 (has links)
Thesis (M.Sc. (Agricultural Economics)) -- University of Limpopo, 2019. / Potato is perceived as an excellent crop in the fight against hunger and poverty. The recent high potato price in South Africa has pushed the vegetable out of reach of the poorest of the poor. The study attempts to analyse potato price volatility in South Africa and to assess how various factors contributed to the recent volatility. Quarterly data on the potato price, the number of hectares planted, rainfall and temperature from 2006q1 to 2017q4 were collected from various sources, giving a total of 48 observations.
Volatility in the series was examined using an ARCH/GARCH model. The GARCH model indicates evidence of a GARCH effect in the series, meaning that volatility clustering is present in South African potato prices. The Johansen cointegration test used both the trace and maximum eigenvalue statistics to test for a long-run relationship between the potato price and the other variables. The cointegration results were positive, indicating a long-run relationship among the variables. The study further used the Johansen procedure, together with standard errors, to determine which variables enter the long-run cointegrating relationship. The results indicated that the number of hectares planted and the rainfall level have a significant relationship with the potato price. A Wald test was used to check whether past values of the number of hectares planted and the rainfall level influence the current potato price; the results showed no evidence of short-run causality running from these variables to the potato price. An error correction model (ECM) was used to forecast potato price fluctuations in South Africa.
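As a rough sketch of this kind of workflow (the study's exact specification and data are not reproduced here; the simulated series, variable names, and the use of the `arch` and `statsmodels` packages are assumptions), the volatility and cointegration steps might look like this:

```python
import numpy as np
import pandas as pd
from arch import arch_model
from statsmodels.tsa.vector_ar.vecm import coint_johansen

# Hypothetical quarterly data with variables like those named in the study
# (potato price, hectares planted, rainfall), 2006q1-2017q4 = 48 observations.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "price": 100 + np.cumsum(rng.normal(0, 5, 48)),
    "hectares": 50 + rng.normal(0, 2, 48),
    "rainfall": 400 + rng.normal(0, 30, 48),
}, index=pd.period_range("2006Q1", periods=48, freq="Q"))

# Step 1: GARCH(1,1) on price changes to test for volatility clustering
returns = 100 * df["price"].pct_change().dropna().to_numpy()
garch_fit = arch_model(returns, vol="GARCH", p=1, q=1).fit(disp="off")
print(garch_fit.params)          # significant ARCH/GARCH terms suggest a GARCH effect

# Step 2: Johansen cointegration test (trace and maximum eigenvalue statistics)
johansen = coint_johansen(df[["price", "hectares", "rainfall"]],
                          det_order=0, k_ar_diff=1)
print(johansen.lr1)              # trace statistics
print(johansen.lr2)              # maximum eigenvalue statistics
```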
The study recommends that farmers engage in contract marketing to minimize the risk posed by potato price volatility. The Department of Agriculture should forecast agricultural commodity price volatility and make this information accessible to farmers so that they can adopt strategies to cope with crises.
|
4 |
Essays on housing and monetary policy. Nam, Min-Ho. January 2013 (has links)
This thesis, motivated by my reflections on the failings of monetary policy implementation as a cause of the sub-prime crisis, attempts to answer the following questions: (i) whether interest rates have played a major role in generating the house price fluctuations in the U.S., (ii) what the effects of accommodative monetary policy on the economy are, given banks' excessive risk-taking, and (iii) whether an optimal monetary policy rule can be found for curbing credit-driven economic volatility in a model economy with unconventional transmission channels operating. Using a decomposition technique and regression analysis, the thesis shows that short-term interest rates exert the most potent influence on the evolution of the volatile components of housing prices. One possible explanation is that low policy rates maintained for a prolonged period tend to encourage bankers to take on more risk in lending. This transmission channel, labelled the risk-taking channel, accounts to some extent for the gap between the forecast and the actual impact of monetary policy on the housing market and the overall economy. A looser monetary policy stance can also shift the preference of economic agents toward housing, as corroborated theoretically and empirically in the context of the choice between durable and non-durable goods. This transmission route is termed the preference channel. If these two channels are operative in the economy, policy makers need to react aggressively to rapid credit growth in order to stabilize the paths of housing prices and output. These findings carry meaningful implications for monetary policy implementation. First, central bankers should strive to identify, in a timely fashion, newly emerging and state-dependent transmission channels of monetary policy, and to assess accurately the impact of policy decisions transmitted through these channels. Second, intervention by central banks in the credit or housing market through adjustments of policy rates can be optimal, relative to inaction, in circumstances where banks' risk-taking and the preference for housing are overly exuberant.
|
5 |
Um modelo coerente de gerenciamento de risco de liquidez para o contexto brasileiro. Mastella, Mauro. January 2008 (has links)
The aim of this dissertation is to develop a liquidity risk management model that relaxes the main simplifications usually made by financial institutions when applying stress tests for liquidity risk management. The research consists of estimating a unified cash flow of an institution under different economic scenarios, testing whether option implied volatilities are a good indicator of significant changes in the Brazilian capital market, such as breaks in historical correlations, and evaluating the effect of using a non-parallel shift of the term structure of interest rates in a stress test. To achieve these objectives, an exploratory study was carried out using the case study method. The work uses a factor analysis model to evaluate the movements of the Brazilian term structure of interest rates, a practical tool to capture the information provided by implied volatility, and a correlation matrix among the variables of the economic scenario used for the stress test. The conclusions are: i) adjusting the economic scenario according to the correlation matrix of its variables makes the test coherent with its economic context and significantly affects the projected future cash flows of financial institutions; ii) the factor analysis results indicate that a non-parallel shift of the term structure is of little practical importance for improving the coherence of stress tests, because the explanatory power of its variability is highly concentrated in the level factor, while the slope and curvature factors have low explanatory power, especially at the short maturities; iii) implied volatility can be used as a proxy for future volatility, despite its limitations. After applying the model to a financial institution, a decision grid was suggested for use in liquidity risk management. / The aim of this dissertation is to develop a framework for liquidity risk management that relaxes the main simplifications made by financial institutions when using stress tests to manage their liquidity risk. This research consists of estimating a financial institution's cash flow under different scenarios, testing whether options' implied volatilities are a good proxy for significant future changes in the Brazilian capital market (such as shifts in historical correlations), and evaluating the use of a non-parallel yield curve shift in the stress test process. To reach the desired objectives, an exploratory study was developed through the case study method. This work therefore uses a factor analysis model to evaluate movements of the Brazilian yield curve term structure, a practical tool to capture the information provided by implied volatility, and a correlation matrix among the scenario variables used for stress testing. The conclusions were: i) the correlation-adjusted scenario adds coherence to the stress test process; ii) a non-parallel yield curve shift adds little coherence, because most of its variance is explained by the level factor and only a little by the slope and curvature factors; iii) options' implied volatility can be used as a proxy for future volatility. After applying this model to a financial institution's cash flow, a decision grid was developed for use in liquidity risk management.
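As an illustration of the yield curve factor analysis step described above (a sketch only; the maturities, simulated data, and use of scikit-learn are assumptions, not the dissertation's actual implementation):

```python
import numpy as np
from sklearn.decomposition import PCA

# Simulated panel of daily yields at several maturities (in years)
rng = np.random.default_rng(42)
maturities = np.array([0.25, 0.5, 1, 2, 3, 5, 10])
level = 0.10 + 0.01 * np.cumsum(rng.normal(0, 0.1, 500))[:, None]
slope = 0.02 * rng.normal(0, 1, (500, 1)) * (maturities / 10)
yields = level + slope + rng.normal(0, 0.001, (500, len(maturities)))

# Factor analysis via PCA on daily yield-curve changes
pca = PCA(n_components=3)
pca.fit(np.diff(yields, axis=0))
print(pca.explained_variance_ratio_)
# Typically the first ("level") factor dominates the explained variance,
# which is the basis for the finding that a non-parallel shift of the
# term structure adds little to the coherence of the stress test.
```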
|
8 |
Measuring, Modeling, and Forecasting Volatility and Correlations from High-Frequency Data. Vander Elst, Harry-Paul. 20 May 2016
This dissertation contains four essays that all share a common purpose: developing new methodologies to exploit the potential of high-frequency data for the measurement, modeling and forecasting of the volatility and correlations of financial assets. The first two chapters provide useful tools for univariate applications, while the last two chapters develop multivariate methodologies. In chapter 1, we introduce a new class of univariate volatility models named FloGARCH models. FloGARCH models provide a parsimonious joint model for low-frequency returns and realized measures, and are sufficiently flexible to capture long memory as well as asymmetries related to leverage effects. We analyze the performance of the models in a realistic numerical study and on the basis of a data set composed of 65 equities. Using more than 10 years of high-frequency transactions, we document significant statistical gains related to the FloGARCH models in terms of in-sample fit, out-of-sample fit and forecasting accuracy compared to classical and Realized GARCH models. In chapter 2, using 12 years of high-frequency transactions for 55 U.S. stocks, we argue that combining low-frequency exogenous economic indicators with high-frequency financial data improves the ability of conditionally heteroskedastic models to forecast the volatility of returns, their full multi-step-ahead conditional distribution and the multi-period Value-at-Risk. Using a refined version of the Realized LGARCH model allowing for a time-varying intercept and implemented with realized kernels, we document that nominal corporate profits and term spreads have strong long-run predictive ability and generate accurate risk-measure forecasts over long horizons. The results are based on several loss functions and tests, including the Model Confidence Set. Chapter 3 is a joint work with David Veredas. We study the class of disentangled realized estimators for the integrated covariance matrix of Brownian semimartingales with finite-activity jumps. These estimators separate correlations and volatilities. We analyze different combinations of quantile- and median-based realized volatilities, and four estimators of realized correlations with three synchronization schemes. Their finite-sample properties are studied under four data generating processes, in the presence and absence of microstructure noise, and under synchronous and asynchronous trading. The main finding is that the pre-averaged version of disentangled estimators based on Gaussian ranks (for the correlations) and median deviations (for the volatilities) provides a precise, computationally efficient, and easy alternative for measuring integrated covariances on the basis of noisy and asynchronous prices. Along these lines, a minimum variance portfolio application shows the superiority of this disentangled realized estimator in terms of numerous performance metrics. Chapter 4 is co-authored with Niels S. Hansen, Asger Lunde and Kasper V. Olesen, all affiliated with CREATES at Aarhus University. We propose to use the Realized Beta GARCH model to exploit the potential of high-frequency data in commodity markets. The model produces high-quality forecasts of pairwise correlations between commodities which can be used to construct a composite covariance matrix. We evaluate the quality of this matrix in a portfolio context and compare it to models used in the industry. We demonstrate significant economic gains in a realistic setting including short-selling constraints and transaction costs.
/ Doctorate in Economics and Management Sciences
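The estimators developed in the dissertation are considerably more refined; the snippet below is only a baseline sketch of the realized measures that such models build on (the 5-minute sampling frequency, simulated data, and all names are assumptions):

```python
import numpy as np

def realized_covariance(intraday_returns):
    """Baseline realized covariance: sum of outer products of
    synchronized intraday return vectors over one trading day."""
    r = np.asarray(intraday_returns)            # shape (n_intervals, n_assets)
    return r.T @ r

# Simulated 5-minute returns for two assets over one 6.5-hour session (78 intervals)
rng = np.random.default_rng(7)
true_cov = np.array([[1.0, 0.4], [0.4, 2.0]]) * 1e-6
returns = rng.multivariate_normal([0.0, 0.0], true_cov, size=78)

rcov = realized_covariance(returns)
rvol = np.sqrt(np.diag(rcov))                   # realized volatilities
rcorr = rcov[0, 1] / (rvol[0] * rvol[1])        # realized correlation
print(rcov, rvol, rcorr)
```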
|
9 |
Modélisation et gestion sur les marchés obligatoires souverains / Modelling and management within sovereign bonds markets. Moungala, Wilfried Paterne. 29 April 2013
The financial crisis of recent years has reopened the debate on the supposedly "default-risk-free" character of sovereign bonds. Faced with these economic and financial challenges, credit institutions and financial institutions have had to revise their methods for valuing bonds. The objective of this thesis is the modelling and management of bond prices, and it is organized around four points. In the first point, we present the theoretical approaches underlying the traditional models of interest rates. In the second point, we construct a test model, named M-M, by discretizing continuous-time models of the short-term interest rate and drawing on models of the GARCH family. This model is built by incorporating both the level effect of short-term interest rates and a GARCH(1,1) effect. The estimation results for the M-M model suggest that both effects need to be taken into account when modelling the yields on U.S. Treasury bills. The third point consists in extracting factors that can be interpreted as level, slope and curvature. These factors are extracted from two models that are dynamic extensions of the Nelson and Siegel functional form. The yield curves used are those of the United States, France and South Africa. South Africa is included in this study because we wished to treat the term structure of interest rates of an African country, and also because of its emerging economy. Using proxies and a principal component analysis of the yield curves of these three countries, the factors were analysed on the basis of their goodness of fit. The final point addresses the macroeconomic and financial indicators that can explain the extracted endogenous factors.
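For reference, the static Nelson and Siegel functional whose dynamic extensions are used in the third part can be written in the standard textbook form below (the thesis's exact parameterization is not reproduced here):
\[
y_t(\tau) \;=\; L_t \;+\; S_t\,\frac{1 - e^{-\lambda\tau}}{\lambda\tau} \;+\; C_t\left(\frac{1 - e^{-\lambda\tau}}{\lambda\tau} - e^{-\lambda\tau}\right),
\]
where \(y_t(\tau)\) is the yield at maturity \(\tau\), the time-varying coefficients \(L_t\), \(S_t\) and \(C_t\) are interpreted as the level, slope and curvature factors, and \(\lambda\) governs the exponential decay of the factor loadings across maturities.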
|