341 |
[en] VAR EVALUATION OF EMERGING AND DEVELOPED MARKETS VIA DYNAMIC COPULA MODELS / [pt] AVALIAÇÃO DE VAR DE MERCADOS EMERGENTES E DESENVOLVIDOS VIA MODELOS DE CÓPULAS DINÂMICAS. FLAVIO LUCIO DE OLIVEIRA COELHO, 30 August 2013 (has links)
[pt, translated] This dissertation investigates how the subprime crisis impacted the dependence structure between emerging and developed markets, using the MSCI (Morgan Stanley Capital International) indices as proxies for these markets. The proposed methodology is based on constructing bivariate distributions through conditional copulas. The marginal distribution of each index is obtained by fitting univariate GARCH models, and dependence is modelled with the normal, normal GAS (Generalised Autoregressive Score) and symmetric Joe-Clayton copulas, with parameters either fixed (static form) or time-varying (dynamic form). The results show that the time-varying normal GAS copula with a structural break is the most adequate for capturing the dependence between emerging- and developed-market returns. Through this framework it can be verified that correlation and tail-dependence measures between emerging and developed markets increased significantly during the subprime crisis. Finally, the fit of the various copulas proposed here was evaluated via VaR (Value at Risk), and the normal GAS copula showed the best fit. / [en] The aim of this dissertation is to analyze how the subprime crisis impacted the dependence structure among emerging and developed markets, using the MSCI (Morgan Stanley Capital International) market indices as proxies for these markets. The proposed methodology is based on the construction of bivariate distributions via conditional copulas. The marginal distribution of each index is obtained from univariate GARCH models, and dependence is captured via the following copulas: normal, normal GAS (Generalised Autoregressive Score) and symmetric Joe-Clayton, considering both fixed parameters (static framework) and time-varying parameters (dynamic framework). Our results show that the normal GAS copula with structural break was the most adequate to capture dependence between the returns of emerging and developed markets. Through the proposed framework it was possible to infer that correlation and tail-dependence measures between these markets increased sharply during the subprime crisis. Finally, VaR (Value at Risk) coverage was used as a goodness-of-fit measure, and on this metric the normal GAS copula also outperformed the others.
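The copula step of such a methodology is compact enough to sketch. Below is a minimal Python illustration (a sketch, not the author's code) of the static Gaussian-copula stage: the GARCH-standardized residuals are replaced by simulated data, and the rank transform, normal-scores correlation and empirical tail-dependence measure are generic textbook estimators.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 1500
# Stand-ins for the GARCH-standardized residuals of two index return series
# (the dissertation obtains these from fitted univariate GARCH models).
z = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], size=n)

# Probability integral transform via the empirical CDF (ranks).
u = stats.rankdata(z, axis=0) / (n + 1)

# Static Gaussian copula: map uniforms to normal scores, estimate correlation.
scores = stats.norm.ppf(u)
rho = np.corrcoef(scores.T)[0, 1]
print(f"estimated copula correlation: {rho:.3f}")

# Crude empirical lower-tail dependence at level q: P(U1 < q, U2 < q) / q.
q = 0.05
tail = np.mean((u[:, 0] < q) & (u[:, 1] < q)) / q
print(f"empirical lower-tail dependence at {q:.0%}: {tail:.3f}")
```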
|
342 |
風險基礎資本與涉險值運用在保險監理上之比較 / The Comparison of RBC and VaR in Insurance Regulation. 林姿婷 (Lin, Tzy-Ting), Unknown Date (has links)
Ensuring insurers' solvency is the primary goal of insurance regulators, who use a variety of regulatory regimes to safeguard insurers' financial strength and to protect policyholders from losses caused by insolvency. Among these regimes, RBC regulation measures an insurer's capital adequacy and gives the regulator a rule for taking supervisory action, while VaR regulation is a new approach that banking supervisors are currently attempting to adopt; VaR is also widely used in banks' internal risk management systems. Judging from the direction of banking supervision, insurance regulation can be expected to move toward a VaR-based regime as well.
The main purpose of this study is to examine the feasibility of applying VaR regulation to insurance and to compare it with the current RBC regime. Before examining feasibility, the study first discusses the preconditions for applying VaR regulation to insurance and the causes of insurer insolvency.
Having identified those causes, the study examines how, under a VaR regime, an insurer would determine the capital to be held against each type of risk factor. Based on the literature and the characteristics of the insurance industry, the study suggests that capital for market risk and underwriting risk can be computed with VaR; since no sufficiently mature quantitative model yet exists for credit risk, that risk should be handled through credit investigation, while business risk should be covered by a fixed-ratio capital charge. The study also suggests that once insurers have accumulated enough experience with VaR, the regime could permit the precommitment approach.
In applying VaR to insurance regulation, the study advises regulators to pay attention to implementation risk and model risk, and stresses that supervisory examination and market discipline are necessary conditions for a VaR regime to function fully. Moreover, since the usability of models and data cannot be verified in the early stage of implementation, a minimum fixed-ratio requirement must still be retained to safeguard insurers' solvency.
After assessing feasibility, the study compares VaR with the current RBC regime along three dimensions: ease of implementation, accuracy in measuring capital adequacy, and regulatory cost. Ease of implementation covers the complexity and feasibility of each regime and its degree of integration with insurers' internal risk management and with global trends in financial supervision. Accuracy covers which regime better measures the various risks an insurer faces, its power in assessing solvency, and the diversification effect of the insurer's investment portfolio. Regulatory cost is examined from the perspectives of the regulator, the insurer and society.
The comparison shows that the VaR regime is superior to RBC in every respect except complexity and feasibility. In addition, VaR and RBC each carry their own regulatory moral hazard. As in banking supervision, the study suggests that insurance regulation should move toward a VaR-based regime, the better to safeguard insurers' solvency and the interests of the insuring public. / Assuring insurance company solvency has always been the focal point of insurance regulation. Regulators use various methods to promote insurers' financial strength and protect policyholders from losses due to insolvency. Among these methods, risk-based capital (RBC) is used to measure the insurer's capital adequacy and provide the relative action rule for the regulator, while VaR (value-at-risk) regulation is a new regulatory approach that bank regulators are attempting to adopt. Besides the regulatory application, VaR is also used broadly in banks' internal risk management systems. Given the development of banking regulation, VaR-type regulation can be expected to become the new insurance regulation in the future.
The methodology this study adopts is a literature review. Its most important purpose is to explore the feasibility of VaR-type insurance regulation and to compare VaR regulation with the current RBC regulation. Before examining the regulatory systems, the study first discusses the preconditions for applying VaR regulation and the causes of insurer insolvency.
For the purpose of developing a VaR-type capital requirement in insurance regulation, this study proposes that the capital requirements for market and underwriting risk can be calculated directly with VaR, while the capital requirements for credit and business risk should be set as a fixed-rate capital charge. This study also proposes applying the precommitment approach once the regulator is assured that insurers have accumulated sufficient experience with VaR. In addition, the study addresses some points of attention for VaR-based insurance regulation.
The other purpose of this study is to compare RBC and VaR in terms of regulatory implementation, solvency measurement, and regulatory cost. The results indicate that VaR is superior to RBC in every aspect except complexity and feasibility. In addition, VaR and RBC each have their own regulatory moral hazard. This study suggests that, as in other areas of financial regulation, VaR should be used in insurance regulation in the future.
|
343 |
公司規模效果之涉險值研究 / A Value-at-Risk Study of the Firm-Size Effect. 林建秀 (Lin, Chien-Hsiu), Unknown Date (has links)
This thesis uses Value at Risk (VaR) estimation to measure the relationship between portfolio risk and the size effect. Under all three estimation approaches (historical simulation, the variance-covariance method and extreme value theory), the potential loss of the small-firm strategy portfolio exceeds that of the large-firm strategy portfolio. From the VaR estimates we draw the following conclusion: the size premium is highly correlated with risk. Because the small-firm strategy portfolio is riskier than the large-firm strategy portfolio, it must offer a risk premium higher than that of the large-firm portfolio. An investor who buys the small-firm portfolio and sells the large-firm portfolio bears higher risk, so the performance achieved in excess of the market serves to compensate for the risk borne. This result supports the rational asset pricing explanation.
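As a rough illustration of the comparison described above, the sketch below computes historical-simulation VaR for two hypothetical portfolios; the simulated return series are invented stand-ins for the small-firm and large-firm strategy portfolios, not the thesis's data.

```python
import numpy as np

rng = np.random.default_rng(1)
# Invented stand-ins: the small-firm portfolio is more volatile and fatter-tailed.
small = rng.standard_t(df=4, size=2500) * 0.020
large = rng.standard_t(df=8, size=2500) * 0.012

def hist_var(returns, alpha=0.99):
    """Historical-simulation VaR: the alpha-level quantile of the loss distribution."""
    return -np.quantile(returns, 1 - alpha)

for name, r in [("small-firm", small), ("large-firm", large)]:
    print(f"{name} 99% one-day VaR: {hist_var(r):.4f}")
```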
|
344 |
En undersökning av VaR-modeller med Kupiecs Backtest / An Examination of VaR Models Using Kupiec's Backtest. Runer, Carl-Johan; Linzander, Martin. January 2009 (has links)
SUMMARY: The performance of Historical Simulation, Delta-Normal and RiskMetrics is evaluated using Kupiec's backtest. Value at Risk (VaR) is computed at three confidence levels for the Affärsvärlden general index (Affärsvärldens Generalindex) and the HSBC copper index. Based on violations relative to actual outcomes, the study examines which VaR model estimates market risk best. The models' performance is compared, and the analysis investigates how the confidence level and the characteristics of the assets affect their performance. The results show that Historical Simulation performs better than Delta-Normal and RiskMetrics at the highest confidence level, most likely because RiskMetrics and Delta-Normal assume normally distributed returns. RiskMetrics and Delta-Normal, however, perform better than Historical Simulation at the lowest confidence level, probably because Historical Simulation adapts more slowly to changes in volatility. The study also indicates that the decay factor used by RiskMetrics has less effect at higher confidence levels, which is why the difference between Delta-Normal and RiskMetrics is marginal at those levels.
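Kupiec's proportion-of-failures (POF) test, the backtest used throughout the study, is simple to state in code: under the null hypothesis, the violation count x out of T days is Binomial(T, p) with p = 1 - CL. A minimal sketch, with invented example numbers:

```python
import numpy as np
from scipy import stats

def kupiec_pof(x, T, p):
    """Kupiec (1995) POF likelihood-ratio test for x VaR violations in T days."""
    phat = x / T
    if x in (0, T):  # guard the degenerate cases where phat is 0 or 1
        lr = -2 * (T - x) * np.log(1 - p) - 2 * x * np.log(p)
    else:
        lr = -2 * ((T - x) * np.log((1 - p) / (1 - phat))
                   + x * np.log(p / phat))
    pval = 1 - stats.chi2.cdf(lr, df=1)  # asymptotically chi-square, 1 df
    return lr, pval

# Example: 12 violations in 1000 days against a 99% VaR (about 10 expected).
lr, pval = kupiec_pof(12, 1000, 0.01)
print(f"LR = {lr:.3f}, p-value = {pval:.3f}")  # large p-value: do not reject
```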
|
345 |
Essays on Modelling and Forecasting Financial Time Series. Coroneo, Laura, 28 August 2009 (has links)
This thesis is composed of three chapters which propose novel approaches to modelling and forecasting financial time series. The first chapter focuses on high-frequency financial returns and proposes a quantile regression approach to model their intraday seasonality and dynamics. The second chapter deals with the problem of forecasting the yield curve using large datasets of macroeconomic information, while the last chapter addresses the issue of modelling the term structure of interest rates.
The first chapter investigates the distribution of high-frequency financial returns, with special emphasis on intraday seasonality. Using quantile regression, I show how the probability law expands and contracts through the day for three years of stock returns sampled at 15-minute intervals. Returns are more dispersed and less concentrated around the median in the hours near the opening and closing. I provide intraday value-at-risk assessments and show how they adapt to changes in dispersion over the day. Tests performed on the out-of-sample forecasts of the value at risk show that the model provides good risk assessments and outperforms standard Gaussian and Student's t GARCH models.
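A rough sketch of the quantile-regression idea (not the chapter's exact specification, and assuming the statsmodels package is available): regress the 5% return quantile on time-of-day dummies, so that the VaR estimate can widen near the open and close. The simulated returns below have a U-shaped intraday volatility.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
bins_per_day, days = 26, 100          # 15-minute bins over a 6.5-hour session
slot = np.tile(np.arange(bins_per_day), days)
# Simulated returns with U-shaped intraday volatility (high at open and close).
vol = 0.001 * (1 + np.exp(-slot / 3) + np.exp(-(bins_per_day - 1 - slot) / 3))
r = rng.normal(0.0, vol)

# Time-of-day dummies (first bin is the baseline), then a 5% quantile regression.
dummies = (np.arange(bins_per_day)[:, None] == slot).T.astype(float)[:, 1:]
X = sm.add_constant(dummies)
res = sm.QuantReg(r, X).fit(q=0.05)

var_5 = -res.fittedvalues             # intraday VaR as the negated 5% quantile
print("5% VaR, opening bin:", round(float(var_5[slot == 0].mean()), 5))
print("5% VaR, midday bin :", round(float(var_5[slot == 13].mean()), 5))
```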
The second chapter shows that macroeconomic indicators are helpful in forecasting the yield curve. I incorporate a large number of macroeconomic predictors within the Nelson and Siegel (1987) model for the yield curve, which can be cast in a common factor model representation. Rather than including macroeconomic variables as additional factors, I use them to extract the Nelson and Siegel factors. Estimation is performed by the EM algorithm and the Kalman filter using a dataset composed of 17 yields and 118 macro variables. Results show that incorporating large macroeconomic information sets improves the accuracy of out-of-sample yield forecasts at medium and long horizons.
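The Nelson-Siegel cross-section at the heart of this factor representation can be sketched compactly: for a fixed decay parameter lambda, the level, slope and curvature factors are recovered by OLS on the loadings. The yields and maturities below are illustrative, lambda is the common Diebold-Li choice for monthly maturities, and the sketch omits the chapter's EM/Kalman machinery.

```python
import numpy as np

def ns_loadings(tau, lam=0.0609):
    """Nelson-Siegel loadings for level, slope and curvature at maturities tau."""
    x = lam * tau
    slope = (1 - np.exp(-x)) / x
    return np.column_stack([np.ones_like(tau), slope, slope - np.exp(-x)])

tau = np.array([3, 6, 12, 24, 36, 60, 84, 120], dtype=float)   # months
y = np.array([4.2, 4.4, 4.7, 5.0, 5.2, 5.5, 5.6, 5.8])         # yields, percent

# Cross-sectional OLS fit of the three factors for one observation date.
beta, *_ = np.linalg.lstsq(ns_loadings(tau), y, rcond=None)
print("level, slope, curvature factors:", beta.round(3))
```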
The third chapter statistically tests whether the Nelson and Siegel (1987) yield curve model is arbitrage-free. Theoretically, the Nelson-Siegel model does not ensure the absence of arbitrage opportunities. Still, central banks and public wealth managers rely heavily on it. Using a non-parametric resampling technique and zero-coupon yield curve data from the US market, I find that the no-arbitrage parameters are not statistically different from those obtained from the Nelson and Siegel model, at a 95 percent confidence level. I therefore conclude that the Nelson and Siegel yield curve model is compatible with arbitrage-freeness.
|
346 |
Essays on banking, credit and interest rates. Roszbach, Kasper, January 1998 (has links)
This dissertation consists of four papers, each applying a discrete dependent variable model, censored regression or duration model to a credit market phenomenon or monetary policy question. The first three essays deal with bank lending policy, while the last one studies interest rate policy by central banks. In the first essay, a bivariate probit model is estimated to contrast the factors that influence banks' loan-granting decisions and individuals' risk of default. This model is used as a tool to construct a Value at Risk measure of the credit risk in a portfolio of consumer loans and to investigate the efficiency of bank lending policy. The second essay takes the conclusions of the first paper as a starting point and investigates whether the fact that banks do not minimize default risk can be explained by a return-maximization policy. For this purpose, a Tobit model with sample selection effects and variable censoring limits is developed and estimated on the survival times of consumer loans. The third paper focuses on dormancy, instead of default risk or survival time, as the most important factor affecting risk and return in bank lending. By means of a duration model, the factors determining the transition from active status to dormancy are studied. The estimated model is used to predict the expected durations to dormancy and to analyze the expected profitability for a sample of loan applicants. In the fourth paper, the discrete nature of central bank interest rate policy is studied. A grouped data model, which can take into account the long periods without changes in the Central Bank's repo rate, is estimated on weekly Swedish data. The model is found to be reasonably good at predicting interest rate changes. / Diss. (summary) Stockholm: Handelshögskolan (Stockholm School of Economics).
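To give a flavour of the first essay's end product, the sketch below reads a portfolio credit VaR off a simulated loss distribution once per-loan default probabilities are available. The one-factor Gaussian dependence, the PDs and the exposures are invented for the illustration; they stand in for, and are not, the paper's bivariate-probit estimates.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_loans, n_sims, rho = 500, 5000, 0.15
pd_i = rng.uniform(0.01, 0.08, n_loans)        # hypothetical per-loan PDs
ead = rng.uniform(1e3, 1e4, n_loans)           # hypothetical exposures
thresh = stats.norm.ppf(pd_i)                  # default thresholds per loan

z = rng.standard_normal((n_sims, 1))           # common systematic factor
eps = rng.standard_normal((n_sims, n_loans))   # idiosyncratic shocks
defaults = (np.sqrt(rho) * z + np.sqrt(1 - rho) * eps) < thresh
losses = defaults.astype(float) @ ead          # assumes 100% loss given default
print(f"99% credit VaR: {np.quantile(losses, 0.99):,.0f}")
print(f"expected loss:  {losses.mean():,.0f}")
```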
|
347 |
Risk Measurement, Management and Option Pricing via a New Log-normal Sum Approximation Method. Zeytun, Serkan, 01 October 2012 (has links) (PDF)
In this thesis we focus on the use of the Conditional Value-at-Risk (CVaR) in risk management and on the pricing of arithmetic-average basket and Asian options in the Black-Scholes framework via a new log-normal sum approximation method. Firstly, we work with the linearization procedure for CVaR proposed by Rockafellar and Uryasev. We construct an optimization problem with the objective of maximizing expected return under a CVaR constraint. Because we allow for intermediate payments, we have to deal with a reinvestment problem which turns the originally one-period problem into a multi-period one. To solve this multi-period problem, we use the linearization procedure for CVaR and develop an iterative scheme based on linear optimization. The numerical results obtained from the solution of this problem uncover some surprising weaknesses of Value-at-Risk (VaR) and CVaR as risk measures.
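The linearization referred to above can be made concrete. The sketch below casts "maximize expected return subject to a CVaR constraint" as a linear program over return scenarios, following the Rockafellar-Uryasev construction with auxiliary variables t and u_j; the scenario data and parameter values are invented, and this one-period LP omits the thesis's reinvestment iteration.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(4)
n, m, alpha, cap = 400, 3, 0.95, 0.03        # scenarios, assets, level, CVaR cap
R = rng.normal([0.0005, 0.0008, 0.0012], [0.005, 0.01, 0.02], size=(n, m))

k = 1.0 / ((1 - alpha) * n)
# Variable layout: x = [w_1..w_m, t, u_1..u_n]; loss in scenario j is -R_j @ w.
c = np.concatenate([-R.mean(axis=0), [0.0], np.zeros(n)])      # maximize mean return
A_cvar = np.concatenate([np.zeros(m), [1.0], np.full(n, k)])   # t + k*sum(u) <= cap
A_u = np.hstack([-R, -np.ones((n, 1)), -np.eye(n)])            # u_j >= -R_j @ w - t
A_ub = np.vstack([A_cvar, A_u])
b_ub = np.concatenate([[cap], np.zeros(n)])
A_eq = np.concatenate([np.ones(m), [0.0], np.zeros(n)])[None, :]  # full investment
bounds = [(0, None)] * m + [(None, None)] + [(0, None)] * n

sol = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=bounds)
print("optimal weights:", sol.x[:m].round(3), " auxiliary t:", round(sol.x[m], 5))
```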
In the next step, we extend the problem by including liabilities and quantile hedging to obtain a reasonable formulation for managing liquidity risk. In this formulation the investor's objective is to maximize the probability that liquid assets minus liabilities exceed a threshold level, which is a type of quantile hedging. Since quantile hedging is not a perfect hedge, there is a non-zero probability that the liability value exceeds the asset value. To control the size of the probable shortfall we use a CVaR constraint. In the Black-Scholes framework, solving this problem requires dealing with sums of log-normal random variables, and it is known that a sum of log-normal random variables has no closed-form distribution. We introduce a new, simple and highly efficient method to approximate the sum of log-normal random variables using shifted log-normal distributions; the method is based on a limiting approximation of the arithmetic mean by the geometric mean. Using our new approximation method we reduce the quantile hedging problem to a simpler optimization problem.
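For intuition, here is the classical two-moment (Fenton-Wilkinson) log-normal match for a sum of independent log-normals, checked against Monte Carlo. This is a standard textbook surrogate, not the shifted-log-normal, geometric-mean-based method introduced in the thesis, but it illustrates the idea of replacing the intractable sum with a tractable distribution.

```python
import numpy as np
from scipy import stats

def fenton_wilkinson(mus, sigmas):
    """Match the mean and variance of a sum of independent LN(mu_i, sigma_i^2)."""
    m = np.exp(mus + sigmas**2 / 2)              # component means
    v = (np.exp(sigmas**2) - 1) * m**2           # component variances
    mean_s, var_s = m.sum(), v.sum()
    sigma2 = np.log(1 + var_s / mean_s**2)       # matched log-variance
    return np.log(mean_s) - sigma2 / 2, np.sqrt(sigma2)

mus = np.array([0.0, 0.1, -0.2])
sigmas = np.array([0.25, 0.30, 0.20])
mu, sigma = fenton_wilkinson(mus, sigmas)

# Monte Carlo check of the surrogate at the 95th percentile.
rng = np.random.default_rng(5)
s = np.exp(mus + sigmas * rng.standard_normal((200_000, 3))).sum(axis=1)
print(f"MC 95% quantile:         {np.quantile(s, 0.95):.3f}")
print(f"log-normal 95% quantile: {np.exp(mu + sigma * stats.norm.ppf(0.95)):.3f}")
```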
Our new log-normal sum approximation method can also be used to price some options in the Black-Scholes model. With the help of the approximation method we derive closed-form approximation formulas for the prices of basket and Asian options based on arithmetic averages. Using the approximation methodology combined with the new analytical pricing formulas for arithmetic-average options, we obtain very efficient Monte Carlo pricing in a control-variate setting. Our numerical results show that this control-variate method outperforms the well-known methods from the literature in some cases.
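The control-variate mechanics can be sketched with the discrete geometric Asian option, whose Black-Scholes price is known in closed form, as the control; the thesis instead uses its own approximation formulas as the control, but the variance-reduction recipe is the same. All parameter values below are illustrative.

```python
import numpy as np
from scipy import stats

S0, K, r, sigma, T, n, n_paths = 100.0, 100.0, 0.05, 0.2, 1.0, 12, 50_000
dt = T / n
rng = np.random.default_rng(6)

# Exact price of the discrete geometric Asian call: the log of the geometric
# average is normal with mean m and variance v below.
ti = dt * np.arange(1, n + 1)
m = np.log(S0) + (r - sigma**2 / 2) * ti.mean()
v = sigma**2 * T * (n + 1) * (2 * n + 1) / (6 * n**2)
d1 = (m + v - np.log(K)) / np.sqrt(v)
geo_exact = np.exp(-r * T) * (np.exp(m + v / 2) * stats.norm.cdf(d1)
                              - K * stats.norm.cdf(d1 - np.sqrt(v)))

# Simulate paths and evaluate both payoffs on the same paths.
z = rng.standard_normal((n_paths, n))
logS = np.log(S0) + np.cumsum((r - sigma**2 / 2) * dt
                              + sigma * np.sqrt(dt) * z, axis=1)
S = np.exp(logS)
pay_arith = np.exp(-r * T) * np.maximum(S.mean(axis=1) - K, 0)
pay_geo = np.exp(-r * T) * np.maximum(np.exp(logS.mean(axis=1)) - K, 0)

b = np.cov(pay_arith, pay_geo)[0, 1] / pay_geo.var()   # optimal CV coefficient
cv = pay_arith - b * (pay_geo - geo_exact)
print(f"plain MC:         {pay_arith.mean():.4f} +/- {pay_arith.std() / np.sqrt(n_paths):.4f}")
print(f"control variate:  {cv.mean():.4f} +/- {cv.std() / np.sqrt(n_paths):.4f}")
```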
|
349 |
Optimal Reinsurance Designs: from an Insurer's Perspective. Weng, Chengguo, 09 1900 (has links)
The research on optimal reinsurance design dates back to the 1960s. For nearly half a century, the quest for optimal reinsurance designs has remained a fascinating subject, drawing significant interest from both academics and practitioners. Its fascination lies in its potential as an effective risk management tool for insurers. There are many ways of formulating the optimal design of reinsurance, depending on the chosen objective and constraints. In this thesis, we address the problem of optimal reinsurance design from an insurer's perspective. For an insurer, an appropriate use of reinsurance helps to reduce adverse risk exposure and improve the overall viability of the underlying business. On the other hand, reinsurance incurs an additional cost to the insurer in the form of the reinsurance premium. This implies a classical risk-reward tradeoff faced by the insurer.
The primary objective of the thesis is to develop theoretically sound yet practical solutions in the quest for optimal reinsurance designs. To achieve this objective, the thesis is divided into two parts. In the first part, a number of reinsurance models are developed and their optimal reinsurance treaties are derived explicitly. This part focuses on risk-measure-minimization reinsurance models and discusses the optimal reinsurance treaties by exploiting two of the most common risk measures, the Value-at-Risk (VaR) and the Conditional Tail Expectation (CTE). Some additional important economic factors, such as the reinsurance premium budget and the insurer's profitability, are also considered. The second part proposes an innovative method of formulating reinsurance models, which we refer to as the empirical approach since it exploits the insurer's empirical loss data explicitly. The empirical approach has the advantage of being practical and intuitively appealing. It is motivated by the difficulty that reinsurance models are often infinite-dimensional optimization problems, so explicit solutions are achievable only in some special cases. The empirical approach effectively reformulates the optimal reinsurance problem as a finite-dimensional optimization problem. Furthermore, we demonstrate that second-order cone programming can be used to obtain the optimal solutions for a wide range of reinsurance models formulated by the empirical approach.
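The empirical approach admits a very small illustration: choose the retention of a stop-loss treaty to minimize the VaR of retained loss plus reinsurance premium, computed directly on a loss sample. The loss distribution, the expected-value premium principle and the loading theta below are assumptions of the sketch, not the thesis's specification.

```python
import numpy as np

rng = np.random.default_rng(7)
X = rng.lognormal(mean=1.0, sigma=1.2, size=10_000)   # empirical loss sample
alpha, theta = 0.95, 0.2                              # VaR level, premium loading

def total_risk_var(d):
    """VaR of retained loss plus stop-loss premium at retention d."""
    premium = (1 + theta) * np.maximum(X - d, 0).mean()
    retained = np.minimum(X, d)
    return np.quantile(retained + premium, alpha)

grid = np.quantile(X, np.linspace(0.5, 0.999, 200))   # candidate retentions
best = min(grid, key=total_risk_var)
print(f"optimal retention d:      {best:.2f}")
print(f"VaR with stop-loss:       {total_risk_var(best):.2f}")
print(f"VaR without reinsurance:  {np.quantile(X, alpha):.2f}")
```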
|
350 |
Value at Risk: A Standard Tool in Measuring Risk: A Quantitative Study on Stock Portfolio. Ofe, Hosea; Okah, Peter. January 2011 (has links)
The role of risk management has gained momentum in recent years, most notably after the recent financial crisis. This thesis uses a quantitative approach to evaluate the theory of Value at Risk, which is considered a benchmark measure of financial risk. The thesis makes use of both parametric and non-parametric approaches to evaluate the effectiveness of VaR as a standard tool for measuring the risk of a stock portfolio. The study applies the normal distribution, the Student's t-distribution, historical simulation and the exponentially weighted moving average at the 95% and 99% confidence levels to the stock returns of Sony Ericsson, the three-month Swedish Treasury bill (STB3M) and Nordea Bank. The evaluation of the VaR models is based on the Kupiec (1995) test. From a general perspective, the results of the study indicate that VaR as a proxy of risk measurement has some imprecision in its estimates, and that this imprecision is not the same across all the approaches. Models which assume normality of the return distribution perform worse at both confidence levels than models which assume fatter tails or leptokurtic characteristics. Another interesting finding is that during periods of high volatility, such as the financial crisis of 2008, the imprecision of VaR estimates increases. Among the parametric approaches, the t-distribution VaR estimates were accurate at the 95% confidence level, while the normal distribution produced inaccurate estimates at that level; neither approach provided accurate estimates at the 99% confidence level. Among the non-parametric approaches, the exponentially weighted moving average outperformed historical simulation at the 95% confidence level, while at the 99% level both approaches performed equally. The results of this study thus question the reliability of VaR as a standard tool for measuring the risk of stock portfolios. They also suggest that more research should be done to improve the accuracy of VaR approaches, given that the role of risk management in today's business environment is greater than ever. The study suggests that VaR should be complemented with other risk measures, such as extreme value theory and stress testing, and that more than one backtesting technique should be used to test the accuracy of VaR.
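For reference, the RiskMetrics-style exponentially weighted moving average recursion evaluated in the study, sigma_t^2 = lambda * sigma_{t-1}^2 + (1 - lambda) * r_{t-1}^2, looks as follows in code; the returns are simulated stand-ins and lambda = 0.94 is the conventional choice for daily data. With fat-tailed returns and a normal quantile, violations tend to exceed their expected count, which is exactly the kind of imprecision the thesis documents.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
r = rng.standard_t(df=5, size=1000) * 0.01    # stand-in daily returns, fat tails
lam, alpha = 0.94, 0.99

# EWMA variance recursion, seeded with an initial sample variance.
sigma2 = np.empty_like(r)
sigma2[0] = r[:30].var()
for t in range(1, len(r)):
    sigma2[t] = lam * sigma2[t - 1] + (1 - lam) * r[t - 1] ** 2

var_ewma = -stats.norm.ppf(1 - alpha) * np.sqrt(sigma2)   # normal-quantile VaR
violations = int((r < -var_ewma).sum())
print(f"{violations} violations in {len(r)} days "
      f"(expected about {(1 - alpha) * len(r):.0f})")
```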
|