11

How Low Can You Go? : Quantitative Risk Measures in Commodity Markets

Forsgren, Johan January 2016 (has links)
The volatility model approach to forecasting Value at Risk is complemented with modelling of Expected Shortfall using an extreme value approach. Using three models from the GARCH family (GARCH, EGARCH and GJR-GARCH) and assuming two conditional distributions, the Gaussian and Student's t, to make predictions of VaR, the forecasts are used as a threshold for assigning losses to the distribution tail. The Expected Shortfalls are estimated assuming that the violations of VaR follow the Generalized Pareto distribution, and the estimates are evaluated. The results indicate that the most efficient model for making predictions of VaR is the asymmetric GJR-GARCH, and that assuming the t distribution generates conservative forecasts. In conclusion, there is evidence that the commodities are characterized by asymmetry and conditional normality. Since no comparison is made, the EVT approach cannot be deemed either superior or inferior to standard approaches to Expected Shortfall modelling, although the data intensity of the method suggests that a standard approach may be preferable.
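To make the pipeline in this abstract concrete, the sketch below shows one way to combine a conditional volatility forecast with a peaks-over-threshold tail fit: the VaR forecast serves as the threshold, and a Generalized Pareto distribution fitted to the exceedances yields the Expected Shortfall. This is a minimal illustration, not the thesis's code; it assumes conditional normality for the VaR step, a pre-computed one-step volatility forecast, and scipy's genpareto for the tail fit.

```python
import numpy as np
from scipy.stats import norm, genpareto

def var_es_evt(losses, sigma_forecast, mu_forecast=0.0, alpha=0.99):
    """One-step VaR from a volatility forecast, ES from a GPD fitted to the
    VaR exceedances (peaks-over-threshold with the VaR as threshold)."""
    # Parametric VaR under conditional normality
    # (with Student-t innovations the quantile would be a rescaled t quantile)
    var = mu_forecast + sigma_forecast * norm.ppf(alpha)
    # Losses that violate the VaR forecast form the tail sample
    excesses = losses[losses > var] - var
    if len(excesses) < 10:
        raise ValueError("too few VaR violations to fit a GPD reliably")
    xi, _, beta = genpareto.fit(excesses, floc=0)   # shape, loc (fixed at 0), scale
    if xi >= 1:
        raise ValueError("GPD shape >= 1: mean excess is infinite")
    # ES = threshold + mean excess over the threshold
    es = var + beta / (1.0 - xi)
    return var, es

# toy usage with simulated losses and an assumed one-step volatility forecast
rng = np.random.default_rng(0)
losses = rng.standard_t(df=5, size=2000) * 0.01
print(var_es_evt(losses, sigma_forecast=0.012))
```

Replacing the normal quantile by a rescaled Student-t quantile raises the VaR threshold, which is consistent with the more conservative t-based forecasts reported above.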
12

Quantile-based methods for prediction, risk measurement and inference

Ally, Abdallah K. January 2010 (has links)
The focus of this thesis is on the employment of theoretical and practical quantile methods in addressing prediction, risk measurement and inference problems. From a prediction perspective, the problem of creating model-free prediction intervals for a future unobserved value of a random variable drawn from a sample distribution is considered. With the objective of reducing prediction coverage error, two common distribution transformation methods based on the normal and exponential distributions are presented and theoretically demonstrated to attain exact and error-free prediction intervals respectively. The second problem studied is the estimation of expected shortfall via kernel smoothing. The goal here is to introduce methods that reduce the estimation bias of expected shortfall. To this end, several one-step bias-corrected expected shortfall estimators are presented, investigated via simulation studies and compared with one-step estimators. The third problem is the construction of simultaneous confidence bands for quantile regression functions when the predictor variables are constrained within a region. In this context, a method is introduced that makes use of asymmetric Laplace errors in conjunction with a simulation-based algorithm to create confidence bands for quantile and inter-quantile regression functions. Furthermore, the simulation approach is extended to an ordinary least squares framework to build simultaneous bands for quantile functions of the classical regression model both when the model errors are normally distributed and when this assumption is not fulfilled. Finally, attention is directed towards the construction of prediction intervals for realised volatility, exploiting an alternative volatility estimator based on the difference of two extreme quantiles. The proposed approach makes use of an AR-GARCH procedure to model time series of intraday quantiles and forecast the predictive distribution of intraday returns. Moreover, two simple adaptations of an existing model are also presented.
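As an illustration of the kernel-smoothing idea behind the second problem, the sketch below estimates a quantile (VaR) by inverting a Gaussian-kernel CDF and an expected shortfall by replacing the hard tail indicator with a smooth kernel weight. It is a generic estimator with a rule-of-thumb bandwidth, not the thesis's bias-corrected one-step estimators; the function names and bandwidth choice are assumptions made here.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

def kernel_cdf(x, sample, h):
    """Smoothed empirical CDF: average of Gaussian kernel CDFs centred at the data."""
    return norm.cdf((x - sample) / h).mean()

def kernel_var_es(losses, alpha=0.975, h=None):
    """Kernel-smoothed VaR (quantile) and ES of a loss sample.
    A plain estimator; one-step bias corrections are not reproduced."""
    losses = np.asarray(losses, dtype=float)
    n = len(losses)
    if h is None:                      # Silverman's rule-of-thumb bandwidth
        h = 1.06 * losses.std(ddof=1) * n ** (-1 / 5)
    lo, hi = losses.min() - 5 * h, losses.max() + 5 * h
    var = brentq(lambda x: kernel_cdf(x, losses, h) - alpha, lo, hi)
    # Smooth the tail indicator 1{L > VaR} with the same kernel
    w = 1.0 - norm.cdf((var - losses) / h)
    es = np.sum(w * losses) / np.sum(w)
    return var, es

rng = np.random.default_rng(1)
print(kernel_var_es(rng.standard_t(6, 5000) * 0.01))
```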
13

Measuring Extremes: Empirical Application on European Markets

Öztürk, Durmuş January 2015 (has links)
This study employs Extreme Value Theory and several univariate methods to compare their Value-at-Risk and Expected Shortfall predictive performance. We conduct several out-of-sample backtesting procedures, such as unconditional coverage, independence and conditional coverage tests. The dataset includes five different stock markets: PX50 (Prague, Czech Republic), BIST100 (Istanbul, Turkey), ATHEX (Athens, Greece), PSI20 (Lisbon, Portugal) and IBEX35 (Madrid, Spain). These markets have different financial histories and their data span over twenty years. We analyze the global financial crisis period separately to inspect the performance of these methods during the high-volatility period. Our results support the most common findings that Extreme Value Theory is one of the most appropriate risk measurement tools. In addition, we find that the GARCH family of methods, after accounting for asymmetry and fat-tail phenomena, can be equally useful and sometimes even better than the Extreme Value Theory based method in terms of risk estimation. Keywords: Extreme Value Theory, Value-at-Risk, Expected Shortfall, Out-of-Sample Backtesting
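The unconditional coverage test referred to above is usually Kupiec's proportion-of-failures test; a minimal sketch is given below, assuming a boolean series of VaR violations as input. The independence and conditional coverage (Christoffersen) tests used in the study are not reproduced here.

```python
import numpy as np
from scipy.stats import chi2

def kupiec_pof(violations, alpha=0.99):
    """Kupiec's unconditional coverage (proportion-of-failures) test.
    `violations` is a boolean array: True where the realised loss exceeded VaR.
    Returns the LR statistic and its chi-squared(1) p-value."""
    v = np.asarray(violations, dtype=bool)
    n, x = v.size, int(v.sum())
    p = 1.0 - alpha                       # expected violation rate under the null
    pi_hat = x / n                        # observed violation rate
    if pi_hat in (0.0, 1.0):              # degenerate cases: alternative likelihood is 1
        lr = -2.0 * ((n - x) * np.log(1 - p) + x * np.log(p))
    else:
        log_l0 = (n - x) * np.log(1 - p) + x * np.log(p)
        log_l1 = (n - x) * np.log(1 - pi_hat) + x * np.log(pi_hat)
        lr = -2.0 * (log_l0 - log_l1)
    return lr, chi2.sf(lr, df=1)

# toy usage: 500 backtest days with 9 violations at the 99% level
hits = np.zeros(500, dtype=bool); hits[:9] = True
print(kupiec_pof(hits, alpha=0.99))
```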
14

Risk Management Project

Yan, Lu 02 May 2012 (has links)
In order to evaluate and manage portfolio risk, we separated this project into three sections. In the first section we constructed a portfolio of 15 different stocks and six options with different strategies. The portfolio was implemented in Interactive Brokers and rebalanced weekly over five holding periods. In the second section we modeled the loss distribution of the whole portfolio with normal and Student-t distributions, computed the Value-at-Risk and expected shortfall of the portfolio loss in detail for each holding week, and then evaluated the differences between the normal and Student-t distributions. In the third section we applied the ARMA(1,1)-GARCH(1,1) model to simulate our assets and compared the polynomial tails obtained with Gaussian and t-distribution innovations.
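A minimal sketch of the second section's calculation, assuming the portfolio losses are available as a weekly loss series: VaR and expected shortfall in closed form under a fitted normal and a fitted location-scale Student-t distribution. The data below are simulated stand-ins, not the project's portfolio.

```python
import numpy as np
from scipy.stats import norm, t

def var_es_normal(losses, alpha=0.99):
    """Parametric VaR/ES of a loss sample under a fitted normal distribution."""
    mu, sigma = losses.mean(), losses.std(ddof=1)
    z = norm.ppf(alpha)
    return mu + sigma * z, mu + sigma * norm.pdf(z) / (1 - alpha)

def var_es_student_t(losses, alpha=0.99):
    """Parametric VaR/ES under a fitted location-scale Student-t distribution."""
    nu, loc, scale = t.fit(losses)
    q = t.ppf(alpha, nu)
    var = loc + scale * q
    # standard expected-shortfall formula for the Student-t (e.g. McNeil et al.)
    es = loc + scale * t.pdf(q, nu) / (1 - alpha) * (nu + q ** 2) / (nu - 1)
    return var, es

rng = np.random.default_rng(2)
weekly_losses = rng.standard_t(4, 250) * 0.02      # stand-in for portfolio losses
print(var_es_normal(weekly_losses), var_es_student_t(weekly_losses))
```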
15

Market and Credit Risk Models and Management Report

Qu, Jing 02 May 2012 (has links)
This report is for MA575: Market and Credit Risk Models and Management, given by Professor Marcel Blais. In this project, three different methods for estimating Value at Risk (VaR) and Expected Shortfall (ES) are used, examined, and compared to gain insight into the strengths and weaknesses of each method. In the first part of the project, a portfolio of underlying assets and vanilla options was formed in an Interactive Brokers paper trading account. Value at Risk was calculated and updated weekly to measure the risk of the entire portfolio. In the second part, Value at Risk was calculated using a semi-parametric model. Then the weekly losses of the stock portfolio and the daily losses of the entire portfolio were both fitted to an ARMA(1,1)-GARCH(1,1) model, and the estimated parameters were used to find the conditional Value at Risk (CVaR) and the conditional Expected Shortfall (CES).
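The conditional VaR/ES step can be sketched as follows. Because the arch package's built-in mean models do not include an MA term, the sketch uses a two-step fit (ARMA(1,1) mean via statsmodels, then GARCH(1,1) on its residuals via arch) with normal innovations; the project's own estimation may have been joint and may have used a different innovation distribution.

```python
import numpy as np
from scipy.stats import norm
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

def conditional_var_es(losses, alpha=0.99):
    """Two-step ARMA(1,1)-GARCH(1,1) conditional VaR/ES with normal innovations:
    fit the conditional mean with ARIMA, then a GARCH(1,1) on its residuals."""
    arma = ARIMA(losses, order=(1, 0, 1)).fit()
    garch = arch_model(arma.resid, mean="Zero", vol="GARCH", p=1, q=1).fit(disp="off")
    mu_f = float(np.asarray(arma.forecast(steps=1))[0])                  # 1-step mean
    sigma_f = float(np.sqrt(garch.forecast(horizon=1).variance.values[-1, 0]))
    z = norm.ppf(alpha)
    return mu_f + sigma_f * z, mu_f + sigma_f * norm.pdf(z) / (1 - alpha)

# toy usage on simulated daily losses measured in percent
rng = np.random.default_rng(3)
daily_losses = rng.standard_t(6, 750)
print(conditional_var_es(daily_losses))
```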
16

Risk Management Project

Shen, Chen 02 May 2012 (has links)
In order to evaluate and manage portfolio risk, we separated this project into three sections. In the first section we constructed a portfolio of 15 different stocks and six options with different strategies. The portfolio was implemented in Interactive Brokers and rebalanced weekly over five holding periods. In the second section we modeled the loss distribution of the whole portfolio with normal and Student-t distributions, computed the Value-at-Risk and expected shortfall of the portfolio loss in detail for each holding week, and then evaluated the differences between the normal and Student-t distributions. In the third section we applied the ARMA(1,1)-GARCH(1,1) model to simulate our assets and compared the polynomial tails obtained with Gaussian and t-distribution innovations.
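For the third section, a minimal simulation of an ARMA(1,1)-GARCH(1,1) process under Gaussian versus Student-t shocks is sketched below; comparing an extreme loss quantile across the two runs illustrates the fatter tails produced by t innovations. All parameter values are illustrative, not estimates from the project's assets.

```python
import numpy as np

def simulate_arma_garch(n, phi=0.1, theta=0.05, omega=0.05, a=0.08, b=0.9,
                        innovations="normal", df=5, seed=0):
    """Simulate an ARMA(1,1)-GARCH(1,1) path with Gaussian or Student-t shocks.
    Parameter values here are illustrative, not fitted to any portfolio."""
    rng = np.random.default_rng(seed)
    if innovations == "normal":
        z = rng.standard_normal(n)
    else:                                   # standardised Student-t shocks
        z = rng.standard_t(df, n) * np.sqrt((df - 2) / df)
    r = np.zeros(n); eps = np.zeros(n); sigma2 = np.full(n, omega / (1 - a - b))
    for t in range(1, n):
        sigma2[t] = omega + a * eps[t - 1] ** 2 + b * sigma2[t - 1]
        eps[t] = np.sqrt(sigma2[t]) * z[t]
        r[t] = phi * r[t - 1] + eps[t] + theta * eps[t - 1]
    return r

# compare the empirical 99% loss quantile under the two innovation assumptions
for dist in ("normal", "t"):
    path = simulate_arma_garch(100_000, innovations=dist)
    print(dist, np.quantile(-path, 0.99))
```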
17

Empirical Analysis of Value at Risk and Expected Shortfall in Portfolio Selection Problem

Ding, Liyuan 1988- 14 March 2013 (has links)
The safety-first criterion and the mean-shortfall criterion both address asset allocation under downside risk. In this paper, I compare the safety-first portfolio selection problem and the mean-shortfall portfolio optimization problem, considering risk-averse investors in practice. Safety-first portfolio selection uses Value at Risk (VaR) as a risk measure, and mean-shortfall portfolio optimization uses expected shortfall as a risk measure. VaR is estimated by implementing extreme value theory using a semi-parametric method. Expected shortfall is estimated by two nonparametric methods: a natural estimation and a kernel-weighted estimation. I use daily data on three international stock indices, ranging from January 1986 to February 2012, to provide empirical evidence on asset allocation and to illustrate the performance of the safety-first and mean-shortfall criteria with their respective risk measures. The historical data are also divided in two ways, one truncated at 1998, to explore performance during the tech boom and the financial crisis. The mean-shortfall portfolio optimization with the kernel-weighted method performed better than the safety-first criterion, while the safety-first criterion was better than the mean-shortfall portfolio optimization with the natural estimation method.
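For the mean-shortfall side of the comparison, one standard way to minimise expected shortfall over portfolio weights is the Rockafellar-Uryasev linear programme sketched below. This is a generic formulation with simulated scenario returns, not the estimators or data used in the thesis; the optional minimum-return constraint and the long-only bounds are assumptions made here.

```python
import numpy as np
from scipy.optimize import linprog

def min_es_portfolio(returns, alpha=0.95, target_return=None):
    """Mean-shortfall portfolio: minimise the portfolio ES (CVaR) over long-only
    weights via the Rockafellar-Uryasev linear programme. `returns` is an
    (n_obs, n_assets) array of scenario returns."""
    n, m = returns.shape
    # decision vector: [w_1..w_m, zeta, u_1..u_n]
    c = np.concatenate([np.zeros(m), [1.0], np.full(n, 1.0 / ((1 - alpha) * n))])
    # u_i >= -r_i.w - zeta  <=>  -r_i.w - zeta - u_i <= 0
    A_ub = np.hstack([-returns, -np.ones((n, 1)), -np.eye(n)])
    b_ub = np.zeros(n)
    if target_return is not None:          # optional minimum expected return
        row = np.concatenate([-returns.mean(axis=0), [0.0], np.zeros(n)])
        A_ub = np.vstack([A_ub, row])
        b_ub = np.append(b_ub, -target_return)
    A_eq = np.concatenate([np.ones(m), [0.0], np.zeros(n)]).reshape(1, -1)
    b_eq = [1.0]                           # weights sum to one
    bounds = [(0, 1)] * m + [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub, b_ub, A_eq, b_eq, bounds=bounds, method="highs")
    return res.x[:m], res.fun              # weights and the minimised ES

rng = np.random.default_rng(5)
scenarios = rng.multivariate_normal([0.0004, 0.0003, 0.0005],
                                    np.diag([0.01, 0.012, 0.015]) ** 2, size=1000)
w, es = min_es_portfolio(scenarios, alpha=0.95)
print(w.round(3), es)
```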
18

Applications of constrained non-parametric smoothing methods in computing financial risk

Wong, Chung To (Charles) January 2008 (has links)
The aim of this thesis is to improve risk measurement estimation by incorporating extra information, in the form of constraints, into completely non-parametric smoothing techniques. A similar approach has been applied in empirical likelihood analysis. The method of constraints incorporates bootstrap resampling techniques, in particular the biased bootstrap. This thesis brings together formal estimation methods, the use of empirical information, and computationally intensive methods. The constraint approach is applied to non-parametric smoothing estimators to improve the estimation or modelling of risk measures: we consider estimation of Value-at-Risk, of intraday volatility for market risk, and of recovery rate densities for credit risk management.

Firstly, we study Value-at-Risk (VaR) and Expected Shortfall (ES) estimation. VaR and ES estimation are strongly related to quantile estimation, so tail estimation is of interest in its own right. We employ constrained and unconstrained kernel density estimators to estimate tail distributions, and we estimate quantiles from the fitted tail distribution. The constrained kernel density estimator is an application of the biased bootstrap technique proposed by Hall & Presnell (1998); the quantile estimator used with it is the Harrell-Davis (H-D) estimator. We calibrate the performance of the constrained and unconstrained kernel density estimators by estimating tail densities based on samples from Normal and Student-t distributions. We find a significant improvement in fitting heavy-tailed distributions using the constrained kernel estimator in conjunction with the H-D quantile estimator. We also present an empirical study demonstrating VaR and ES calculation.

A credit event in financial markets is defined as the event that a party fails to pay an obligation to another, and credit risk is the measure of uncertainty of such events. The recovery rate, in the credit risk context, is the rate of recuperation when a credit event occurs. It is defined as Recovery rate = 1 - LGD, where LGD is the loss given default. From this point of view, the recovery rate is a key element both for credit risk management and for pricing credit derivatives; only credit risk management is considered in this thesis. To avoid the strong assumptions about the form of the recovery rate density made in current approaches, we propose a non-parametric technique incorporating a mode constraint, with an adjusted Beta kernel employed to estimate the recovery rate density function. An encouraging result for the constrained Beta kernel estimator is illustrated by a large number of simulations, as genuine data are very confidential and difficult to obtain.

Modelling high-frequency data is a popular topic in contemporary finance. The intraday volatility patterns of standard indices and market-traded assets are well documented in the literature and reflect the different characteristics of different stock markets, such as the double U-shaped volatility pattern reported for the Hang Seng Index (HSI). We aim to capture this intraday volatility pattern using a non-parametric regression model. In particular, we propose a constrained function approximation technique to formally test the structure of the pattern and to approximate the location of the anti-mode of the U-shape. We illustrate this methodology on the HSI as an empirical example.
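The Harrell-Davis quantile estimator named above has a compact form: a Beta-weighted average of the order statistics. The sketch below implements that estimator only; the biased-bootstrap constrained kernel density estimator it is paired with in the thesis is not reproduced.

```python
import numpy as np
from scipy.stats import beta

def harrell_davis(sample, p):
    """Harrell-Davis quantile estimator: a Beta-weighted average of the order
    statistics, which smooths the usual single-order-statistic quantile."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = x.size
    a, b = (n + 1) * p, (n + 1) * (1 - p)
    edges = np.arange(n + 1) / n                 # 0, 1/n, ..., 1
    w = np.diff(beta.cdf(edges, a, b))           # weight of each order statistic
    return np.dot(w, x)

rng = np.random.default_rng(6)
tail_sample = rng.standard_t(4, 1000)
# 99% quantile: Harrell-Davis versus the plain empirical quantile
print(harrell_davis(tail_sample, 0.99), np.quantile(tail_sample, 0.99))
```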
19

Postupy homogenizace pojistného kmene (Methods of Homogenizing an Insurance Portfolio)

Hrouz, David January 2015 (has links)
This diploma thesis deals with transferring the risk of an insurance company to another subject. The basic requirement is to homogenize the selected insurance portfolio. The amount of capital required is determined by identifying and quantifying the risk. An adjusted Economic Value Added (EVA) indicator determines the optimal ratio between retained and transferred risk. Several factors can affect the amount of risk retained. The main objective is to determine the optimal retention itself and to select the appropriate type of reinsurance. The recommendation is based on the current development of claims expenses.
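A highly simplified sketch of the retention question: for an excess-of-loss retention level, compute an EVA-style figure as underwriting profit minus a cost-of-capital charge on retained risk, and compare retention levels. Every ingredient here (the claims distribution, the premium loading, and the 99.5% quantile capital proxy) is an assumption for illustration, not the thesis's model.

```python
import numpy as np

def expected_eva(retention, gross_claims, premium, cost_of_capital=0.06, loading=1.2):
    """Illustrative EVA for one excess-of-loss retention level. The capital proxy
    (99.5% quantile of retained claims above their mean) is a simplifying assumption."""
    retained = np.minimum(gross_claims, retention)
    ceded = gross_claims - retained
    ri_premium = loading * ceded.mean()                          # loaded reinsurance premium
    capital = np.quantile(retained, 0.995) - retained.mean()     # capital proxy
    profit = premium - retained.mean() - ri_premium
    return profit - cost_of_capital * capital

rng = np.random.default_rng(7)
claims = rng.lognormal(mean=1.0, sigma=1.2, size=100_000)        # simulated annual claims
premium = 1.3 * claims.mean()
for m in (5.0, 10.0, 20.0, 40.0):
    print(m, round(expected_eva(m, claims, premium), 3))
```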
20

Řízení rizik v komerční pojišťovně (Risk Management in a Commercial Insurance Company)

Strýček, Tomáš January 2016 (has links)
The diploma thesis deals with current issues of risk management in a selected insurance company. The thesis is divided into two parts: a literature review and an empirical part. The first section introduces the individual risks that affect the functioning of commercial insurers and the basic methods for quantifying them. The new system of European insurance regulation, Solvency II, is also described. The empirical part quantifies the risks of the selected insurance company according to both the standard formula and an internal model. The thesis concludes with an evaluation of risk management in the selected insurance company and of the company's preparedness for the Solvency II regulatory regime. Based on this quantification, recommendations are put forward to improve the risk management of the selected insurer.
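The standard-formula quantification mentioned above aggregates module capital charges with a square-root correlation formula, SCR = sqrt(s' Corr s). The sketch below shows that aggregation with hypothetical module charges and an illustrative correlation matrix; the values are placeholders, not the regulatory parameters or the company's figures.

```python
import numpy as np

def aggregate_scr(module_scr, corr):
    """Standard-formula style aggregation of module capital charges:
    SCR = sqrt(s' * Corr * s). The correlation matrix passed in is illustrative."""
    s = np.asarray(module_scr, dtype=float)
    return float(np.sqrt(s @ np.asarray(corr) @ s))

# hypothetical module charges (market, counterparty default, life, non-life), in millions
scr = [120.0, 30.0, 80.0, 60.0]
corr = [[1.00, 0.25, 0.25, 0.25],
        [0.25, 1.00, 0.25, 0.50],
        [0.25, 0.25, 1.00, 0.00],
        [0.25, 0.50, 0.00, 1.00]]
print(aggregate_scr(scr, corr))     # diversified SCR < sum of the module charges
```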
