191 |
The impact of the market risk capital regulations on bank activities. Eksi, Emrah. January 2006.
Banking has a unique role in the well-being of an economy. This role makes banks one of the most heavily regulated and supervised industries. In order to strengthen the soundness and stability of banking systems, regulators require banks to hold adequate capital. While credit risk was the only risk that was covered by the original Basle Accord, with the 1996 amendment, banks have also been required to assign capital for their market risk starting from 1998. In this research, the impact of the market risk capital regulations on bank capital levels and derivative activities is investigated. In addition, this study also evaluates the impact of using different approaches that are allowed to be used while calculating the required market risk capital, as well as the accuracy of VaR models. The implementation of the market risk capital regulations can influence banks either by increasing their capital or by decreasing their trading activities and in particular trading derivative activities. The literature review concerning capital regulations illustrates that in particular the impact of these regulations on bank capital levels and derivative activities is an issue that has not yet been explored. In order to fill this gap, the changes in capital and derivatives usage ratios are modelled by using a partial adjustment framework. The main results of this analysis suggest that the implementation of the market risk capital regulations has a significant and positive impact on the risk-based capital ratios of BHCs. However, the results do not indicate any impact of these regulations on derivative activities. The empirical findings also demonstrate that there is no significant relationship between capital and derivatives. The market risk capital regulations allow the use of either a standardised approach or the VaR methodologies to determine the required capital amounts to cover market risk. 
In order to evaluate these approaches, firstly the differences in bank VaR practices are investigated by employing a documentary analysis. The documentary analysis is conducted to demonstrate the differences in bank VaR practices by comparing the VaR models of 25 international banks. The survey results demonstrate that there is no industry consensus on the methodology for calculating VaR. This analysis also indicates that the assumptions in estimating VaR models vary considerably among financial institutions. Therefore, it is very difficult for financial market participants to make comparisons across institutions by considering single VaR values. Secondly, the required capital amounts are calculated for two hypothetical foreign exchange portfolios by using both the standardised approach and three different VaR methodologies, and then these capital amounts are compared. These simulations are conducted to understand to what extent the approaches permitted under the market risk capital regulations produce different outcomes on capital levels. The results indicate that the VaR estimates are dependent upon the VaR methodology. Thirdly, three backtesting methodologies are applied to the VaR models. The results indicate that a VaR model that provides accurate estimates for a specific portfolio could fail when the portfolio composition changes. The results of the simulations indicate that the market risk capital regulations do not provide a 'level playing field' for banks that are subject to these regulations. In addition, giving banks the option to determine the VaR methodology could create a moral hazard problem, as banks may choose an inaccurate model that yields lower required capital.
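The historical-simulation flavour of VaR and the exception-count style of backtesting discussed above can be sketched in a few lines. This is an illustrative toy only, assuming a plain empirical-quantile VaR and a simple exceedance count; the sample, confidence level, and function names are invented here and do not reproduce the thesis's three VaR methodologies or backtests.

```python
def historical_var(returns, confidence=0.99):
    """One-day historical-simulation VaR: the empirical loss quantile
    of a past-return sample, reported as a positive loss number."""
    losses = sorted(-r for r in returns)              # losses as positives
    idx = min(int(confidence * len(losses)), len(losses) - 1)
    return losses[idx]

def count_exceptions(returns, var_estimates):
    """Exception-count backtest: days on which the realised loss
    exceeded that day's VaR estimate."""
    return sum(1 for r, v in zip(returns, var_estimates) if -r > v)

# Toy sample: 100 mildly varying daily returns plus one extreme loss day.
sample = [0.001 * (i % 10 - 5) for i in range(100)]
sample[42] = -0.08                                    # a single extreme loss
var99 = historical_var(sample, 0.99)                  # picks up that loss
```

A regulator-style traffic-light test would compare the exception count against the binomial distribution implied by the confidence level; here only the raw count is computed.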
|
192 |
Optimal Deployment of Direction-finding Systems. Kim, Suhwan. 03 October 2013.
A direction-finding system with multiple direction finders (DFs) is a military intelligence system designed to detect the positions of transmitters of radio frequencies. This dissertation studies three decision problems associated with the direction-finding system.
The first part of this dissertation is to prescribe DF deployment to maximize the effectiveness with which transmitter positions are estimated in an area of interest (AOI). Three methods are presented to prescribe DF deployment. The first method uses Stansfield’s probability density function to compute objective function coefficients numerically. The second and the third employ surrogate measures of effectiveness as objective functions. The second method, like the first, involves complete enumerations; the third formulates the problem as an integer program and solves it with an efficient network-based label-setting algorithm. Our results show that the third method, which involved use of a surrogate measure as an objective function and an exact label-setting algorithm, is most effective.
The second part of this dissertation is to minimize the number of DFs needed to cover an AOI effectively, considering obstacles between DFs and transmitters. We formulate this problem as a partial set multicover problem in which at least a specified fraction of the likely transmitter positions must be covered, each by at least k direction finders. We present greedy heuristics with random selection rules for the partial set multicover problem, estimating statistical bounds on unknown optimal values. Our results show that the greedy heuristic with the column selection rule, which gives priority to selecting a column that advances more rows to k-coverage, performs best on the partial set multicover problems. Results also show that the heuristic with random row and column selection rules is the best of the heuristics with respect to statistical bounds.
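The column-selection rule described above can be sketched as a greedy loop. This is a minimal stand-in under assumptions of my own: rows are candidate transmitter positions, columns are candidate DF sites, and the parameter names (`k`, `p`) and the toy instance are illustrative, not the dissertation's formulation or data.

```python
def greedy_partial_multicover(rows, columns, k, p):
    """Greedy heuristic for the partial set multicover problem: choose
    columns until at least a p-fraction of rows is covered k times.

    rows:    row ids (candidate transmitter positions)
    columns: dict mapping a column id (candidate DF site) to the set
             of rows it covers
    """
    need = {r: k for r in rows}                   # remaining coverage per row
    target = p * len(need)                        # rows that must reach k-coverage
    chosen, remaining = [], dict(columns)

    def covered():
        return sum(1 for v in need.values() if v == 0)

    while covered() < target and remaining:
        # Column rule: pick the column advancing the most rows toward k.
        best = max(remaining,
                   key=lambda c: sum(1 for r in remaining[c] if need[r] > 0))
        chosen.append(best)
        for r in remaining.pop(best):
            if need[r] > 0:
                need[r] -= 1
    return chosen
```

The random row/column rules mentioned in the abstract would replace the deterministic `max` with randomized selection, enabling the statistical bounding across repeated runs.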
The third part of this dissertation deals with the problem of deploying direction finders with the goal of maximizing the effectiveness with which transmitter positions can be estimated in an AOI while hedging against enemy threats. We present four formulations, considering the probability that a direction finder deployed at a location will survive enemy threats over the planning horizon (i.e., not be rendered inoperative by an attack). We formulate the first two as network flow problems and present an efficient label-setting algorithm. The third and the fourth use the well-known Conditional Value at Risk (CVaR) risk measure to deal with the risk of being rendered inoperative by the enemy. Computational results show that risk-averse decision models tend to deploy some or all DFs in locations that are not close to the enemy to reduce risk. Results also show that a direction-finding system with 5 DFs provides improved survivability under enemy threats.
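For equally weighted scenarios, the CVaR measure used in the third and fourth formulations reduces to the average of the worst (1 − α) tail of losses. A minimal sketch under that discrete-scenario assumption (the α level and the test losses below are made up):

```python
def cvar(losses, alpha=0.95):
    """CVaR (expected shortfall) of equally weighted scenario losses:
    the mean of the worst (1 - alpha) fraction of outcomes."""
    ordered = sorted(losses, reverse=True)        # worst losses first
    tail_len = max(1, round((1 - alpha) * len(ordered)))
    return sum(ordered[:tail_len]) / tail_len
```

In an optimization model, CVaR is usually embedded via the Rockafellar-Uryasev linear-programming reformulation rather than computed post hoc as here.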
|
193 |
Public Debt Management in Turkey with Stochastic Optimization Approach. Celebi, Nuray. 01 December 2005.
The Undersecretariat of Treasury of the Prime Ministry, which maintains the financial administration of the Republic of Turkey, has several tasks to handle, one of which is to manage the government's debt in a way that minimizes cost with respect to risk. Choosing the instrument and maturity composition with the least cost and risk is the debt management problem to be dealt with, and it is affected by many stochastic factors.
The objective of this thesis is the optimization of the debt management problem of the Turkish government via a stochastic simulation framework under constraints on changes in portfolio positions. Value-at-Risk of the optimal portfolio is calculated to measure market risk. Macroeconomic variables in the optimization problem are modelled with econometric models such as autoregressive (AR), autoregressive integrated moving average (ARIMA) and generalized autoregressive conditionally heteroscedastic (GARCH) processes. The simulation horizon is 2005-2015. The debt portfolio is optimized in 2006 and 2015, where the representative scenarios for the optimization are found by clustering the previously generated 25,000 scenarios into 30 groups at each stage.
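The scenario-generation step can be illustrated with a mean-reverting AR(1) sketch for a single macroeconomic variable. All parameters (`phi`, `mu`, `sigma`) and the path counts are invented for illustration and are not the thesis's estimated models, which also include ARIMA and GARCH components.

```python
import random

def simulate_ar1_paths(n_paths, horizon, phi=0.8, mu=0.05, sigma=0.01, seed=7):
    """Interest-rate scenarios from a mean-reverting AR(1):
    r_t = mu + phi * (r_{t-1} - mu) + sigma * eps_t, starting at mu."""
    rng = random.Random(seed)
    paths = []
    for _ in range(n_paths):
        r, path = mu, []
        for _ in range(horizon):
            r = mu + phi * (r - mu) + sigma * rng.gauss(0, 1)
            path.append(r)
        paths.append(path)
    return paths

scenarios = simulate_ar1_paths(n_paths=1000, horizon=10)
```

A scenario-reduction step (the clustering into 30 groups mentioned above) would then select one representative path per cluster before the optimization is run.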
|
194 |
Portfolio selection and hedge funds: linearity, heteroscedasticity, autocorrelation and tail-risk. Bianchi, Robert John. January 2007.
Portfolio selection has a long tradition in financial economics and plays an integral role in investment management. Portfolio selection provides the framework to determine optimal portfolio choice from a universe of available investments. However, the asset weightings from portfolio selection are optimal only if the empirical characteristics of asset returns do not violate the portfolio selection model assumptions. This thesis explores the empirical characteristics of traditional assets and hedge fund returns and examines their effects on the assumptions of linearity-in-the-mean testing and portfolio selection. The encompassing theme of this thesis is the empirical interplay between traditional assets and hedge fund returns. Despite the paucity of hedge fund research, pension funds continue to increase their portfolio allocations to global hedge funds in an effort to pursue higher risk-adjusted returns. This thesis presents three empirical studies which provide positive insights into the relationships between traditional assets and hedge fund returns. The first two empirical studies examine an emerging body of literature which suggests that the relationship between traditional assets and hedge fund returns is non-linear. For mean-variance investors, non-linear asset returns are problematic as they do not satisfy the assumption of linearity required for the covariance matrix in portfolio selection. To examine the linearity assumption as it relates to a mean-variance investor, a hypothesis test approach is employed which investigates the linearity-in-the-mean of traditional assets and hedge funds. The findings from the first two empirical studies reveal that conventional linearity-in-the-mean tests incorrectly conclude that asset returns are nonlinear. We demonstrate that the empirical characteristics of heteroscedasticity and autocorrelation in asset returns are the primary sources of test mis-specification in these linearity-in-the-mean hypothesis tests. 
To address this problem, an innovative approach is proposed to control heteroscedasticity and autocorrelation in the underlying tests and it is shown that traditional assets and hedge funds are indeed linear-in-the-mean. The third and final study of this thesis explores traditional assets and hedge funds in a portfolio selection framework. Following the theme of the previous two studies, the effects of heteroscedasticity and autocorrelation are examined in the portfolio selection context. The characteristics of serial correlation in bond and hedge fund returns are shown to cause a downward bias in the second sample moment. This thesis proposes two methods to control for this effect and it is shown that autocorrelation induces an overallocation to bonds and hedge funds. Whilst heteroscedasticity cannot be directly examined in portfolio selection, empirical evidence suggests that heteroscedastic events (such as those that occurred in August 1998) translate into the empirical feature known as tail-risk. The effects of tail-risk are examined by comparing the portfolio decisions of mean-variance analysis (MVA) versus mean-conditional value at risk (M-CVaR) investors. The findings reveal that the volatility of returns in a MVA portfolio decreases when hedge funds are included in the investment opportunity set. However, the reduction in the volatility of portfolio returns comes at a cost of undesirable third and fourth moments. Furthermore, it is shown that investors with M-CVaR preferences exhibit a decreasing demand for hedge funds as their aversion for tail-risk increases. The results of the thesis highlight the sensitivities of linearity tests and portfolio selection to the empirical features of heteroscedasticity, autocorrelation and tail-risk. This thesis contributes to the literature by providing refinements to these frameworks which allow improved inferences to be made when hedge funds are examined in linearity and portfolio selection settings.
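One standard correction for the downward bias that serial correlation induces in the second sample moment is Geltner-style unsmoothing, sketched below. The thesis proposes its own two methods, which are not reproduced here; this is a generic stand-in assuming a known first-order autocorrelation `rho`.

```python
def unsmooth(observed, rho):
    """Geltner-style unsmoothing of a return series with first-order
    autocorrelation rho: true_t = (obs_t - rho * obs_{t-1}) / (1 - rho).
    The recovered series has a larger (less biased) second moment."""
    return [(cur - rho * prev) / (1 - rho)
            for prev, cur in zip(observed, observed[1:])]
```

Applied to bond or hedge fund index returns before portfolio selection, this inflates the estimated variance and so counteracts the overallocation effect described above.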
|
195 |
Applications of constrained non-parametric smoothing methods in computing financial risk. Wong, Chung To (Charles). January 2008.
The aim of this thesis is to improve risk measurement estimation by incorporating extra information in the form of constraint into completely non-parametric smoothing techniques. A similar approach has been applied in empirical likelihood analysis. The method of constraints incorporates bootstrap resampling techniques, in particular, biased bootstrap. This thesis brings together formal estimation methods, empirical information use, and computationally intensive methods. In this thesis, the constraint approach is applied to non-parametric smoothing estimators to improve the estimation or modelling of risk measures. We consider estimation of Value-at-Risk, of intraday volatility for market risk, and of recovery rate densities for credit risk management. Firstly, we study Value-at-Risk (VaR) and Expected Shortfall (ES) estimation. VaR and ES estimation are strongly related to quantile estimation. Hence, tail estimation is of interest in its own right. We employ constrained and unconstrained kernel density estimators to estimate tail distributions, and we estimate quantiles from the fitted tail distribution. The constrained kernel density estimator is an application of the biased bootstrap technique proposed by Hall & Presnell (1998). The estimator that we use for the constrained kernel estimator is the Harrell-Davis (H-D) quantile estimator. We calibrate the performance of the constrained and unconstrained kernel density estimators by estimating tail densities based on samples from Normal and Student-t distributions. We find a significant improvement in fitting heavy tail distributions using the constrained kernel estimator, when used in conjunction with the H-D quantile estimator. We also present an empirical study demonstrating VaR and ES calculation. A credit event in financial markets is defined as the event that a party fails to pay an obligation to another, and credit risk is defined as the measure of uncertainty of such events. 
Recovery rate, in the credit risk context, is the rate of recuperation when a credit event occurs. It is defined as recovery rate = 1 - LGD, where LGD is the rate of loss given default. From this point of view, the recovery rate is a key element both for credit risk management and for pricing credit derivatives. Only credit risk management is considered in this thesis. To avoid the strong assumptions about the form of the recovery rate density made in current approaches, we propose a non-parametric technique incorporating a mode constraint, with the adjusted Beta kernel employed to estimate the recovery density function. An encouraging result for the constrained Beta kernel estimator is illustrated by a large number of simulations, as genuine data are very confidential and difficult to obtain. Modelling high frequency data is a popular topic in contemporary finance. The intraday volatility patterns of standard indices and market-traded assets have been well documented in the literature. They show that volatility patterns reflect the different characteristics of different stock markets, such as the double U-shaped volatility pattern reported in the Hang Seng Index (HSI). We aim to capture this intraday volatility pattern using a non-parametric regression model. In particular, we propose a constrained function approximation technique to formally test the structure of the pattern and to approximate the location of the anti-mode of the U-shape. We illustrate this methodology on the HSI as an empirical example.
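The Harrell-Davis estimator paired with the constrained kernel estimator above is a Beta-weighted average of all order statistics. The sketch below is the standard H-D estimator only (the constrained kernel part is not reproduced); as an implementation convenience, the Beta interval masses are obtained by composite Simpson integration of the density rather than an incomplete-beta routine.

```python
import math

def beta_pdf(x, a, b):
    """Beta(a, b) density, zero outside (0, 1)."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    log_pdf = (math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
               + (a - 1) * math.log(x) + (b - 1) * math.log(1 - x))
    return math.exp(log_pdf)

def harrell_davis(sample, q, grid=200):
    """Harrell-Davis quantile estimate: a Beta((n+1)q, (n+1)(1-q))-weighted
    average of all order statistics, smoother than a single order statistic."""
    xs = sorted(sample)
    n = len(xs)
    a, b = (n + 1) * q, (n + 1) * (1 - q)
    estimate = 0.0
    for i in range(n):
        lo, hi = i / n, (i + 1) / n
        h = (hi - lo) / grid
        weight = 0.0                              # Beta mass on (i/n, (i+1)/n)
        for j in range(grid):
            x0 = lo + j * h
            x1 = x0 + h
            weight += h / 6 * (beta_pdf(x0, a, b)
                               + 4 * beta_pdf((x0 + x1) / 2, a, b)
                               + beta_pdf(x1, a, b))
        estimate += weight * xs[i]
    return estimate
```

Because every order statistic contributes, the estimate varies smoothly with `q`, which is what makes it attractive for tail-quantile (VaR) estimation on small samples.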
|
196 |
Incorporating discontinuities in value-at-risk via the Poisson jump diffusion model and variance gamma model. Lee, Brendan Chee-Seng. Banking & Finance, Australian School of Business, UNSW. January 2007.
We utilise several asset pricing models that allow for discontinuities in the returns and volatility time series in order to obtain estimates of Value-at-Risk (VaR). The first class of model that we use mixes a continuous diffusion process with discrete jumps at random points in time (Poisson Jump Diffusion Model). We also apply a purely discontinuous model that does not contain any continuous component at all in the underlying distribution (Variance Gamma Model). These models have been shown to have some success in capturing certain characteristics of return distributions, such as leptokurtosis and skewness. Calibrating these models to the returns of an index of Australian stocks (the All Ordinaries Index), we then use the resulting parameters to obtain daily estimates of VaR. In order to obtain the VaR estimates for the Poisson Jump Diffusion Model and the Variance Gamma Model, we introduce an innovation from option pricing techniques, which concentrates on the more tractable characteristic functions of the models. Having obtained a series of VaR estimates, we then apply a variety of criteria to assess how each model performs and also evaluate these models against traditional approaches to calculating VaR, such as that suggested by J.P. Morgan's RiskMetrics. Our results show that whilst the Poisson Jump Diffusion Model proved the most accurate at the 95% VaR level, neither the Poisson Jump Diffusion nor the Variance Gamma model was dominant in the other performance criteria examined. Overall, no model was clearly superior according to all the performance criteria analysed, and the extra computational time required to calibrate the Poisson Jump Diffusion and Variance Gamma models for VaR estimation does not appear to be sufficiently rewarded relative to the approach currently employed by RiskMetrics.
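A Poisson jump diffusion for daily returns can also be simulated directly, although the thesis extracts VaR via characteristic functions rather than Monte Carlo. The sketch below uses invented parameters (drift, diffusion and jump moments, jump intensity `lam`) purely for illustration; jump counts are drawn by summing exponential inter-arrival times.

```python
import random

def jump_diffusion_returns(n_days, mu=0.0003, sigma=0.01,
                           lam=0.1, jump_mu=-0.02, jump_sigma=0.03, seed=1):
    """Daily returns from a Poisson jump diffusion: a Gaussian diffusion
    term plus a Poisson(lam)-distributed number of Gaussian jumps per day."""
    rng = random.Random(seed)
    returns = []
    for _ in range(n_days):
        r = mu + sigma * rng.gauss(0, 1)
        t = rng.expovariate(lam)                  # exponential inter-jump times
        while t < 1.0:                            # count jumps within the day
            r += rng.gauss(jump_mu, jump_sigma)
            t += rng.expovariate(lam)
        returns.append(r)
    return returns

def empirical_var(returns, confidence=0.95):
    """VaR as the empirical loss quantile of the simulated returns."""
    losses = sorted(-r for r in returns)
    return losses[min(int(confidence * len(losses)), len(losses) - 1)]

rets = jump_diffusion_returns(10000)
var95 = empirical_var(rets)
```

With negative mean jumps, the simulated return distribution is left-skewed and leptokurtic, so the 95% VaR sits noticeably above the Gaussian-only figure.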
|
197 |
Portfolio optimization in the proprietary trading of credit institutions: an analysis of selected organizational forms considering value-at-risk-based limits. Reckers, Thomas. 2006.
Doctoral dissertation, FernUniversität Hagen, 2006.
|
198 |
Multi-period ALM models with CVaR minimization for Swiss pension funds. Künzi-Bay, Alexandra. 2007.
Doctoral dissertation, University of Zurich, 2007. ALM = asset and liability management; CVaR = Conditional Value-at-Risk.
|
199 |
Parametric models for determining Value-at-Risk. Read, Oliver. 1998.
Doctoral dissertation, University of Cologne, 1998. Bibliography pp. 185-197.
|
200 |
Value-oriented risk management in banks: an analysis of value relevance and implications for theory and practice. Strauss, Michael. 2008.
Doctoral dissertation, University of Marburg, 2008.
|