61 |
Extreme value theory : from a financial risk management perspective / Baldwin, Sheena (2004)
Thesis (MBA)--Stellenbosch University, 2004.

ENGLISH ABSTRACT: Risk managers and regulators are primarily concerned with ensuring that there is sufficient
capital to withstand the effects of adverse movements in market prices. The accurate prediction
of the maximum amount that a financial institution can expect to lose over a specified
period is essential to guard against catastrophic losses that can threaten the viability of an
individual firm or the stability of entire markets.
Value-at-risk (VaR) is a quantile-based measure of risk that is widely used for calculating the
capital adequacy requirements of banks and other financial institutions. However, the current
models for price risk tend to underestimate the risk of catastrophic losses because the entire
return distribution is used to calculate the value-at-risk. By contrast, Extreme Value Theory
uses only the largest observations to model the tails of a distribution, which should provide a
better fit for estimates of extreme quantiles and probabilities.
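To make the quantile idea concrete, here is a minimal sketch (not from the thesis) of the empirical estimate behind standard historical VaR, in Python with NumPy; the simulated return series, the 99% level and the name historical_var are illustrative assumptions:

```python
import numpy as np

def historical_var(returns, p=0.99):
    """Empirical VaR: the p-th quantile of the loss distribution.
    Every observation enters the estimate, so the deep tail is
    represented only by the handful of worst losses in the sample."""
    losses = -np.asarray(returns)      # losses as positive numbers
    return np.quantile(losses, p)

rng = np.random.default_rng(0)
returns = 0.01 * rng.standard_t(df=4, size=1000)   # heavy-tailed toy returns
print(f"99% one-day VaR: {historical_var(returns):.4f}")
```

Because only a handful of sample points lie beyond the 99th percentile, empirical estimates of still more extreme quantiles rest on almost no data, which is the gap the tail-only models of Extreme Value Theory aim to fill.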
The semi-parametric Hill (1975) estimator has often been used to fit the tails of financial
returns, but its performance is heavily dependent on the number kₙ of order statistics used in
the estimation process and the estimator can be very biased if this choice is suboptimal.
Since k" depends on unknown properties of the tail, it has to be estimated from the sample.
The first truly data-driven method for choosing an optimal number of order statistics
adaptively was introduced by Beirlant, Dierckx, Goegebeur and Matthys (1999) and modified
by Beirlant, Dierckx and Starica (2000) and Matthys and Beirlant (2000b). Their methods are
based on an exponential regression model developed independently by Beirlant et al. (1999)
and Feuerverger and Hall (1999) to reduce the bias found in the Hill estimator.
The reduced bias of these adaptive estimators and the associated estimator for extreme
quantiles developed by Matthys and Beirlant (2000b) makes these estimators attractive from a
risk management point of view, but more work needs to be done on characterising their finite
sample properties before they can be used in practice. In particular, it is crucially important to
establish the smallest sample size that will yield reliable estimates of extreme quantiles and
probabilities and to determine the widths and coverage probabilities of confidence intervals.
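For orientation, a sketch of the classical Weissman (1978)-type quantile extrapolation built on the Hill estimate; this is a stand-in to show the mechanics, not the bias-reduced Matthys and Beirlant (2000b) estimator discussed above:

```python
import numpy as np

def weissman_quantile(sample, k, p):
    """Extrapolated (1 - p)-quantile for a Pareto-type tail:
    x_hat = X_(n-k) * (k / (n * p)) ** gamma_hat, with gamma_hat
    the Hill estimate from the k largest observations."""
    x = np.sort(np.asarray(sample))
    n = len(x)
    gamma_hat = np.mean(np.log(x[-k:])) - np.log(x[-k - 1])
    return x[-k - 1] * (k / (n * p)) ** gamma_hat
```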
This study project reviews the probability and statistical theory of univariate Extreme Value
Theory from a financial risk management perspective. It is clear from a survey of the
literature that the most worthwhile direction to pursue in terms of practical research will be
intimately connected with developments in the fast-moving field of EVT with a future
emphasis not only on fully evaluating the existing models, but indeed on creating even less
biased and more precise models.
Keywords and phrases: Extreme value index, Pareto-type distributions, maximum likelihood
estimation, bias reduction, exponential regression model, market risk.

AFRIKAANSE OPSOMMING (Afrikaans summary): Risk managers and regulators are primarily
concerned with ensuring that sufficient capital is available to withstand the effects of
adverse movements in market prices. Accurate prediction of the maximum loss that a
financial institution can suffer over a specific period is essential as protection against
catastrophic losses that may threaten the survival of an individual firm or the stability
of the market as a whole.

Value-at-Risk (VaR) is a quantile-based measure of risk that is widely used to calculate
the capital adequacy of banks and other financial institutions. Current price risk models
tend to underestimate the risk of catastrophic losses because the entire return
distribution is used to calculate VaR. By contrast, Extreme Value Theory (EVT) uses only
the largest observations to model the tails of a distribution and is therefore better
suited to estimating extreme quantiles and probabilities.

The semi-parametric Hill (1975) estimator is often used to estimate the tails of
financial returns, but its performance depends heavily on the number kₙ of order
statistics used in the estimation process, and the estimate can be very biased if this
choice is suboptimal. Because kₙ depends on unknown properties of the tails, it has to be
estimated from the sample data. The first data-driven method for choosing the optimal
number of order statistics was developed by Beirlant, Dierckx, Goegebeur and Matthys
(1999) and adapted by Beirlant, Dierckx and Starica (2000), as well as by Matthys and
Beirlant (2000b). Their methods are based on an exponential regression model, developed
independently by Beirlant et al. (1999) and by Feuerverger and Hall (1999), with the aim
of reducing the bias of the Hill estimator.

The reduced bias of these adaptive estimators and of the related estimator for extreme
quantiles developed by Matthys and Beirlant (2000b) makes them attractive from a risk
management point of view, but more work is needed on characterising their finite-sample
properties before they can be used in practice. In particular, it is essential to
determine the smallest sample size that will ensure reliable estimates of extreme
quantiles and probabilities, and to construct confidence intervals.

This study provides an overview of the probability and statistical theory of
univariate EVT from a financial risk management perspective. The literature review makes
clear that the most useful direction for further practical research is tied to
developments in the fast-moving field of EVT, with a future focus not only on fully
evaluating the existing models but also on developing less biased and more accurate
models.
|
62 |
Severe Weather during the North American Monsoon and Its Response to Rapid Urbanization and a Changing Global Climate within the Context of High Resolution Regional Atmospheric Modeling / Luong, Thang Manh (January 2015)
The North American monsoon (NAM) is the principal driver of summer severe weather in the Southwest U.S. With sufficient atmospheric instability and moisture, monsoon convection initiates during daytime in the mountains and later may organize, principally into mesoscale convective systems (MCSs). Most monsoon-related severe weather occurs in association with organized convection, including microbursts, dust storms, flash flooding and lightning. The overarching theme of this dissertation research is to investigate the simulation of monsoon severe weather due to organized convection through the use of regional atmospheric modeling. A commonly used cumulus parameterization scheme has been modified to better account for dynamic pressure effects, resulting in an improved representation of a simulated MCS during the North American Monsoon Experiment and of the climatology of warm season precipitation in a long-term regional climate model simulation. The effect of urbanization on organized convection occurring in Phoenix is evaluated in model sensitivity experiments using an urban canopy model (UCM) and urban land cover compared to pre-settlement natural desert land cover. The presence of vegetation and irrigation makes Phoenix a "heat sink" in comparison to its surrounding desert, and as a result the modeled precipitation in response to urbanization decreases within the Phoenix urban area and increases on its periphery. Finally, how monsoon severe weather is changing in association with observed global climate change is analyzed within the context of a series of retrospectively simulated severe weather events during the period 1948-2010 in a numerical weather prediction paradigm. The individual severe weather events are identified by favorable thermodynamic conditions of instability and atmospheric moisture (precipitable water). Changes in precipitation extremes are evaluated with extreme value statistics. During the last several decades, organized convective precipitation has intensified, but these events occur less frequently. A more favorable thermodynamic environment for monsoon thunderstorms is the driver of these changes, which is consistent with the broader notion that anthropogenic climate change is presently intensifying weather extremes worldwide.
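As an illustration of the extreme value statistics mentioned above, a sketch of a block-maxima GEV fit and return-level calculation with SciPy; the synthetic annual-maximum series and every parameter value are invented for the example and are not results from the dissertation:

```python
import numpy as np
from scipy import stats

# Invented stand-in for annual maxima of event precipitation (mm), 1948-2010
rng = np.random.default_rng(2)
annual_max = stats.genextreme.rvs(c=-0.1, loc=30.0, scale=8.0,
                                  size=63, random_state=rng)

# Maximum likelihood GEV fit; note SciPy's shape c equals minus xi
shape, loc, scale = stats.genextreme.fit(annual_max)

# 50-year return level = the (1 - 1/50)-quantile of the fitted GEV
rl50 = stats.genextreme.ppf(1.0 - 1.0 / 50.0, shape, loc=loc, scale=scale)
print(f"xi = {-shape:.3f}, 50-year return level = {rl50:.1f} mm")
```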
|
63 |
Markov chains for genetics and extremes / Sisson, Scott Antony (January 2001)
No description available.
|
64 |
Fitting extreme value distributions to the Zambezi river flood water levels recorded at Katima Mulilo in Namibia / Kamwi, Innocent Silibelo (January 2005)
The aim of this research project was to estimate parameters for the distribution of annual maximum flood levels of the Zambezi River at Katima Mulilo. Parameters were estimated by the maximum likelihood method. The study explored the Zambezi's annual maximum flood heights at Katima Mulilo by fitting the Gumbel, Weibull and generalized extreme value distributions and evaluating their goodness of fit.
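A sketch of such a fitting exercise using SciPy's maximum likelihood routines; comparing fits by AIC is an illustrative choice of goodness-of-fit summary and not necessarily the criterion used in the thesis:

```python
import numpy as np
from scipy import stats

def fit_and_compare(annual_maxima):
    """Fit Gumbel, Weibull and GEV by maximum likelihood and
    compare the fits with AIC (lower is better)."""
    candidates = {
        "Gumbel":  stats.gumbel_r,
        "Weibull": stats.weibull_min,
        "GEV":     stats.genextreme,
    }
    for name, dist in candidates.items():
        params = dist.fit(annual_maxima)                 # MLE
        loglik = np.sum(dist.logpdf(annual_maxima, *params))
        aic = 2 * len(params) - 2 * loglik
        print(f"{name:8s} AIC = {aic:7.1f}")
```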
|
65 |
How Low Can You Go? : Quantitative Risk Measures in Commodity Markets / Forsgren, Johan (January 2016)
The volatility model approach to forecasting Value at Risk is complemented with modelling of Expected Shortfall using an extreme value approach. Using three models from the GARCH family (GARCH, EGARCH and GJR-GARCH) and assuming two conditional distributions, the Gaussian and Student's t, to make predictions of VaR, the forecasts are used as a threshold for assigning losses to the distribution tail. The Expected Shortfalls are estimated assuming that the violations of VaR follow the Generalized Pareto distribution, and the estimates are evaluated. The results indicate that the most efficient model for making predictions of VaR is the asymmetric GJR-GARCH, and that assuming the t distribution generates conservative forecasts. In conclusion there is evidence that the commodities are characterized by asymmetry and conditional normality. Since no comparison is made, the EVT approach cannot be deemed either superior or inferior to standard approaches to Expected Shortfall modeling, although the data intensity of the method suggests that a standard approach may be preferable.
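A minimal sketch of the tail step described here: fitting a Generalized Pareto distribution to the losses beyond a VaR forecast and reading off the Expected Shortfall. The GARCH filtering stage is omitted, the function name is invented, and the closed-form ES expression assumes a fitted shape below one:

```python
import numpy as np
from scipy import stats

def evt_expected_shortfall(losses, var_level):
    """Expected Shortfall at the VaR level, from a Generalized Pareto
    distribution fitted to the exceedances of the VaR forecast."""
    losses = np.asarray(losses)
    excesses = losses[losses > var_level] - var_level
    xi, _, beta = stats.genpareto.fit(excesses, floc=0)  # location fixed at 0
    if xi >= 1.0:
        raise ValueError("xi >= 1: the GPD has infinite mean, ES undefined")
    # Mean excess over the threshold u is beta / (1 - xi), so
    # ES = u + beta / (1 - xi)
    return var_level + beta / (1.0 - xi)
```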
|
66 |
Order-statistics-based inferences for censored lifetime data and financial risk analysis / Sheng, Zhuo (January 2013)
This thesis focuses on applying order-statistics-based inferences to lifetime analysis and financial risk measurement. The first problem arises from fitting the Weibull distribution to progressively censored and accelerated life-test data. A new order-statistics-based inference is proposed for both parameter and confidence interval estimation. The second problem can be summarised as adopting the inference used in the first problem for fitting the generalised Pareto distribution, especially when the sample size is small. With some modifications, the proposed inference is compared with classical methods and several relatively new methods that have emerged from the recent literature. The third problem studies a distribution-free approach to forecasting financial volatility, which is essentially the standard deviation of financial returns. Classical models of this approach use the interval between two symmetric extreme quantiles of the return distribution as a proxy for volatility. Two new models are proposed, which use intervals of expected shortfalls and intervals of expectiles instead of intervals of quantiles. The different models are compared using empirical stock index data. Finally, attention is drawn to heteroskedastic quantile regression. The proposed joint modelling approach, which makes use of the parametric link between quantile regression and the asymmetric Laplace distribution, can provide estimates of the regression quantile and of the log-linear heteroskedastic scale simultaneously. Furthermore, the use of the expectation of the check function as a measure of quantile deviation is discussed.
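As a sketch of the expectile ingredient of the proposed volatility proxies, a simple fixed-point estimator of a sample expectile; the function name, the default level and the pairing shown in the closing comment are illustrative assumptions:

```python
import numpy as np

def expectile(y, tau=0.975, tol=1e-10, max_iter=200):
    """tau-expectile via a fixed-point iteration: the minimiser of
    asymmetrically weighted squared error around the sample."""
    y = np.asarray(y, dtype=float)
    m = y.mean()
    for _ in range(max_iter):
        w = np.where(y > m, tau, 1.0 - tau)   # asymmetric weights
        m_new = np.sum(w * y) / np.sum(w)
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m

# A volatility proxy from a symmetric pair of extreme expectiles:
# proxy = expectile(returns, 0.975) - expectile(returns, 0.025)
```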
|
67 |
Analysis and processing of mechanically stimulated electrical signals for the identification of deformation in brittle materials / Kyriazis, Panagiotis A. (January 2010)
The fracture of brittle materials is of utmost importance for civil engineering and seismology applications. A different approach towards the aim of early identification of fracture and the prediction of failure before it occurs is attempted in this work. Laboratory experiments were conducted on a variety of rock and cement-based material specimens of various shapes and sizes. The applied loading schemes were cyclic or increasing, and the specimens were subjected to compression and bending-type loading at various levels. The techniques of Pressure Stimulated Current and Bending Stimulated Current were used for the detection of electric signal emissions during the various deformation stages of the specimens. The detected signals were analysed macroscopically and microscopically so as to find suitable criteria for fracture prediction and correlation between the electrical and mechanical parameters. The macroscopic proportionality of the mechanically stimulated electric signal and the strain was experimentally verified, the macroscopic trends of the PSC and BSC electric signals were modelled, and the effects of material memory on the electric signals were examined. The current of a time-varying RLC electric circuit was tested against experimental data with satisfactory results, and it was proposed as an electrical equivalent model. Wavelet-based analysis of the signal revealed the correlation between the frequency components of the electric signal and the deformation stages of the material samples. In particular, the increase of the high-frequency component of the electric signal appears to be a good precursor of the macrocracking initiation point. The additional electric stimulus of a DC voltage application appears to boost the frequency content of the signal and better reveals the stages of the cracking process. The microscopic analysis method is scale-free and can thus cope with the problems of size effects and material-property effects. The AC conductivity time series of fractured and pristine specimens were also analysed by means of the wavelet transform, and spectral analysis was used to differentiate between the specimens. A non-destructive technique may be based on these results. Analysis has shown that the electric signal perturbation is an indicator of forthcoming fracture, as well as of fracture that has already occurred in specimens.
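A sketch of the kind of wavelet detail-band feature described here, assuming the PyWavelets package; the 'db4' wavelet and four decomposition levels are assumptions for illustration, not the configuration used in the thesis:

```python
import numpy as np
import pywt  # PyWavelets

def detail_band_energies(signal, wavelet="db4", level=4):
    """Energy in each wavelet detail band of a 1-D signal.  A rising
    share of energy in the finest bands is the kind of high-frequency
    feature described as a macrocracking precursor."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # coeffs[0] is the approximation; coeffs[1:] are the detail
    # bands, ordered from coarsest to finest
    return [float(np.sum(c ** 2)) for c in coeffs[1:]]
```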
|
68 |
FITTING A DISTRIBUTION TO CATASTROPHIC EVENT / Osei, Ebenezer (15 December 2010)
Statistics is a branch of mathematics that is heavily employed in the area of Actuarial Mathematics. This thesis first reviews the importance of statistical distributions in the analysis of insurance problems and the applications of Statistics in the area of risk and insurance. The Normal, Log-normal, Pareto, Gamma, standard Beta, Fréchet, Gumbel, Weibull, Poisson, binomial, and negative binomial distributions are examined, and the importance of these distributions in general insurance is emphasized. A careful review of the literature provides practitioners in the general insurance industry with statistical tools of immediate application in the industry, including estimation methods and fit statistics popular in the insurance industry. Finally, this thesis fits statistical distributions to flood loss data for the 50 states of the United States.
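A sketch of such a distribution-fitting workflow with SciPy, combining maximum likelihood estimation with a Kolmogorov-Smirnov fit statistic; the candidate list and the decision to anchor each location parameter at zero are illustrative assumptions:

```python
import numpy as np
from scipy import stats

def fit_loss_distributions(losses):
    """Maximum likelihood fits of candidate severity distributions,
    summarised with the Kolmogorov-Smirnov statistic."""
    candidates = {
        "log-normal": stats.lognorm,
        "gamma":      stats.gamma,
        "Pareto":     stats.pareto,
        "Weibull":    stats.weibull_min,
    }
    for name, dist in candidates.items():
        params = dist.fit(losses, floc=0)    # anchor the location at zero
        ks = stats.kstest(losses, dist.name, args=params)
        print(f"{name:10s} KS = {ks.statistic:.3f}  p = {ks.pvalue:.3f}")
```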
|
69 |
Measuring Extremes: Empirical Application on European Markets / Öztürk, Durmuş (January 2015)
This study employs Extreme Value Theory and several univariate methods to compare their Value-at-Risk and Expected Shortfall predictive performance. We conduct several out-of-sample backtesting procedures, such as unconditional coverage, independence and conditional coverage tests. The dataset includes five different stock markets: PX50 (Prague, Czech Republic), BIST100 (Istanbul, Turkey), ATHEX (Athens, Greece), PSI20 (Lisbon, Portugal) and IBEX35 (Madrid, Spain). These markets have different financial histories, and the data span over twenty years. We analyze the global financial crisis period separately to inspect the performance of these methods during the high-volatility period. Our results support the most common finding that Extreme Value Theory is one of the most appropriate risk measurement tools. In addition, we find that the GARCH family of methods, after accounting for asymmetry and the fat-tail phenomenon, can be equally useful and sometimes even better than the Extreme Value Theory based method in terms of risk estimation.

Keywords: Extreme Value Theory, Value-at-Risk, Expected Shortfall, Out-of-Sample Backtesting
Author's e-mail: ozturkdurmus@windowslive.com
Supervisor's e-mail: ies.avdulaj@gmail.com
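For reference, a sketch of the Kupiec proportion-of-failures test, a standard form of the unconditional coverage backtest mentioned in this abstract; the violation count in the example is invented:

```python
import numpy as np
from scipy import stats

def kupiec_pof(violations, n_obs, p=0.01):
    """Kupiec proportion-of-failures (unconditional coverage) test.
    Assumes 0 < violations < n_obs.  Returns the likelihood-ratio
    statistic and its chi-square(1) p-value."""
    x, T = violations, n_obs
    pi_hat = x / T
    ll_null = (T - x) * np.log(1.0 - p) + x * np.log(p)
    ll_alt = (T - x) * np.log(1.0 - pi_hat) + x * np.log(pi_hat)
    lr = -2.0 * (ll_null - ll_alt)
    return lr, stats.chi2.sf(lr, df=1)

lr, pval = kupiec_pof(violations=18, n_obs=1000, p=0.01)  # invented counts
print(f"LR = {lr:.2f}, p-value = {pval:.4f}")
```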
|
70 |
Discrete Parameter Estimation for Rare Events: From Binomial to Extreme Value Distributions / Schneider, Laura Fee (26 April 2019)
No description available.
|