121 |
Development of value at risk measures: towards an extreme value approach. Ganief, Moegamad Shahiem.
Thesis (MBA)--Stellenbosch University, 2001.
ENGLISH ABSTRACT: Commercial banks, investment banks, insurance companies, non-financial firms, and
pension funds hold portfolios of assets that may include stocks, bonds, currencies,
and derivatives. Each institution needs to quantify the amount of risk its portfolio is
exposed to in the course of a day, week, month, or year. Extreme events in financial
markets, such as the stock market crash of October 1987, are central issues in finance
and particularly in risk management and financial regulation.
A method called value at risk (VaR) can be used to estimate market risk. Value at risk
is a powerful measure of risk that is gaining wide acceptance amongst institutions for
the management of market risk. Value at Risk is an estimate of the largest loss that
a portfolio is likely to suffer during all but truly exceptional periods. More precisely,
VaR is the loss that an institution can be confident will be exceeded only a certain
small fraction of the time over a particular period.
The power of the concept is its generality. VaR measures are applicable to entire
portfolios - encompassing many asset categories and multiple sources of risk. As with
its power, the challenge of calculating VaR also stems from its generality. In order to
measure risk in a portfolio using VaR, some means must be found for determining a
return distribution for the portfolio.
There exists a wide range of literature on different methods of implementing VaR.
But, when one attempts to apply the results, several questions remain open. For
example, given a VaR measure, how can the risk manager test that the particular
measure at hand is appropriately specified? And secondly, given two different VaR
measures, how can the risk manager pick the best measure?
Despite the popularity of VaR for measuring market risk, no consensus has yet been reached as to the best method to implement this risk measure. The absence of consensus
is in part derived from the realization that each method currently in use has some
significant drawbacks.
The aim of this project is threefold: to introduce the reader to the concept of VaR;
to present the theoretical basis for the general approaches to VaR computation; and to
introduce and apply Extreme Value Theory to VaR calculations.
The general approaches to VaR computation fall into three categories, namely the Analytic
(Parametric) Approach, the Historical Simulation Approach, and the Monte Carlo Simulation
Approach. Each of these approaches has its strengths and weaknesses, which
will be studied more closely.
The extreme value approach to VaR calculation is a relatively new approach. Since
most observed returns are central ones, traditional VaR methods tend to ignore extreme
events and focus on risk measures that accommodate the whole empirical distribution
of central returns. The danger of this approach is that these models are prone
to fail just when they are needed most - in large market moves, when institutions can
suffer very large losses.
The extreme value approach is a tool that attempts to provide the user with the best
possible estimate of the tail area of the distribution. Even in the absence of useful
historical data, extreme value theory provides guidance on the kind of distribution
that should be selected so that extreme risks are handled conservatively. As an
illustration, the extreme value method will be applied to a foreign exchange futures
contract. The validity of EVT for VaR calculations will be tested by examining the
data of the Rand/Dollar One Year Futures Contracts. An extended worked example
will be provided which attempts to highlight the considerable strengths of
the method as well as its pitfalls and limitations. These results will be compared to
VaR measures calculated using a GARCH(1,1) model.
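As an illustration of the three general approaches described above, the following sketch computes a one-day VaR in Python. The hypothetical daily returns, the 99% confidence level, and the Student-t model used in the Monte Carlo step are assumptions of this example, not choices made in the thesis.

```python
# Minimal sketch of the three general VaR approaches on hypothetical daily returns.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
returns = rng.standard_t(df=4, size=2500) * 0.01   # hypothetical fat-tailed daily returns

alpha = 0.99          # confidence level
value = 1.0           # portfolio value (arbitrary units)

# 1. Analytic (parametric) approach: assume normally distributed returns.
mu, sigma = returns.mean(), returns.std(ddof=1)
var_parametric = -(mu + sigma * stats.norm.ppf(1 - alpha)) * value

# 2. Historical simulation: read the loss quantile off the empirical distribution.
var_historical = -np.quantile(returns, 1 - alpha) * value

# 3. Monte Carlo simulation: simulate returns from a fitted model (here, Student-t).
df_fit, loc_fit, scale_fit = stats.t.fit(returns)
simulated = stats.t.rvs(df_fit, loc=loc_fit, scale=scale_fit, size=100_000, random_state=1)
var_monte_carlo = -np.quantile(simulated, 1 - alpha) * value

print(f"99% VaR  parametric: {var_parametric:.4f}  historical: {var_historical:.4f}  "
      f"Monte Carlo: {var_monte_carlo:.4f}")
```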
|
122 |
'n Ondersoek na die eindige steekproefgedrag van inferensiemetodes in ekstreemwaarde-teorie [An investigation of the finite-sample behaviour of inference methods in extreme value theory]. Van Deventer, Dewald.
Thesis (MComm (Statistics and Actuarial Science))--University of Stellenbosch, 2005.
Extremes are unusual or rare events. However, when such events (for example
earthquakes, tidal waves and market crashes) do take place, they typically cause
enormous losses, both in terms of human lives and monetary value. For this reason,
it is of critical importance to accurately model extremal events. Extreme value theory
entails the development of statistical models and techniques in order to describe and
model such rare observations.
In this document we discuss aspects of extreme value theory. This theory consists of
two approaches: the classical maxima method, based on the properties of the
maximum of a sample, and the more popular threshold theory, based upon the
properties of exceedances of a specified threshold value. This document provides
the practitioner with the theoretical and practical tools for both these approaches.
This will enable him/her to perform extreme value analyses with confidence.
Extreme value theory, for both approaches, is based upon asymptotic arguments.
For finite samples, the limiting result for the sample maximum holds only
approximately. Similarly, for finite choices of the threshold, the limiting distribution for
exceedances of that threshold holds only approximately. In this document we
investigate the quality of extreme value based inferences with regard to the unknown
underlying distribution when the sample size or threshold is finite. Estimation of
extreme tail quantiles of the underlying distribution, as well as the calculation of
confidence intervals, are typically the most important objectives of an extreme value
analysis. For that reason, we evaluate the accuracy of extreme value based inferences in
terms of these estimates. This investigation was carried out using a simulation study,
performed with the software package S-Plus.
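The simulation study described above was carried out in S-Plus; the sketch below is an independent illustration of the same kind of finite-sample check, done in Python. Standard normal data, a 95th-percentile threshold, and the 99.9% quantile as the target are all assumptions of this example.

```python
# Small sketch of a finite-sample experiment: how well does a GPD fit above a finite
# threshold recover a known extreme quantile of the underlying distribution?
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n, n_rep = 1000, 500
p_target = 0.999                      # extreme tail probability of interest
true_quantile = stats.norm.ppf(p_target)

estimates = []
for _ in range(n_rep):
    x = rng.standard_normal(n)
    u = np.quantile(x, 0.95)          # finite threshold choice
    exc = x[x > u] - u                # exceedances over the threshold
    xi, _, beta = stats.genpareto.fit(exc, floc=0.0)
    p_u = exc.size / n                # empirical exceedance probability
    # Peaks-over-threshold quantile formula based on the fitted GPD
    if abs(xi) > 1e-6:
        q_hat = u + (beta / xi) * (((1 - p_target) / p_u) ** (-xi) - 1)
    else:                             # xi ~ 0: exponential tail limit
        q_hat = u + beta * np.log(p_u / (1 - p_target))
    estimates.append(q_hat)

estimates = np.array(estimates)
print(f"true q(0.999) = {true_quantile:.3f}, "
      f"mean estimate = {estimates.mean():.3f}, sd = {estimates.std(ddof=1):.3f}")
```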
|
123 |
Stability of the Financial System: Systemic Dependencies between Bank and Insurance Sectors. Procházková, Jana. January 2014.
The central issue of this thesis is investigating the eventuality of systemic breakdowns in the international financial system through examining systemic dependence between bank and insurance sectors. Standard models of systemic risk often use correlation of stock returns to evaluate the magnitude of interconnectedness between financial institutions. One of the main drawbacks of this approach is that it is oriented towards observations occurring along the central part of the distribution and it does not capture the dependence structure of outlying observations. To account for that, we use methodology which builds on the Extreme Value Theory and is solely focused on capturing dependence in extremes. The analysis is performed using the data on stock prices of the EU largest banks and insurance companies. We study dependencies in the pre-crisis and post-crisis period. The objective is to discover which sector poses a higher systemic threat to the international financial stability. Also, we try to find empirical evidence about an increase in interconnections in recent post-crisis years. We find that in both examined periods systemic dependence in the banking sector is higher than in the insurance sector. Our results also indicate that extremal interconnections in the respective sectors increased,...
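A minimal sketch of the kind of extremal-dependence measure alluded to above: the conditional probability that one sector suffers an extreme loss given that the other does. The two simulated return series and the 5% tail level are hypothetical stand-ins, not the EU bank and insurance data used in the thesis.

```python
# Non-parametric estimate of lower-tail dependence between two sectors' return series.
import numpy as np

rng = np.random.default_rng(7)
n = 2000
z = rng.standard_normal(n)
bank = 0.8 * z + 0.6 * rng.standard_normal(n)       # hypothetical bank-sector returns
insurance = 0.8 * z + 0.6 * rng.standard_normal(n)  # hypothetical insurance-sector returns

def lower_tail_dependence(x, y, q=0.05):
    """Empirical P(Y below its q-quantile | X below its q-quantile)."""
    x_thr, y_thr = np.quantile(x, q), np.quantile(y, q)
    joint = np.mean((x <= x_thr) & (y <= y_thr))
    return joint / q

print("lambda_L at the 5% level:", round(lower_tail_dependence(bank, insurance), 3))
```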
|
124 |
[en] EXTREME VALUE THEORY: VALUE AT RISK FOR VARIABLE-INCOME ASSETS / [pt] TEORIA DOS VALORES EXTREMOS: VALOR EM RISCO PARA ATIVOS DE RENDA VARIÁVEL. Gustavo Lourenço Gomes Pires. 26 June 2008.
[en] Since the 1990s, the use of Value at Risk (VaR)
methodology has been disseminated among both financial and
non-financial institutions around the world, as a good
practice in terms of risk management. The existence of fat
tails is one of the striking stylized facts of financial
returns distributions. This fact makes the use of
traditional parametric models for Value at Risk (VaR)
estimation unsuitable for the estimation of low probability
events. This is because traditional models are based on the
conditional normality assumption for financial returns
distributions. The main purpose of this dissertation is to
investigate the performance of VaR models based on Extreme
Value Theory. The results indicate that Extreme Value
Theory based models are suitable for low probability
VaR estimation.
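To illustrate the point about low-probability VaR, the sketch below compares a normal-based VaR with a peaks-over-threshold VaR on a hypothetical heavy-tailed return series. The 95% threshold, the 99.9% level, and the simulated data are assumptions of this example, not the assets studied in the dissertation.

```python
# Normal VaR tends to understate risk at low tail probabilities relative to a GPD-tail VaR.
import numpy as np
from scipy import stats

returns = stats.t.rvs(df=3, size=5000, random_state=3) * 0.01  # hypothetical heavy-tailed returns
losses = -returns
p = 0.999                                                      # 99.9% VaR

# Normal (delta-normal) VaR
var_normal = losses.mean() + losses.std(ddof=1) * stats.norm.ppf(p)

# EVT (peaks-over-threshold) VaR from a Generalized Pareto fit to losses beyond a threshold
u = np.quantile(losses, 0.95)
exc = losses[losses > u] - u
xi, _, beta = stats.genpareto.fit(exc, floc=0.0)
p_u = exc.size / losses.size
var_evt = u + (beta / xi) * (((1 - p) / p_u) ** (-xi) - 1)

print(f"99.9% VaR  normal: {var_normal:.4f}   EVT/GPD: {var_evt:.4f}")
```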
|
125 |
Local Likelihood Approach for High-Dimensional Peaks-Over-Threshold Inference. Baki, Zhuldyzay. 14 May 2018.
Global warming is affecting the Earth climate year by year, the biggest difference being observable in increasing temperatures in the World Ocean. Following the long-term global ocean warming trend, average sea surface temperatures across the global tropics and subtropics have increased by 0.4–1°C in the last 40 years. These rates become even higher in semi-enclosed southern seas, such as the Red Sea, threatening the survival of thermal-sensitive species. As average sea surface temperatures are projected to continue to rise, careful study of future developments of extreme temperatures is paramount for the sustainability of marine ecosystem and biodiversity. In this thesis, we use Extreme-Value Theory to study sea surface temperature extremes from a gridded dataset comprising 16703 locations over the Red Sea. The data were provided by Operational SST and Sea Ice Analysis (OSTIA), a satellite-based data system designed for numerical weather prediction. After pre-processing the data to account for seasonality and global trends, we analyze the marginal distribution of extremes, defined as observations exceeding a high spatially varying threshold, using the Generalized Pareto distribution. This model allows us to extrapolate beyond the observed data to compute the 100-year return levels over the entire Red Sea, confirming the increasing trend of extreme temperatures. To understand the dynamics governing the dependence of extreme temperatures in the Red Sea, we propose a flexible local approach based on R-Pareto processes, which extend the univariate Generalized Pareto distribution to the spatial setting. Assuming that the sea surface temperature varies smoothly over space, we perform inference based on the gradient score method over small regional neighborhoods, in which the data are assumed to be stationary in space. This approach allows us to capture spatial non-stationarity, and to reduce the overall computational cost by taking advantage of distributed computing resources. Our results reveal an interesting extremal spatial dependence structure: in particular, from our estimated model, we conclude that significant extremal dependence prevails for distances up to about 2500 km, which roughly corresponds to the Red Sea length.
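A sketch of the marginal peaks-over-threshold step described above, for a single hypothetical grid cell. The synthetic anomaly series, the 98th-percentile threshold, and 365 observations per year are assumptions of this example; the spatial R-Pareto and gradient-score inference is not reproduced here.

```python
# Fit a Generalized Pareto distribution to threshold exceedances at one grid cell and
# compute a 100-year return level with the standard GPD return-level formula.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
anomalies = rng.gumbel(loc=0.0, scale=0.4, size=40 * 365)   # ~40 years of daily anomalies

u = np.quantile(anomalies, 0.98)
exc = anomalies[anomalies > u] - u
xi, _, sigma = stats.genpareto.fit(exc, floc=0.0)

zeta_u = exc.size / anomalies.size          # probability of exceeding the threshold
m = 100 * 365                               # observations in a 100-year return period
if abs(xi) > 1e-6:
    r100 = u + (sigma / xi) * ((m * zeta_u) ** xi - 1)
else:                                       # xi ~ 0: exponential tail limit
    r100 = u + sigma * np.log(m * zeta_u)
print(f"estimated 100-year return level: {r100:.2f} (anomaly units)")
```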
|
126 |
Application of Scientific Computing and Statistical Analysis to address Coastal Hazards / Application du Calcul Scientifique et de l'Analyse Statistique à la Gestion du Risque en Milieu Littoral. Chailan, Romain. 23 November 2015.
Studies and management of coastal hazards are of high concern in our society, since they involve highly valuable economic and ecological stakes. Coastal hazards generally arise in response to extreme environmental conditions, so the study of these physical phenomena relies on the understanding of conditions that are rarely (or even never) observed. In coastal areas, waves are the main source of energy. This energy drives coastal hazards that develop at different time scales, such as submersion and erosion. The presented work, at the interface between statistical analysis, geophysics and computer science, aims at bringing forward tools and methods serving decision makers in charge of the management of such risks. In practice, the proposed solutions address these questions spatially rather than at single points, which is more natural given that environmental phenomena, such as sea-wave fields, are spatial. The study of extreme realisations of such processes relies on the availability of a representative data set, in both time and space, allowing information to be extrapolated beyond the actual observations. For sea-wave fields in particular, we use numerical simulation on high-performance computing (HPC) clusters to produce such a data set. The outcome of this work offers many application possibilities. Most notably, we propose from this data set two statistical methodologies, aimed respectively at long-term littoral hazards (e.g., erosion) and event-scale hazards (e.g., submersion). The first is based on so-called max-stable stochastic models, which are particularly adapted to the study of extreme values in a spatial context: in addition to the marginal information, max-stable models take into account the spatial dependence structure of the observed extreme processes. Our results show the benefit of this method over approaches that neglect the spatial dependence of these phenomena when computing risk indices. The second approach is a semi-parametric method for simulating extreme space-time wave fields. These processes, interpreted as storms, are controlled bivariate amplifications of already observed extreme episodes; in other words, we create storms more severe than those already observed. Storms simulated at a controlled intensity can feed littoral physical models in order to describe a very extreme event in both space and time, helping decision makers anticipate hazards not yet observed. Finally, building on these extreme scenarios, we introduce a pre-computing paradigm whose goal is to provide decision makers with accurate, near-real-time information in case of a sudden coastal crisis, without running any physical simulation at that moment. This work fits into a growing industrial demand for modelling support, most notably the chaining of numerical and statistical models. Consequently, the industrial dimension of this PhD is largely dedicated to the design and development of a prototype modelling platform that systematically uses HPC resources to run simulations and eases the chaining of models in a generic way. By addressing questions related to the management of coastal hazards, this thesis demonstrates the benefits of research work placed at the interface between several domains, providing end-users with cutting-edge methods stemming from each of them.
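The sketch below is only a crude stand-in for the storm-amplification idea: an observed wave-height field is uplifted so that its peak reaches a chosen target. The field, the target value, and the purely multiplicative uplift are assumptions of this illustration, not the semi-parametric bivariate method of the thesis.

```python
# Crude illustration of amplifying an already observed extreme episode.
import numpy as np

rng = np.random.default_rng(13)
observed_storm = 4.0 + 2.0 * rng.random((50, 80))   # hypothetical significant wave heights (m)

def amplify_storm(field, target_peak):
    """Uplift an observed field so its maximum equals target_peak, preserving spatial structure."""
    return field * (target_peak / field.max())

synthetic_storm = amplify_storm(observed_storm, target_peak=9.5)
print(round(observed_storm.max(), 2), "->", round(synthetic_storm.max(), 2))
```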
|
127 |
Spin-glass models and interdisciplinary applications. Zarinelli, Elia. 13 January 2012.
The main subject of this thesis is the physics of spin glasses. Spin glasses were introduced in the early 1970s to describe dilute magnetic alloys. They have since been considered as a way to understand the behaviour of supercooled liquids. Combinatorial optimisation problems are among the systems that can be described in the language of disordered systems. In the first part of this thesis, we consider spin-glass models with Kac interactions to investigate the low-temperature phase of supercooled liquids. In the following chapters, we show how certain features of spin-glass models can be obtained from results of random matrix theory in connection with extreme value statistics. In the last part of the thesis, we consider the connection between spin-glass theory and computational science, and present a new algorithm that can be applied to certain problems in finance.
|
128 |
Inferences for the Weibull parameters based on interval-censored data and its application. Huang, Jinn-Long. 19 June 2000.
In this article, we make inferences for the Weibull parameters and propose two test statistics for the comparison of two Weibull
distributions based on interval-censored data. However, the distributions of the two statistics are unknown and not easy to obtain; a simulation study is therefore necessary. An urn model
for the simulation of interval-censored data was proposed by Lee (1999) to select random intervals. We then propose a simulation
procedure based on this urn model to obtain approximate quantiles of the two statistics. We demonstrate an example from an AIDS study to
illustrate how the tests can be applied to the infection time distributions of AIDS.
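A minimal sketch of likelihood inference for the Weibull parameters from interval-censored data, where each observation is known only to fall in an interval (l, r] and contributes F(r) - F(l) to the likelihood. The binning scheme and the true parameter values are hypothetical; the urn-model interval selection and the two test statistics are not reproduced.

```python
# Maximum-likelihood estimation of Weibull shape and scale from interval-censored data.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

t = weibull_min.rvs(c=1.5, scale=2.0, size=300, random_state=5)  # latent event times
left = np.floor(t / 0.5) * 0.5                                    # observed only as 0.5-wide bins
right = left + 0.5

def neg_log_likelihood(params):
    shape, scale = np.exp(params)                 # optimise on the log scale to keep parameters positive
    cdf_r = weibull_min.cdf(right, c=shape, scale=scale)
    cdf_l = weibull_min.cdf(left, c=shape, scale=scale)
    return -np.sum(np.log(np.clip(cdf_r - cdf_l, 1e-300, None)))

res = minimize(neg_log_likelihood, x0=np.log([1.0, 1.0]), method="Nelder-Mead")
shape_hat, scale_hat = np.exp(res.x)
print(f"MLE shape = {shape_hat:.3f}, scale = {scale_hat:.3f} (true values 1.5 and 2.0)")
```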
|
129 |
Extreme behavior and VaR of Short-term interest rate of Taiwan. Chiang, Ming-Chu. 21 July 2008.
The current study empirically analyzes the extreme behavior of changes in the Taiwan short-term interest rate and the impact of deregulation policies as well as financial turmoil on that behavior. A better knowledge of short-term interest rate properties, such as heavy tails, asymmetry, and uneven tail fatness between the right and left tails, provides insight into the extreme behavior of the short-term rate as well as a more accurate estimation of interest rate risk. The predictive performance of filtered and unfiltered VaR (Value at Risk) models is also examined to suggest suitable models for the management of interest rate risk. By applying Extreme Value Theory (EVT), tail behavior is analyzed and tested, and VaR is calculated from parametric and non-parametric EVT models. The empirical findings show that, first, the distribution of rate changes is heavy-tailed, indicating that the actual risk would be underestimated under a normality assumption. Second, the unconditional distribution is consistent with heavier-tailed distributions such as an ARCH process or Student's t. Third, the right tail of the distribution of rate changes is significantly heavier than the left one, pointing out that the probabilities and magnitudes of rises in the rate could be higher than those of drops. Fourth, the amount of tail fatness in the distribution of rate changes increases after 1999, and the vital factors causing the structural break in the tail index are the interest rate policies taken by the central bank of Taiwan rather than the deregulation policies in the money market. Fifth, based on the two break points found in the tail indices of the right and left tails, a long sample of CP rates should not be treated as a sample from a single distribution. Sixth, the dependence and heteroscedasticity of the data series should be taken into account when applying EVT, to improve the accuracy of VaR forecasts. Finally, EVT models predict VaR accurately before 2001, and the benchmark models, HS and GARCH, are generally superior to EVT models after 2001. Among the EVT models, MRE and CHE are relatively consistent and reliable in VaR prediction.
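A sketch of one way to compare right- and left-tail heaviness using the Hill estimator of the tail index. The simulated rate changes and the choice of k are assumptions of this example; the filtered VaR models and structural-break tests are not reproduced.

```python
# Hill tail-index estimates for the right tail (rate rises) and left tail (rate drops).
import numpy as np
from scipy import stats

changes = stats.t.rvs(df=4, size=3000, random_state=9) * 0.05   # hypothetical daily rate changes

def hill_index(x, k=150):
    """Hill estimate of the tail index alpha from the k largest positive values of x."""
    top = np.sort(x[x > 0])[-(k + 1):]          # k largest values plus the threshold order statistic
    gamma = np.mean(np.log(top[1:] / top[0]))   # Hill estimate of 1/alpha
    return 1.0 / gamma

alpha_right = hill_index(changes)        # right tail: rises in the rate
alpha_left = hill_index(-changes)        # left tail: drops in the rate
print(f"Hill tail index  right: {alpha_right:.2f}   left: {alpha_left:.2f}")
```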
|
130 |
An assessment of uncertainties and limitations in simulating tropical cyclone climatology and future changes. Suzuki-Parker, Asuka. 04 May 2011.
The recent elevated North Atlantic hurricane activity has generated considerable interest in the interaction between tropical cyclones (TCs) and climate change. A possible connection between TCs and the changing climate has been suggested by observational studies based on historical TC records, which indicate emerging trends in TC frequency and intensity in some TC basins, but the detection of trends has been hotly debated due to TC track data issues. Dynamical climate modeling has also been applied to the problem, but brings its own set of limitations owing to limited model resolution and uncertainties.
The final goal of this study is to project the future changes of North Atlantic TC behavior with global warming for the next 50 years using the Nested Regional Climate Model (NRCM). Throughout the course of reaching this goal, various uncertainties and limitations in simulating TCs by the NRCM are identified and explored.
First, we examine the tracking algorithm used to detect and track simulated TCs in model output. The criteria and thresholds used in the tracking algorithm control the simulated TC climatology, making it difficult to objectively assess the model's ability to simulate TC climatology. Existing tracking algorithms used in previous studies are surveyed, and their criteria and thresholds are found to be very diverse. The simulated TC climatology is highly sensitive to the choice of criteria and thresholds in the tracking algorithm, especially the intensity and duration thresholds. It is found that the commonly used criteria may not be strict enough to filter out intense extratropical and hybrid systems. We propose that a better distinction between TCs and other low-pressure systems can be achieved by adding the cyclone phase technique.
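Purely for illustration, the sketch below applies the kind of intensity, pressure, warm-core and duration thresholds discussed above to a candidate track. The specific threshold values are hypothetical stand-ins, not those of the NRCM tracking algorithm.

```python
# Toy threshold-based detection check for a candidate tropical cyclone track.
def is_tropical_cyclone(max_wind_ms, sea_level_pressure_hpa, warm_core_anomaly_k,
                        duration_hours, min_wind=17.0, max_slp=1005.0,
                        min_warm_core=1.0, min_duration=48.0):
    """Apply simple intensity, pressure, warm-core and duration thresholds to a candidate."""
    return (max_wind_ms >= min_wind
            and sea_level_pressure_hpa <= max_slp
            and warm_core_anomaly_k >= min_warm_core
            and duration_hours >= min_duration)

# Example: a weak warm-core low rejected only by the duration threshold
print(is_tropical_cyclone(21.0, 1002.0, 1.5, duration_hours=24.0))
```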
Two sets of NRCM simulations are presented in this dissertation: One in the hindcasting mode, and the other with forcing from the Community Climate System Model (CCSM) to project into the future with global warming. Both of these simulations are assessed using the tracking algorithm with cyclone phase technique.
The NRCM is run in a hindcasting mode for the global tropics in order to assess its ability to simulate the current observed TC climatology. It is found that the NRCM is capable of capturing the general spatial and temporal distributions of TCs, but tends to overproduce TCs, particularly in the Northwest Pacific. The overprediction of TCs is associated with the overall convective tendency in the model, combined with a prominent theory of wave energy accumulation leading to TC genesis. On the other hand, TC frequency in the tropical North Atlantic is underpredicted due to the lack of moist African Easterly Waves. The importance of high resolution is shown with an additional simulation using two-way nesting.
The NRCM is then forced by the CCSM to project the future changes in North Atlantic TCs. An El Nino-like SST bias in the CCSM induced high vertical wind shear in the tropical North Atlantic, preventing TCs from forming in this region. A simple bias correction method is applied to remove this bias. The model projects an increase in both TC frequency and intensity owing to enhanced TC genesis in the main development region, where the large-scale environment is projected to become more favorable for TC genesis. However, the model is not capable of explicitly simulating intense (Category 3-5) storms due to the limited model resolution. To extrapolate the prediction to intense storms, we propose a hybrid approach that combines the model results with statistical modeling based on extreme value theory. Specifically, the current observed TC intensity is statistically modeled with the Generalized Pareto distribution, and the simulated intensity changes from the NRCM are applied to the statistical model to project the changes in intense storms. The results suggest that the occurrence of Category 5 storms may increase by approximately 50% by 2055.
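A hedged sketch of the hybrid extrapolation logic described above: fit a Generalized Pareto distribution to intensities above a hurricane-strength threshold, apply a projected intensity change, and compare Category 5 exceedance probabilities. All numbers here (the threshold, the synthetic "observed" intensities, the Category 5 cutoff, and the 2 m/s shift) are hypothetical, not values from the dissertation.

```python
# GPD-based extrapolation of the change in Category 5 exceedance probability.
import numpy as np
from scipy import stats

u = 33.0                                                # threshold: hurricane-strength winds (m/s)
obs_exceed = u + stats.genpareto.rvs(c=-0.2, scale=12.0, size=400, random_state=21)  # synthetic intensities

xi, _, beta = stats.genpareto.fit(obs_exceed - u, floc=0.0)
cat5 = 70.0                                             # assumed Category 5 wind threshold (m/s)
p_now = stats.genpareto.sf(cat5 - u, c=xi, scale=beta)  # P(intensity > Cat 5 | intensity > u), current

delta = 2.0                                             # hypothetical projected mean intensity increase (m/s)
p_future = stats.genpareto.sf(cat5 - u - delta, c=xi, scale=beta)
print(f"relative change in Cat-5 exceedance probability: {p_future / p_now - 1:+.1%}")
```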
|