  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
91

Brown-Resnick Processes: Analysis, Inference and Generalizations

Engelke, Sebastian 14 December 2012 (has links)
No description available.
92

Monte Carlo Simulation Based Response Estimation and Model Updating in Nonlinear Random Vibrations

Radhika, Bayya January 2012 (has links) (PDF)
The study of randomly excited nonlinear dynamical systems forms the focus of this thesis. We discuss two classes of problems: first, the characterization of the nonlinear random response of the system before it comes into existence, and second, the assimilation of measured responses into the mathematical model of the system after the system comes into existence. The first class constitutes forward problems, while the second belongs to the class of inverse problems. An outstanding feature of these problems is that they are almost always not amenable to exact solutions. We tackle in the present study these two classes of problems using Monte Carlo simulation tools in conjunction with Markov process theory, Bayesian model updating strategies, and particle filtering based dynamic state estimation methods. It is well recognized in the literature that any successful application of Monte Carlo simulation methods to practical problems requires the simulation methods to be reinforced with effective means of controlling sampling variance. This can be achieved by incorporating any problem-specific qualitative and (or) quantitative information that one might have about system behavior in formulating estimators for response quantities of interest. In the present thesis we outline two such approaches for variance reduction. The first of these approaches employs a substructuring scheme, which partitions the system states into two sets such that the probability distribution of the states in one of the sets conditioned on the other set becomes amenable to exact analytical solution. In the second approach, results from data based asymptotic extreme value analysis are employed to tackle problems of time variant reliability analysis and updating of this reliability. We exemplify in this thesis the proposed approaches for response estimation and model updating by considering wide ranging problems of interest in structural engineering, namely, nonlinear response and reliability analyses under stationary and (or) nonstationary random excitations, response sensitivity model updating, force identification, residual displacement analysis in instrumented inelastic structures under transient excitations, problems of dynamic state estimation in systems with local nonlinearities, and time variant reliability analysis and reliability model updating.

We have organized the thesis into eight chapters and three appendices. A resume of the contents of these chapters and appendices follows. In the first chapter we aim to provide an overview of the mathematical tools which form the basis for the investigations reported in the thesis. The starting point of the study is taken to be a set of coupled stochastic differential equations, which are obtained after discretizing spatial variables, typically, based on application of finite element methods.
Accordingly, we provide a summary of the following topics: (a) the Markov vector approach for characterizing the time evolution of transition probability density functions, which includes the forward and backward Kolmogorov equations, (b) the equations governing the time evolution of response moments and first passage times, (c) numerical discretization of the governing stochastic differential equations using the Ito-Taylor expansion, (d) the partial differential equation governing the time evolution of transition probability density functions conditioned on measurements for the study of existing instrumented structures, (e) the time evolution of response moments conditioned on measurements based on the governing equations in (d), and (f) functional recursions for the evolution of the multidimensional posterior probability density function and posterior filtering density function, when the time variable is also discretized. The objective of the description here is to provide an outline of the theoretical formulations within which the problems of response estimation and model updating are formulated in the subsequent chapters of the present thesis. We briefly state the class of problems which are amenable to exact solutions. We also list in this chapter the major textbooks, research monographs, and review papers relevant to the topics of nonlinear random vibration analysis and dynamic state estimation.

In Chapter 2 we provide a review of the literature on solutions of problems of response analysis and model updating in nonlinear dynamical systems. The main focus of the review is on Monte Carlo simulation based methods for tackling these problems. The review accordingly covers numerical methods for approximate solutions of Kolmogorov equations and associated moment equations, variance reduction in simulation based analysis of Markovian systems, dynamic state estimation methods based on the Kalman filter and its variants, particle filtering, and variance reduction based on Rao-Blackwellization. In this review we chiefly cover papers that have contributed to the growth of the methodology. We also briefly cover the efforts made in applying the ideas to structural engineering problems. Based on this review, we identify variance reduction using substructuring schemes and data based extreme value analysis, and their incorporation into response estimation and model updating strategies, as problems requiring further research attention. We also identify a range of problems where these tools could be applied.

We consider the development of a sequential Monte Carlo scheme, which incorporates a substructuring strategy, for the analysis of nonlinear dynamical systems under random excitations in Chapter 3. The proposed substructuring ensures that a part of the system states conditioned on the remaining states becomes Gaussian distributed and is amenable to an exact analytical solution. The use of Monte Carlo simulations is subsequently limited to the analysis of the remaining system states. This clearly results in a reduction in sampling variance since a part of the problem is tackled analytically in an exact manner. The successful performance of the proposed approach is illustrated by considering response analysis of a single degree of freedom nonlinear oscillator under random excitations. Arguments based on the variance decomposition result and the Rao-Blackwell theorem are presented to demonstrate that the proposed variance reduction indeed is effective.
In Chapter 4, we modify the sequential Monte Carlo simulation strategy outlined in the preceding chapter to incorporate questions of dynamic state estimation when data on measured responses become available. Here too, the system states are partitioned into two groups such that the states in one group become Gaussian distributed when conditioned on the states in the other group. The conditioned Gaussian states are subsequently analyzed exactly using the Kalman filter, and this is interfaced with the analysis of the remaining states using a sequential importance sampling based filtering strategy. The development of this combined Kalman and sequential importance sampling filtering method constitutes one of the novel elements of this study. The proposed strategy is validated by considering the problem of dynamic state estimation in linear single and multi-degree of freedom systems for which exact analytical solutions exist.

In Chapter 5, we consider the application of the tools developed in Chapter 4 to a class of wide ranging problems in nonlinear random vibrations of existing systems. The nonlinear systems considered include single and multi-degree of freedom systems, systems with memoryless and hereditary nonlinearities, and stationary and nonstationary random excitations. The specific applications considered include nonlinear dynamic state estimation in systems with local nonlinearities, estimation of residual displacement in instrumented inelastic dynamical systems under transient random excitations, response sensitivity model updating, and identification of transient seismic base motions based on measured responses in inelastic systems. Comparisons of solutions from the proposed substructuring scheme with corresponding results from direct application of particle filtering are made and a satisfactory mutual agreement is demonstrated.

We consider next questions of time variant reliability analysis and corresponding model updating in Chapters 6 and 7, respectively. The research effort in these studies is focused on exploring the application of data based asymptotic extreme value analysis to the problems at hand. Accordingly, we investigate the reliability of nonlinear vibrating systems under stochastic excitations in Chapter 6 using a two-stage Monte Carlo simulation strategy. For systems with white noise excitation, the governing equations of motion are interpreted as a set of Ito stochastic differential equations. It is assumed that the probability distribution of the maximum over a specified time duration in the steady state response belongs to the basin of attraction of one of the classical asymptotic extreme value distributions. The first stage of the solution strategy consists of selecting the form of the extreme value distribution based on hypothesis testing, and the next stage involves the estimation of parameters of the relevant extreme value distribution. Both these stages are implemented using data from limited Monte Carlo simulations of the system response. The proposed procedure is illustrated with examples of linear/nonlinear systems with single/multiple degrees of freedom driven by random excitations. The predictions from the proposed method are compared with the results from large scale Monte Carlo simulations, and also with the classical analytical results, when available, from the theory of out-crossing statistics. Applications of the proposed method to vibration data obtained under laboratory conditions are also discussed.
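To make the two-stage strategy described for Chapter 6 concrete, the following is a minimal Python sketch, not taken from the thesis: it assumes a hypothetical Duffing-type oscillator under white noise, skips the hypothesis-testing stage by simply adopting a Gumbel model for the simulated peak responses, and estimates the probability of exceeding an arbitrary safety threshold.

# Hypothetical two-stage extreme-value reliability estimate:
# (1) run a limited number of Monte Carlo simulations of a randomly excited
#     oscillator and record the peak response of each run;
# (2) fit a classical extreme value (here: Gumbel) model to those peaks and
#     estimate the probability of exceeding a safety threshold.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def peak_response(dt=0.01, n_steps=2000, sigma_f=1.0):
    """Peak |displacement| of a Duffing-type oscillator under white noise
    (Euler-Maruyama discretization); purely illustrative parameters."""
    x, v, peak = 0.0, 0.0, 0.0
    for _ in range(n_steps):
        a = -0.1 * v - x - 0.5 * x**3 + sigma_f * rng.normal() / np.sqrt(dt)
        x, v = x + v * dt, v + a * dt
        peak = max(peak, abs(x))
    return peak

# Stage 1: limited Monte Carlo runs (a few hundred samples of the peak response)
peaks = np.array([peak_response() for _ in range(200)])

# Stage 2: fit a Gumbel model to the sampled peaks (the thesis selects the
# family by hypothesis testing; here the Gumbel form is simply assumed)
loc, scale = stats.gumbel_r.fit(peaks)

threshold = 3.0  # hypothetical safe displacement level
p_fail = stats.gumbel_r.sf(threshold, loc, scale)
print(f"Estimated exceedance probability P(peak > {threshold}) = {p_fail:.2e}")

In the thesis the distribution family is selected by hypothesis testing rather than assumed, and the limited-run estimate is checked against large-scale Monte Carlo simulation; the sketch only illustrates the overall shape of the two-stage calculation.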
In Chapter 7 we consider the problem of time variant reliability analysis of existing structures subjected to stationary random dynamic excitations. Here we assume that samples of the dynamic response of the structure, under the action of external excitations, have been measured at a set of sparse points on the structure. The utilization of these measurements in updating reliability models, postulated prior to making any measurements, is considered. This is achieved by using dynamic state estimation methods which combine results from Markov process theory and Bayes' theorem. The uncertainties present in the measurements as well as in the postulated model for the structural behaviour are accounted for. The samples of external excitations are taken to emanate from known stochastic models, and allowance is made for the ability (or lack of it) to measure the applied excitations. The future reliability of the structure is modeled using the expected structural response conditioned on all the measurements made. This expected response is shown to have a time varying mean and a random component that can be treated as being weakly stationary. For linear systems, an approximate analytical solution for the problem of reliability model updating is obtained by combining theories of the discrete Kalman filter and level crossing statistics. For the case of nonlinear systems, the problem is tackled by combining particle filtering strategies with data based extreme value analysis. The possibility of using conditional simulation strategies, when applied external actions are measured, is also considered. The proposed procedures are exemplified by considering the reliability analysis of a few low dimensional dynamical systems based on synthetically generated measurement data. The performance of the procedures developed is also assessed based on a limited amount of pertinent Monte Carlo simulations.

A summary of the contributions made and a few suggestions for future work are presented in Chapter 8. The thesis also contains three appendices. Appendix A provides details of the order 1.5 strong Taylor scheme that is extensively employed at several places in the thesis. The formulary pertaining to the bootstrap and sequential importance sampling particle filters is provided in Appendix B. Some of the results on characterizing conditional probability density functions that have been used in the development of the combined Kalman and sequential importance sampling filter in Chapter 4 are elaborated in Appendix C.
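As a rough illustration of the bootstrap filter whose formulary Appendix B is said to contain, the sketch below implements sequential importance resampling for a generic, hypothetical scalar nonlinear state-space model; it is not the combined Kalman and sequential importance sampling filter developed in Chapter 4.

# A minimal sketch of the bootstrap (sequential importance resampling) particle
# filter. The state-space model below is an illustrative stand-in, not one of
# the oscillator models used in the thesis.
import numpy as np

rng = np.random.default_rng(1)

def propagate(x):
    # hypothetical nonlinear state transition with additive process noise
    return 0.9 * x + 0.2 * np.sin(x) + 0.3 * rng.normal(size=x.shape)

def likelihood(y, x, sigma_meas=0.5):
    # measurement model: y = x + Gaussian noise
    return np.exp(-0.5 * ((y - x) / sigma_meas) ** 2)

def bootstrap_filter(observations, n_particles=500):
    particles = rng.normal(0.0, 1.0, n_particles)      # samples from the initial prior
    estimates = []
    for y in observations:
        particles = propagate(particles)               # sample from the transition prior
        weights = likelihood(y, particles)             # weight by the measurement likelihood
        weights /= weights.sum()
        estimates.append(np.sum(weights * particles))  # posterior mean estimate
        # multinomial resampling to avoid weight degeneracy
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx]
    return np.array(estimates)

# synthetic data generated from the same (assumed) model
true_x, ys = 0.5, []
for _ in range(50):
    true_x = 0.9 * true_x + 0.2 * np.sin(true_x) + 0.3 * rng.normal()
    ys.append(true_x + 0.5 * rng.normal())
print(bootstrap_filter(np.array(ys))[:5])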
93

Modeling Extreme Values / Modelování extrémních hodnot

Shykhmanter, Dmytro January 2013 (has links)
Modeling of extreme events is a challenging statistical task. Firstly, there is always a limited number of observations, and secondly, there is therefore little experience with which to back-test the results. One way of estimating higher quantiles is to fit one of the theoretical distributions to the data and extrapolate to the tail. The shortcoming of this approach is that the estimate of the tail is based on observations in the center of the distribution. An alternative approach is based on the idea of splitting the data into two sub-populations and modeling the body of the distribution separately from the tail. This methodology is applied to non-life insurance losses, where extremes are particularly important for risk management. Nevertheless, even this approach is not a conclusive solution to heavy-tail modeling. In either case, the estimated 99.5% percentiles have such high standard errors that their reliability is very low. On the other hand, this approach is theoretically valid and deserves to be considered as one of the possible methods of extreme value analysis.
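A minimal sketch of the splitting idea described in the abstract, using synthetic losses in place of real non-life insurance data: the body of the distribution is fitted with an ordinary parametric model and the tail with a Generalized Pareto distribution (GPD) above a high threshold, from which the 99.5th percentile is read off. The threshold choice and distribution families are illustrative assumptions, not those of the thesis.

# Split ("spliced") body/tail model: lognormal body below a high threshold,
# GPD for the excesses above it, and the 99.5% percentile from the tail part.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
losses = rng.lognormal(mean=8.0, sigma=1.2, size=5000)  # hypothetical claims

# Split point: an empirical high quantile of the observed losses
p_u = 0.90
u = np.quantile(losses, p_u)

# Body: any standard parametric model fitted to losses below the threshold
# (used for quantiles below p_u; not needed for the 99.5% figure below)
body_shape, body_loc, body_scale = stats.lognorm.fit(losses[losses <= u], floc=0)

# Tail: GPD fitted to the excesses over the threshold
excesses = losses[losses > u] - u
xi, _, sigma = stats.genpareto.fit(excesses, floc=0)

# 99.5% percentile from the spliced model (p > p_u, so it lies in the tail)
p = 0.995
q_995 = u + stats.genpareto.ppf((p - p_u) / (1 - p_u), xi, scale=sigma)
print(f"threshold u = {u:,.0f}, estimated 99.5% percentile = {q_995:,.0f}")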
94

Rozdělení extrémních hodnot a jejich aplikace / Extreme Value Distributions with Applications

Fusek, Michal January 2013 (has links)
The thesis is focused on extreme value distributions and their applications. Firstly, the basics of extreme value theory for one-dimensional observations are summarized. Using the limit theorem for the distribution of the maximum, three extreme value distributions (Gumbel, Fréchet, Weibull) are introduced and their domains of attraction are described. Two models for the estimation of parametric functions, based on the generalized extreme value distribution (block maxima model) and the generalized Pareto distribution (threshold model), are introduced. Parameter estimates of these distributions are derived using the method of maximum likelihood and the probability weighted moment method. The described methods are used for the analysis of rainfall data in the Brno Region. Further attention is paid to the Gumbel class of distributions, which is frequently used in practice. Methods for statistical inference of multiply left-censored samples from the exponential and Weibull distributions under type I censoring are developed and subsequently used in the analysis of synthetic musk compound concentrations. The last part of the thesis deals with extreme value theory for two-dimensional observations. Demonstration software for the extreme value distributions was developed as a part of this thesis.
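A minimal sketch of the block maxima model mentioned above, fitted by maximum likelihood with SciPy on synthetic annual maxima (the thesis applies the same machinery to Brno rainfall data and also uses probability weighted moments). Note that SciPy's shape parameter c corresponds to minus the usual shape parameter of the GEV.

# Fit the generalized extreme value (GEV) distribution to annual maxima by
# maximum likelihood and compute a T-year return level.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
annual_maxima = stats.genextreme.rvs(c=-0.1, loc=50.0, scale=10.0,
                                     size=60, random_state=rng)  # 60 synthetic "years"

c_hat, loc_hat, scale_hat = stats.genextreme.fit(annual_maxima)

T = 100  # return period in years
return_level = stats.genextreme.ppf(1.0 - 1.0 / T, c_hat, loc_hat, scale_hat)
print(f"Estimated {T}-year return level: {return_level:.1f}")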
95

Updating Rainfall Intensity-Duration-Frequency Curves in Sweden Accounting for the Observed Increase in Rainfall Extremes / Uppdatering av Intensitets-Varaktighetskurvor i Sverige med hänsyn till observerade ökande trender av extrem nederbörd

Eckersten, Sofia January 2016 (has links)
Increased extreme precipitation has been documented in many regions around the world, including central and northern Europe. Global warming increases average temperature, which in turn enhances atmospheric water holding capacity. These changes are believed to increase the frequency and/or intensity of extreme precipitation events. In determining the design storm, or a worst probable storm, for infrastructure design and failure risk assessment, experts commonly assume that statistics of extreme precipitation do not change significantly over time. This so-called notion of stationarity assumes that the statistics of future extreme precipitation events will be similar to those of historical observations. This study investigates the consequences of using a stationary assumption as well as the alternative: a non-stationary framework that considers temporal changes in statistics of extremes. Here we evaluate stationary and non-stationary return levels for 10-year to 50-year extreme precipitation events for different durations (1-day, 2-day, ..., 7-day precipitation events), based on the observed daily precipitation from Sweden. Non-stationary frequency analysis is only considered for stations with statistically significant trends over the past 50 years at 95% confidence (i.e., 15 to 39 % out of 139 stations, depending on duration, 1-day, 2-day, ..., 7-day). We estimate non-stationary return levels using the Generalized Extreme Value distribution with time-dependent parameters, inferred using a Bayesian approach. The estimated return levels are then compared in terms of duration, recurrence interval and location. The results indicate that a stationary assumption might, when a significant trend exists, underestimate extreme precipitation return levels by up to 40 % in Sweden. This report highlights the importance of considering better methods for estimating the recurrence interval of extreme events in a changing climate. This is particularly important for infrastructure design and risk reduction. / Ökad extrem nederbörd har dokumenterats globalt, däribland centrala och norra Europa. Den globala uppvärmningen medför en förhöjd medeltemperatur vilket i sin tur ökar avdunstning av vatten från ytor samt atmosfärens förmåga att hålla vatten. Dessa förändringar tros kunna öka och intensifiera nederbörd. Vid bestämning av dimensionerande nederbördsintensiteter för byggnationsprojekt antas idag att frekvensen och storleken av extrem nederbörd inte kommer att förändras i framtiden (stationäritet), vilket i praktiken innebär ingen förändring i klimatet. Den här studien syftar till att undersöka effekten av ett icke-stationärt antagande vid skattning av dimensionerande nederbördsintensitet. Icke-stationära och stationära nederbördsintensiteter för återkomsttider mellan 10 och 100 år bestämdes utifrån daglig och flerdaglig svensk nederbördsdata. Nederbördsintensiteterna bestämdes med extremvärdesanalys i mjukvaran NEVA, där den generella extremvärdesfördelningen anpassades till årlig maximum nederbörd på platser i Sverige som påvisade en ökande trend under de senaste 50 åren (15 % till 39 % utav 139 stationer, beroende på varaktighet). De dimensionerande nederbördsintensiteterna jämfördes sedan med avseende på varaktighet, återkomsttid och plats. Resultaten indikerade på att ett stationärt antagande riskerar att underskatta dimensionerande nederbördsintensiteter för en viss återkomsttid med upp till 40 %.
Detta indikerar att antagandet om icke-stationäritet har större betydelse för olika platser i Sverige, vilket skulle kunna ge viktig information vid bestämning av dimensionerande regnintensiteter.
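A minimal sketch of a non-stationary GEV model with a linear trend in the location parameter, mu(t) = mu0 + mu1*t, fitted here by maximum likelihood on synthetic annual maxima; the study itself infers the time-dependent parameters with a Bayesian approach (the NEVA software), so this only illustrates the model structure.

# Non-stationary GEV: linear trend in the location parameter, fitted by
# maximizing the GEV log-likelihood with scipy.optimize. Data are synthetic
# annual maxima with an imposed upward trend.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(3)
t = np.arange(50)                                   # 50 "years"
x = stats.genextreme.rvs(c=-0.1, loc=30 + 0.2 * t, scale=5.0,
                         size=t.size, random_state=rng)

def neg_log_lik(params, x, t):
    mu0, mu1, log_sigma, xi = params
    sigma = np.exp(log_sigma)
    z = (x - (mu0 + mu1 * t)) / sigma
    s = 1.0 + xi * z
    if np.any(s <= 0):
        return np.inf                               # outside the GEV support
    return np.sum(np.log(sigma) + (1.0 + 1.0 / xi) * np.log(s) + s ** (-1.0 / xi))

res = optimize.minimize(neg_log_lik, x0=[np.mean(x), 0.0, np.log(np.std(x)), 0.1],
                        args=(x, t), method="Nelder-Mead")
mu0, mu1, log_sigma, xi = res.x

# Effective 50-year return level in the final year of record (t = 49)
rl = (mu0 + mu1 * 49) + np.exp(log_sigma) / xi * ((-np.log(1 - 1 / 50)) ** (-xi) - 1)
print(f"trend mu1 = {mu1:.3f} per year, 50-year level at t=49: {rl:.1f}")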
96

Development of value at risk measures : towards an extreme value approach

Ganief, Moegamad Shahiem 12 1900 (has links)
Thesis (MBA)--Stellenbosch University, 2001. / ENGLISH ABSTRACT: Commercial banks, investment banks, insurance companies, non-financial firms, and pension funds hold portfolios of assets that may include stocks, bonds, currencies, and derivatives. Each institution needs to quantify the amount of risk its portfolio is exposed to in the course of a day, week, month, or year. Extreme events in financial markets, such as the stock market crash of October 1987, are central issues in finance and particularly in risk management and financial regulation. A method called value at risk (VaR) can be used to estimate market risk. Value at risk is a powerful measure of risk that is gaining wide acceptance amongst institutions for the management of market risk. Value at Risk is an estimate of the largest loss that a portfolio is likely to suffer during all but truly exceptional periods. More precisely, the VaR is the maximum loss that an institution can be confident it will not exceed more than a certain fraction of the time over a particular period. The power of the concept is its generality. VaR measures are applicable to entire portfolios - encompassing many asset categories and multiple sources of risk. As with its power, the challenge of calculating VaR also stems from its generality. In order to measure risk in a portfolio using VaR, some means must be found for determining a return distribution for the portfolio. There exists a wide range of literature on different methods of implementing VaR. But, when one attempts to apply the results, several questions remain open. For example, given a VaR measure, how can the risk manager test that the particular measure at hand is appropriately specified? And secondly, given two different VaR measures, how can the risk manager pick the best measure? Despite the popularity of VaR for measuring market risk, no consensus has yet been reached as to the best method to implement this risk measure. The absence of consensus is in part derived from the realization that each method currently in use has some significant drawbacks. The aim of this project is threefold: to introduce the reader to the concept of VaR; to present the theoretical basis for the general approaches to VaR computations; and to introduce and apply Extreme Value Theory to VaR calculations. The general approaches to VaR computation fall into three categories, namely, the Analytic (Parametric) Approach, the Historical Simulation Approach, and the Monte Carlo Simulation Approach. Each of these approaches has its strengths and weaknesses, which we will study more closely. The extreme value approach to VaR calculation is a relatively new approach. Since most observed returns are central ones, traditional VaR methods tend to ignore extreme events and focus on risk measures that accommodate the whole empirical distribution of central returns. The danger of this approach is that these models are prone to fail just when they are needed most - in large market moves, when institutions can suffer very large losses. The extreme value approach is a tool that attempts to provide the user with the best possible estimate of the tail area of the distribution. Even in the absence of useful historical data, extreme value theory provides guidance on the kind of distribution that should be selected so that extreme risks are handled conservatively. As an illustration, the extreme value method will be applied to a foreign exchange futures contract.
The validity of EVT for VaR calculations will be tested by examining the data of the Rand/Dollar One Year Futures Contracts. An extended worked example will be provided which attempts to highlight the considerable strengths of the methods as well as the pitfalls and limitations. These results will be compared to VaR measures calculated using a GARCH(1,1) model. / AFRIKAANSE OPSOMMING: Handelsbanke, aksepbanke, assuransiemaatskappye, nie-finansiële instellings en pensioenfondse beskik oor portefeuljes van finansiële bates soos aandele, effekte, geldeenhede en afgeleides. Elke instelling moet die omvang kan bepaal van die risiko waaraan die portefeulje blootgestel is in die loop van 'n dag, week, maand of jaar. Uitsonderlike gebeure op finansiële markte, soos die ineenstorting van die aandelemark in Oktober 1987, is van besondere belang vir finansies en veral vir risikobestuur en finansiële regulering. 'n Metode wat genoem word Waarde op Risiko (WoR), kan gebruik word om markverliese te meet. WoR is 'n kragtige maatstaf vir risiko en word deur vele instellings gebruik vir die bestuur van mark-risiko. Waarde op Risiko is 'n raming van die grootste verlies wat 'n portefeulje moontlik kan ly gedurende enige tydperk, met uitsluiting van werklik uitsonderlike tydperke. Van nader beskou, is WoR die maksimum verlies wat 'n instelling kan verwag om gedurende 'n sekere tydperk binne 'n bepaalde periode te ly. Die waarde van die konsep lê in die algemene aard daarvan. WoR metings is van toepassing op portefeuljes in dié geheel en dit omvat baie kategorieë bates en veelvuldige bronne van risiko. Soos met die waarde van die konsep, hou die uitdaging om WoR te bereken ook verband met die algemene aard van die konsep. Ten einde die risiko te bepaal in 'n portefeulje waar WoR gebruik word, moet metodes gevind word waarvolgens 'n opbrengsverdeling vir die portefeulje vasgestel kan word. Daar bestaan 'n groot verskeidenheid literatuur oor die verskillende metodes om WoR te implementeer. Wanneer dit egter kom by die toepassing van die resultate, bly verskeie vrae onbeantwoord. Byvoorbeeld, hoe kan die risikobestuurder aan die hand van 'n gegewe WoR-maatstaf toets of die spesifieke maatstaf reg gespesifiseer is? Tweedens, hoe kan die risikobestuurder die beste maatstaf kies in die geval van twee verskillende WoR-maatstawwe? Ondanks die feit dat WoR algemeen gebruik word vir die meting van markrisiko, is daar nog nie konsensus bereik oor die beste metode om hierdie benadering tot risikometing te implementeer nie. Die feit dat daar nie konsensus bestaan nie, kan deels daaraan toegeskryf word dat elkeen van die metodes wat tans gebruik word, ernstige leemtes het. Die doel van hierdie projek is om die konsep WoR bekend te stel, om die teoretiese grondslag te lê vir die algemene benadering tot die berekening van WoR en om die Ekstreme Waarde-teorie bekend te stel en toe te pas op WoR-berekenings. Die algemene benadering tot die berekening van WoR word in drie kategorieë verdeel naamlik die Analitiese (Parametriese) benadering, die Historiese simulasiebenadering en die Monte Carlo-simulasiebenadering. Elkeen van die benaderings het sterk- en swakpunte wat van nader ondersoek sal word. Die Ekstreme Waarde-benadering tot WoR is 'n relatief nuwe benadering.
Aangesien die meeste opbrengste middelwaarde-gesentreer is, is tradisionele WoR-metodes geneig om uitsonderlike gebeure buite rekening te laat en te fokus op risiko-maatstawwe wat die hele empiriese verdeling van middelwaarde-gesentreerde opbrengste akkommodeer. Die gevaar bestaan dan dat hierdie modelle geneig is om te faal juis wanneer dit die meeste benodig word, byvoorbeeld in die geval van groot markverskuiwings waartydens organisasies baie groot verliese kan ly. Daar word beoog om met behulp van die Ekstreme Waarde-benadering aan die gebruiker die beste moontlike skatting van die stert-area van die verdeling te gee. Selfs in die afwesigheid van bruikbare historiese data verskaf die Ekstreme Waarde-teorie riglyne ten opsigte van die aard van die verdeling wat gekies moet word, sodat uiterste risiko's versigtig hanteer kan word. Ten einde hierdie metode te illustreer, word dit in hierdie studie toegepas op 'n termynkontrak ten opsigte van buitelandse wisselkoerse. Die geldigheid van die Ekstreme Waarde-teorie ten opsigte van WoR berekenings word getoets deur die data van die Rand/Dollar Eenjaartermynkontrak te bestudeer. 'n Volledig uitgewerkte voorbeeld word verskaf waarin die slaggate en beperkings asook die talle sterkpunte van die model uitgewys word. Hierdie resultate sal vergelyk word met 'n WoR-meting wat bereken is met die GARCH (1,1) model.
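A minimal sketch of an extreme-value VaR computed from the peaks-over-threshold tail estimator, using a synthetic heavy-tailed loss series in place of the Rand/Dollar futures data analysed in the thesis; the threshold and confidence level are illustrative assumptions, and a Gaussian VaR is shown only for comparison.

# EVT VaR via peaks-over-threshold: fit a GPD to losses above a high threshold
# and invert the tail estimator VaR_q = u + (sigma/xi) * (((n/n_u)*(1-q))**(-xi) - 1).
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
losses = stats.t.rvs(df=4, scale=0.01, size=2500, random_state=rng)  # heavy-tailed "daily losses"

u = np.quantile(losses, 0.95)          # threshold (95th empirical percentile)
excesses = losses[losses > u] - u
n, n_u = losses.size, excesses.size

xi, _, sigma = stats.genpareto.fit(excesses, floc=0)

q = 0.99                               # VaR confidence level
var_evt = u + (sigma / xi) * (((n / n_u) * (1.0 - q)) ** (-xi) - 1.0)

# For comparison: the usual Gaussian (parametric) VaR at the same level
var_normal = losses.mean() + losses.std() * stats.norm.ppf(q)
print(f"EVT VaR(99%) = {var_evt:.4f}, Gaussian VaR(99%) = {var_normal:.4f}")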
97

Stability of the Financial System: Systemic Dependencies between Bank and Insurance Sectors

Procházková, Jana January 2014 (has links)
The central issue of this thesis is investigating the eventuality of systemic breakdowns in the international financial system through examining systemic dependence between bank and insurance sectors. Standard models of systemic risk often use correlation of stock returns to evaluate the magnitude of interconnectedness between financial institutions. One of the main drawbacks of this approach is that it is oriented towards observations occurring along the central part of the distribution and it does not capture the dependence structure of outlying observations. To account for that, we use methodology which builds on the Extreme Value Theory and is solely focused on capturing dependence in extremes. The analysis is performed using the data on stock prices of the largest EU banks and insurance companies. We study dependencies in the pre-crisis and post-crisis periods. The objective is to discover which sector poses a higher systemic threat to the international financial stability. Also, we try to find empirical evidence about an increase in interconnections in recent post-crisis years. We find that in both examined periods systemic dependence in the banking sector is higher than in the insurance sector. Our results also indicate that extremal interconnections in the respective sectors increased,...
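As a simple illustration of measuring dependence in extremes rather than ordinary correlation, the sketch below computes an empirical tail dependence coefficient chi(q) = P(V > q | U > q) for two synthetic, correlated heavy-tailed return series; this is a simplified stand-in for the Extreme Value Theory methodology used in the thesis, which works with actual stock prices of large EU banks and insurers.

# Empirical tail dependence between two series transformed to uniform ranks.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# synthetic, correlated heavy-tailed daily returns for a "bank" and an "insurer"
z = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], size=3000)
bank = stats.t.ppf(stats.norm.cdf(z[:, 0]), df=4)
insurer = stats.t.ppf(stats.norm.cdf(z[:, 1]), df=4)

# probability integral transform via ranks -> approximately uniform margins
u_bank = stats.rankdata(bank) / (len(bank) + 1)
u_ins = stats.rankdata(insurer) / (len(insurer) + 1)

q = 0.95  # threshold defining "extreme" observations
chi_hat = np.mean((u_bank > q) & (u_ins > q)) / (1.0 - q)
print(f"empirical chi({q}) = {chi_hat:.2f} (independence would give about {1 - q:.2f})")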
98

[en] EXTREME VALUE THEORY: VALUE AT RISK FOR VARIABLE-INCOME ASSETS / [pt] TEORIA DOS VALORES EXTREMOS: VALOR EM RISCO PARA ATIVOS DE RENDA VARIÁVEL

GUSTAVO LOURENÇO GOMES PIRES 26 June 2008 (has links)
[pt] A partir da década de 90, a metodologia de Valor em Risco (VaR) se difundiu pelo mundo, tanto em instituições financeiras quanto em não financeiras, como uma boa prática de mensuração de riscos. Um dos fatos estilizados mais pronunciados acerca das distribuições de retornos financeiros diz respeito à presença de caudas pesadas. Isso torna os modelos paramétricos tradicionais de cálculo de Valor em Risco (VaR) inadequados para a estimação de VaR de baixas probabilidades, dado que estes se baseiam na hipótese de normalidade para as distribuições dos retornos. Sendo assim, o objetivo do presente trabalho é investigar o desempenho de modelos baseados na Teoria dos Valores Extremos para o cálculo do VaR. Os resultados indicam que os modelos baseados na Teoria dos Valores Extremos são adequados para a modelagem das caudas, e consequentemente para a estimação de Valor em Risco quando os níveis de probabilidade de interesse são baixos. / [en] Since the 1990s, the use of Value at Risk (VaR) methodology has been disseminated among both financial and non-financial institutions around the world, as a good practice in terms of risk management. The existence of fat tails is one of the striking stylized facts of financial returns distributions. This fact makes the use of traditional parametric models for Value at Risk (VaR) estimation unsuitable for the estimation of low probability events. This is because traditional models are based on the conditional normality assumption for financial returns distributions. The main purpose of this dissertation is to investigate the performance of VaR models based on Extreme Value Theory. The results indicate that Extreme Value Theory based models are suitable for low probability VaR estimation.
99

Local Likelihood Approach for High-Dimensional Peaks-Over-Threshold Inference

Baki, Zhuldyzay 14 May 2018 (has links)
Global warming is affecting the Earth climate year by year, the biggest difference being observable in increasing temperatures in the World Ocean. Following the long-term global ocean warming trend, average sea surface temperatures across the global tropics and subtropics have increased by 0.4–1°C in the last 40 years. These rates become even higher in semi-enclosed southern seas, such as the Red Sea, threatening the survival of thermal-sensitive species. As average sea surface temperatures are projected to continue to rise, careful study of future developments of extreme temperatures is paramount for the sustainability of marine ecosystem and biodiversity. In this thesis, we use Extreme-Value Theory to study sea surface temperature extremes from a gridded dataset comprising 16703 locations over the Red Sea. The data were provided by Operational SST and Sea Ice Analysis (OSTIA), a satellite-based data system designed for numerical weather prediction. After pre-processing the data to account for seasonality and global trends, we analyze the marginal distribution of extremes, defined as observations exceeding a high spatially varying threshold, using the Generalized Pareto distribution. This model allows us to extrapolate beyond the observed data to compute the 100-year return levels over the entire Red Sea, confirming the increasing trend of extreme temperatures. To understand the dynamics governing the dependence of extreme temperatures in the Red Sea, we propose a flexible local approach based on R-Pareto processes, which extend the univariate Generalized Pareto distribution to the spatial setting. Assuming that the sea surface temperature varies smoothly over space, we perform inference based on the gradient score method over small regional neighborhoods, in which the data are assumed to be stationary in space. This approach allows us to capture spatial non-stationarity, and to reduce the overall computational cost by taking advantage of distributed computing resources. Our results reveal an interesting extremal spatial dependence structure: in particular, from our estimated model, we conclude that significant extremal dependence prevails for distances up to about 2500 km, which roughly corresponds to the Red Sea length.
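A minimal single-site sketch of the marginal peaks-over-threshold analysis described above: a Generalized Pareto distribution is fitted to exceedances of a high threshold and the 100-year return level is computed from the fitted tail and the yearly exceedance rate. The data here are synthetic; the thesis uses the OSTIA gridded Red Sea temperatures, spatially varying thresholds, and an R-Pareto model for the spatial dependence.

# Peaks-over-threshold return level at one location:
# z_T = u + (sigma/xi) * ((lambda_u * T)**xi - 1), with lambda_u exceedances per year.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
years = 30
anomalies = rng.gumbel(loc=0.0, scale=0.4, size=365 * years)  # deseasonalized daily series (synthetic)

u = np.quantile(anomalies, 0.98)              # high threshold at one grid point
excesses = anomalies[anomalies > u] - u
xi, _, sigma = stats.genpareto.fit(excesses, floc=0)

lam = excesses.size / years                   # mean number of exceedances per year
T = 100                                       # return period in years
z_T = u + (sigma / xi) * ((lam * T) ** xi - 1.0)
print(f"100-year return level of the anomaly: {z_T:.2f}")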
100

Application of Scientific Computing and Statistical Analysis to address Coastal Hazards / Application du Calcul Scientifique et de l'Analyse Statistique à la Gestion du Risque en Milieu Littoral

Chailan, Romain 23 November 2015 (has links)
L'étude et la gestion des risques littoraux sont plébiscitées par notre société au vu des enjeux économiques et écologiques qui y sont impliqués. Ces risques sont généralement réponse à des conditions environnementales extrêmes. L'étude de ces phénomènes physiques repose sur la compréhension de ces conditions rarement (voire nullement) observées. Dans un milieu littoral, la principale source d'énergie physique est véhiculée par les vagues. Cette énergie est responsable des risques littoraux comme l'érosion et la submersion qui évoluent à des échelles de temps différentes (événementielle ou long-terme). Le travail réalisé, situé à l'interface de l'analyse statistique, de la géophysique et de l'informatique, vise à apporter des méthodologies et outils aux décideurs en charge de la gestion de tels risques. En pratique, nous nous intéressons à mettre en place des méthodes qui prennent en compte non seulement un site ponctuel mais traitent les problématiques de façon spatiale. Ce besoin provient de la nature même des phénomènes environnementaux qui sont spatiaux, tels les champs de vagues. L'étude des réalisations extrêmes de ces processus repose sur la disponibilité d'un jeu de données représentatif à la fois dans l'espace et dans le temps, permettant de projeter l'information au-delà de ce qui a déjà été observé. Dans le cas particulier des champs de vagues, nous avons recours à la simulation numérique sur calculateur haute performance (HPC) pour réaliser un tel jeu de données. Le résultat de ce premier travail offre de nombreuses possibilités d'applications. En particulier, nous proposons à partir de ce jeu de données deux méthodologies statistiques qui ont pour but respectif de répondre aux problématiques de risques littoraux long-termes (érosion) et à celles relatives aux risques événementiels (submersion). La première s'appuie sur l'application de modèles stochastiques dit max-stables, particulièrement adapté à l'étude des événements extrêmes. En plus de l'information marginale, ces modèles permettent de prendre en compte la structure de dépendance spatiale des valeurs extrêmes. Nos résultats montrent l'intérêt de cette méthode au devant de la négligence de la dépendance spatiale de ces phénomènes pour le calcul d'indices de risque. La seconde approche est une méthode semi-paramétrique dont le but est de simuler des champs spatio-temporels d'états-de-mer extrêmes. Ces champs, interprétés comme des tempêtes, sont des amplifications contrôlées et bi-variés d'épisodes extrêmes déjà observés. Ils forment donc des tempêtes encore plus extrêmes. Les tempêtes simulées à une intensité contrôlée alimentent des modèles physiques événementiels à la côte, permettant d'aider les décideurs à l'anticipation de ces risques encore non observés. Enfin et depuis la construction de ces scenarii extrêmes, nous abordons la notion de pré-calcul dans le but d'apporter en quasi-temps réel au décideur et en tant de crise une prévision sur le risque littoral. L'ensemble de ce travail s'inscrit dans le cadre d'un besoin industriel d'aide à la modélisation physique : chainage de modèles numériques et statistiques.
La dimension industrielle de cette thèse est largement consacrée à la conception et au développement d'un prototype de plateforme de modélisation permettant l'utilisation systématique d'un calculateur HPC pour les simulations et le chainage de modèles de façon générique. Autour de problématiques liées à la gestion du risque littoral, cette thèse démontre l'apport d'un travail de recherche à l'interface de plusieurs disciplines. Elle y répond en conciliant et proposant des méthodes de pointe prenant racine dans chacune de ces disciplines. / Studies and management of coastal hazards are of high concern in our society, since they involve highly valuable economic and ecological stakes. Coastal hazards are generally a response to extreme environmental conditions. The study of these physical phenomena relies on the understanding of such environmental conditions, which are rarely (or even never) observed. In coastal areas, waves are the main source of energy. This energy is responsible for coastal hazards that develop at different time scales, like submersion or erosion. The presented work, located at the interface between Statistical Analysis, Geophysics and Computer Science, aims at bringing forward tools and methods serving decision makers in charge of the management of such risks. In practice, the proposed solutions address these questions by considering the spatial dimension rather than only pointwise aspects. This approach is more natural considering that environmental phenomena are generally spatial, such as sea-wave fields. The study of extreme realisations of such processes is based on the availability of a representative data set, in both time and space dimensions, allowing information to be extrapolated beyond the actual observations. In particular for sea-wave fields, we use numerical simulation on high performance computational clusters (HPC) to produce such a data set. The outcome of this work offers many application possibilities. Most notably, we propose from this data set two statistical methodologies, aimed respectively at long-term littoral hazards (e.g., erosion) and at event-scale hazards (e.g., submersion). The first one is based on the application of so-called max-stable stochastic models, particularly well suited to the study of extreme values in a spatial context. Indeed, in addition to the marginal information, max-stable models take into account the spatial dependence structure of the observed extreme processes. Our results show the benefit of this method over approaches that neglect the spatial dependence of these phenomena when computing risk indices. The second approach is a semi-parametric method aimed at simulating space-time processes of extreme sea states. Those processes, interpreted as storms, are controlled, bi-variate amplifications of already observed extreme episodes. In other words, we create storms more severe than those already observed. These processes, simulated at a controlled intensity, may feed littoral physical models in order to describe a very extreme event in both space and time.
They help decision-makers anticipate hazards not yet observed. Finally, building on the construction of these extreme scenarios, we introduce a pre-computing paradigm with the goal of providing decision-makers with real-time, accurate information in case of a sudden coastal crisis, without performing any physical simulation. This work fits into a growing industrial demand for modelling support, most notably a need related to the chaining of numerical and statistical models. Consequently, the industrial dimension of this PhD is mostly dedicated to the design and development of a prototype modelling platform. This platform aims at systematically using HPC resources to run simulations and at easing the chaining of models. Addressing questions related to the management of coastal hazards, this thesis demonstrates the benefits of research work placed at the interface between several domains. It answers such questions by providing end-users with cutting-edge methods stemming from each of those domains.
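As a small illustration of what taking the spatial dependence of extremes into account can mean in practice, the sketch below estimates a pairwise extremal coefficient theta (theta = 1 for complete dependence, theta = 2 for independence) with the F-madogram, a summary commonly used alongside max-stable models; the two series of block maxima are synthetic and the estimator is not necessarily the one used in this thesis.

# F-madogram estimate of the pairwise extremal coefficient between two sites.
import numpy as np
from scipy import stats

rng = np.random.default_rng(13)
n = 200
common = stats.genextreme.rvs(c=-0.1, size=n, random_state=rng)
site_a = np.maximum(common, stats.genextreme.rvs(c=-0.1, size=n, random_state=rng))
site_b = np.maximum(common, stats.genextreme.rvs(c=-0.1, size=n, random_state=rng))

# empirical probability integral transform at each site
f_a = stats.rankdata(site_a) / (n + 1)
f_b = stats.rankdata(site_b) / (n + 1)

nu_F = 0.5 * np.mean(np.abs(f_a - f_b))           # F-madogram
theta = (1.0 + 2.0 * nu_F) / (1.0 - 2.0 * nu_F)   # pairwise extremal coefficient
print(f"estimated extremal coefficient: {theta:.2f} (between 1 and 2)")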
