41

Analyse régionale des aléas maritimes extrêmes / Regional frequency analysis of extreme marine hazards

Weiss, Jérôme 07 November 2014 (has links)
Connaître la probabilité d'occurrence des aléas océano-météorologiques extrêmes est fondamental pour prévenir les risques de submersion marine en zone côtière ou concevoir des aménagements côtiers, portuaires ou des plates-formes offshore. Notamment, le concept de niveau de retour est fréquemment utilisé en ingénierie côtière pour dimensionner des ouvrages de protection. Ces niveaux, dont les périodes de retour d'intérêt se situent généralement entre 100 et 1000 ans, sont habituellement estimés par une analyse statistique locale, à partir de données observées en un site unique. Cependant, la période d'observation est généralement limitée, de sorte que les incertitudes associées aux niveaux de retour élevés sont importantes. L'analyse régionale représente une solution possible pour réduire les incertitudes inhérentes aux analyses locales. Le principe est d'exploiter l'information de sites d'observation provenant d'une région homogène, où les extrêmes sont supposés avoir un comportement probabiliste similaire. L'analyse régionale peut ainsi estimer les niveaux de retour de manière plus fiable qu'une analyse locale. Cependant, son application dans le domaine maritime étant relativement limitée et récente, différentes questions méthodologiques demeurent non résolues, comme la formation des régions homogènes ou le traitement de la dépendance entre sites. L'objectif scientifique de la thèse est donc d'approfondir certains points méthodologiques de l'analyse régionale, dans le cadre des aléas maritimes extrêmes. 
Les points suivants sont abordés en particulier :
• Échantillonnage des extrêmes pour l'analyse régionale, à partir des tempêtes détectées via une procédure de declustering spatio-temporel.
• Formation de régions homogènes à partir d'une méthode basée sur l'identification des empreintes typiques des tempêtes.
• Prise en compte de la dépendance entre sites d'observation, à travers la construction d'un modèle permettant par exemple d'évaluer la durée effective régionale d'observation ou la période de retour régionale d'une tempête.
• Spécification et estimation de la loi régionale, avec incorporation des covariables influentes, comme la saison d'occurrence ou la direction de provenance pour les vagues.
• Comparaison entre analyses locale et régionale, notamment à travers les incertitudes sur les estimations des extrêmes et la capacité à modéliser les horsains présumés.
Ces aspects sont illustrés sur des données de hauteurs significatives de vagues et de surcotes de pleine mer, dans la zone Atlantique Nord-Est, Manche et Mer du Nord. Parallèlement, l'objectif applicatif de ces travaux est de contribuer à garantir la sûreté des ouvrages EDF contre le risque de submersion marine. Ceci peut être réalisé grâce à l'exploration de nouvelles techniques d'estimation des aléas maritimes extrêmes telles que l'analyse régionale, qui permet notamment une meilleure prise en compte des horsains. / The knowledge of the probability of occurrence of oceano-meteorological extremes is essential to prevent risks of coastal flooding or to build coastal protections or offshore structures. In particular, the concept of return level is frequently used in coastal engineering to design protection structures. These levels, whose return periods of interest generally lie between 100 and 1000 years, are usually estimated by a local statistical analysis, from data observed at a single site. 
However, the period of observation is generally limited, which can imply high uncertainties for high return levels. Regional frequency analysis is a possible solution to reduce the uncertainties inherent to local analyses. The principle is to exploit the information from sites of observation belonging to a homogeneous region, where extremes are supposed to share a similar probabilistic behavior. Thus, regional frequency analysis can estimate return levels more accurately than a local analysis. However, as its application to the marine field is relatively limited and recent, several methodological questions remain unresolved, such as the formation of homogeneous regions or the treatment of dependence between sites. The scientific objective of this thesis is thus to develop some methodological points of regional frequency analysis in the framework of extreme marine hazards. The following questions are tackled:
• Sampling of extremes for regional analysis, from the storms detected through a spatiotemporal declustering procedure.
• Formation of homogeneous regions from a method based on the identification of typical storm footprints.
• Consideration of the dependence between sites of observation, through the building of a model allowing one, for example, to assess the regional effective duration of observation or the regional return period of a storm.
• Specification and estimation of the regional distribution, with the incorporation of influential covariates, such as the season of occurrence or the direction of origin for waves.
• Comparison between regional and local analyses, especially through the uncertainties on the estimated extremes and the ability to model potential outliers.
These aspects are illustrated on significant wave height data and skew surge data located in the Northeast Atlantic, the Eastern Channel and the North Sea. At the same time, the industrial objective of this work is to contribute to guaranteeing the safety of EDF structures against the risk of coastal flooding. 
This can be achieved through the exploration of new techniques for estimating extreme marine hazards, such as regional frequency analysis, which allows in particular a better representation of outliers.
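The regional peaks-over-threshold idea described in this abstract can be sketched in a few lines: pool threshold excesses from several sites assumed to form a homogeneous region, fit a single generalized Pareto distribution (GPD), and convert the fit into a return level. This is a generic illustration, not the thesis's actual model; the site data, thresholds, durations, and the independence assumption behind the effective duration are all synthetic simplifications.

```python
# Minimal sketch of regional frequency analysis with pooled GPD excesses.
# All data and constants are synthetic illustrations.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)

# Synthetic skew-surge excesses (metres above each site's threshold)
# at three sites of a hypothetical homogeneous region.
site_excesses = [genpareto.rvs(c=0.1, scale=0.15, size=n, random_state=rng)
                 for n in (60, 45, 80)]
years_observed = 30.0          # assumed duration of observation per site
n_sites = len(site_excesses)

# Pool the excesses; in a full regional analysis each site would first
# be rescaled by a local index before pooling.
pooled = np.concatenate(site_excesses)

# Fit the regional GPD (location fixed at 0 for excesses).
xi, _, sigma = genpareto.fit(pooled, floc=0)

# Effective regional duration if sites were independent (an upper bound;
# inter-site dependence, treated in the thesis, reduces this).
effective_years = n_sites * years_observed
rate = len(pooled) / effective_years     # exceedances per year

def return_level(T, threshold=1.0):
    """T-year return level: threshold + GPD quantile at rate*T events."""
    return threshold + sigma / xi * ((rate * T) ** xi - 1.0)

print(f"xi={xi:.3f}, sigma={sigma:.3f}")
print(f"100-year level: {return_level(100):.2f} m above datum")
```

The 100-year level grows with the return period, and the pooled sample is several times larger than any single site's record, which is the source of the variance reduction the abstract refers to.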
42

A study on the theoretical predictability of extreme value distributions for natural catastrophic events / Studie teoretické predikovatelnosti extremálních rozdělení pro přírodní katastrofy

Sabolová, Radka January 2013 (has links)
The thesis deals with natural disasters from the statistical point of view and treats them as extremal observations. The basics of classical extreme value theory are summarized, and a new approach based on the maximum entropy principle is proposed. Both methods are used to analyze real discharge data observed on the Vltava River.
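The classical block-maxima step mentioned in this abstract can be sketched as follows: annual maxima of a discharge series are fitted with a GEV distribution, from which a design quantile is read off. The discharge series below is simulated purely for illustration (real Vltava data would replace it), and the maximum entropy alternative the thesis proposes is not reproduced here.

```python
# Sketch of classical block-maxima analysis for river discharge.
# Simulated data; scipy's genextreme uses c = -xi for the GEV shape.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(11)
daily = rng.gamma(shape=2.0, scale=50.0, size=(50, 365))  # 50 "years" of flow
annual_max = daily.max(axis=1)                            # block maxima

c, loc, scale = genextreme.fit(annual_max)
# 100-year discharge = 0.99 quantile of the annual-maximum distribution.
q100 = genextreme.ppf(0.99, c, loc=loc, scale=scale)
print(f"GEV shape={c:.3f}, 100-year level={q100:.1f}")
```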
43

New statistical models for extreme values

Eljabri, Sumaya Saleh M. January 2013 (has links)
Extreme value theory (EVT) has wide applicability in several areas such as hydrology, engineering, science and finance. Across the world, we can see the disruptive effects of flooding due to heavy rains or storms. Many countries are suffering from natural disasters like heavy rains, storms and floods, and also from higher temperatures leading to desertification. One of the best-known extraordinary natural disasters is the 1931 Huang He flood, a series of floods along the Huang He river between July and November 1931 that led to around 4 million deaths in China. Several publications focus on how to find the best model for these events and how to predict their behaviour. The normal, log-normal, Gumbel, Weibull, Pearson-type, 4-parameter Kappa, Wakeby and GEV distributions have all been presented as statistical models for extreme events. However, the GEV and GP distributions seem to be the most widely used models for extreme events; in spite of that, these models have been misused as models for extreme values in many areas. The aim of this dissertation is to create new modifications of univariate extreme value models. The modifications developed in this dissertation are divided into two parts. In the first part, we make generalisations of the GEV and GP distributions, referred to as the Kumaraswamy GEV and Kumaraswamy GP distributions. The major benefit of these models is their ability to fit skewed data better than other models. The other idea in this study comes from a model by Chen, presented in the Proceedings of the International Conference on Computational Intelligence and Software Engineering, pp. 1-4; however, the cumulative distribution and probability density functions for this distribution do not appear to be valid. The correction of this model is presented in chapter 6. The major problem in extreme event models is the ability of the model to fit the tails of the data. 
In chapter 7, the idea of the corrected Chen model is combined with the GEV distribution to introduce a new model for extreme values, referred to as the new extreme value (NEV) distribution. It appears to be more flexible than the GEV distribution.
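One standard form of the Kumaraswamy-G construction takes a baseline CDF G and defines F(x) = 1 - (1 - G(x)^a)^b with shape parameters a, b > 0, recovering G when a = b = 1. Whether this matches the exact parameterization used in the dissertation is an assumption; the sketch below simply applies this construction to the GEV baseline, with illustrative parameter values.

```python
# Sketch of a Kumaraswamy-GEV CDF: F(x) = 1 - (1 - G(x)^a)^b, where G is
# the GEV CDF. Note scipy's genextreme uses c = -xi for the GEV shape.
import numpy as np
from scipy.stats import genextreme

def kum_gev_cdf(x, a, b, shape, loc=0.0, scale=1.0):
    """CDF of a Kumaraswamy-GEV distribution (Kw-G with GEV baseline)."""
    g = genextreme.cdf(x, c=shape, loc=loc, scale=scale)
    return 1.0 - (1.0 - g ** a) ** b

x = np.linspace(-2, 6, 5)
# With a = b = 1 the construction collapses to the plain GEV.
assert np.allclose(kum_gev_cdf(x, 1, 1, shape=-0.1),
                   genextreme.cdf(x, c=-0.1))
# Extra parameters a, b reshape the skewness/tails of the baseline.
print(kum_gev_cdf(2.0, a=2.0, b=0.5, shape=-0.1))
```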
44

Accelerated testing with application in finance

Oppel, Anel January 2016 (has links)
The event of a default for low-default portfolios, such as sovereign debt or banks, has received much attention as a result of the increasing instabilities in financial markets. The lack of sufficient default information on low-default portfolios complicates the protection of such portfolios. Default protection has typically, in the past, relied on extreme value theory and reporting the value at risk. The focus here is the application of an engineering concept, accelerated test techniques, to the problem of insufficient data on low-default portfolios. In the application, high-default portfolios serve as stressed cases of low-default portfolios. Since high-default portfolios have more data available, viewing one as a stressed case of a low-default portfolio enables us to extrapolate the data to the low-default portfolio environment and to perform estimation tasks such as estimating the default probability for a low-default portfolio. A flexible framework through which the above is achieved is provided. / Dissertation (MSc)--University of Pretoria, 2016. / Statistics / MSc / Unrestricted
45

Mnohorozměrné modely extrémních hodnot a jejich aplikace v hydrologii / Multivariate extreme value models and their application in hydrology

Drápal, Lukáš January 2014 (has links)
The present thesis deals with multivariate extreme value theory. First, concepts of modelling block maxima and threshold excesses in the univariate case are reviewed. In the multivariate setting, the point process approach is chosen to model dependence. The dependence structure of multivariate extremes is described by a spectral measure or an exponent function. Models for asymptotically dependent variables are provided. A construction principle from Ballani and Schlather (2011) is discussed. Based on this discussion, the pairwise beta model introduced by Cooley et al. (2010) is modified to provide higher flexibility. The models are applied to data from nine hydrological stations in northern Moravia previously analysed by Jarušková (2009). Use of the new pairwise beta model is justified, as it brought a substantial improvement in log-likelihood. The models are also compared using the Bayesian model selection introduced by Sabourin et al. (2013).
46

Statistics of Multivariate Extremes with Applications in Risk Management

Herrera, Rodrigo 06 July 2009 (has links)
The contributions of this thesis serve a dual purpose: introducing several multivariate statistical methodologies in which, in the majority of cases, only stationarity of the random variables is assumed, and highlighting some applied problems in risk management where extreme value theory may play a role. Each chapter is largely self-contained, with its own more detailed introduction and short conclusion. / Die Beiträge dieser Dissertation haben ein doppeltes Ziel: die Darstellung von vielen multivariaten statistischen Verfahren, wobei in der Mehrheit der Fälle nur Stationarität von den Zufallsvariablen angenommen wurde, und die Anwendungen im Risikomanagement, in welchem Extremwerttheorie eine wichtige Rolle spielen könnte. Die Struktur der Arbeit ist eigenständig, mit einer detaillierten Einführung und kurzen Zusammenfassung in jedem Kapitel.
47

Estimating expected shortfall using an unconditional peaks-over-threshold method under an extreme value approach

Wahlström, Rikard January 2021 (has links)
Value-at-Risk (VaR) has long been the standard risk measure in financial risk management. However, VaR suffers from critical shortcomings as a risk measure when it comes to quantifying the most severe risks, which was made especially apparent during the financial crisis of 2007–2008. An alternative risk measure addressing the shortcomings of VaR, known as expected shortfall (ES), is gaining popularity and is set to replace VaR as the standard measure of financial risk. This thesis introduces how extreme value theory can be applied in estimating ES using an unconditional peaks-over-threshold method. This includes giving an introduction to the theoretical foundations of the method. An application of this method is also performed on five different assets. These assets are chosen to serve as a proxy for the broader asset classes of equity, fixed income, currencies, commodities and cryptocurrencies. In terms of ES, we find that cryptocurrencies are the riskiest asset class and fixed income the safest.
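The unconditional peaks-over-threshold estimator described here can be sketched in a few lines: fit a GPD to losses exceeding a high threshold and plug the fit into the standard EVT formulas for VaR and ES. This is a generic illustration of the technique, not the thesis's implementation; the loss series is simulated, and the threshold choice is an arbitrary assumption.

```python
# Sketch of unconditional POT estimation of VaR and expected shortfall.
# Simulated heavy-tailed losses stand in for real return data.
import numpy as np
from scipy.stats import genpareto, t as student_t

rng = np.random.default_rng(7)
losses = student_t.rvs(df=4, size=5000, random_state=rng)  # synthetic P&L

u = np.quantile(losses, 0.95)            # threshold at the 95th percentile
excesses = losses[losses > u] - u
xi, _, beta = genpareto.fit(excesses, floc=0)
n, n_u = len(losses), len(excesses)

def var_es(q):
    """POT estimators of VaR_q and ES_q (ES formula requires xi < 1)."""
    var_q = u + beta / xi * (((1 - q) * n / n_u) ** (-xi) - 1.0)
    es_q = var_q / (1 - xi) + (beta - xi * u) / (1 - xi)
    return var_q, es_q

var99, es99 = var_es(0.99)
print(f"xi={xi:.3f}  VaR_99={var99:.3f}  ES_99={es99:.3f}")
```

ES always exceeds VaR at the same level, which reflects the abstract's point that ES captures the severity of losses beyond the VaR quantile rather than just its location.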
48

Apprentissage de structures dans les valeurs extrêmes en grande dimension / Discovering patterns in high-dimensional extremes

Chiapino, Maël 28 June 2018 (has links)
Nous présentons et étudions des méthodes d'apprentissage non-supervisé de phénomènes extrêmes multivariés en grande dimension. Dans le cas où chacune des distributions marginales d'un vecteur aléatoire est à queue lourde, l'étude de son comportement dans les régions extrêmes (i.e. loin de l'origine) ne peut plus se faire via les méthodes usuelles, qui supposent une moyenne et une variance finies. La théorie des valeurs extrêmes offre alors un cadre adapté à cette étude, en donnant notamment une base théorique à la réduction de dimension à travers la mesure angulaire. La thèse s'articule autour de deux grandes étapes :
- Réduire la dimension du problème en trouvant un résumé de la structure de dépendance dans les régions extrêmes. Cette étape vise en particulier à trouver les sous-groupes de composantes susceptibles de dépasser un seuil élevé de façon simultanée.
- Modéliser la mesure angulaire par une densité de mélange qui suit une structure de dépendance déterminée à l'avance.
Ces deux étapes permettent notamment de développer des méthodes de classification non-supervisée à travers la construction d'une matrice de similarité pour les points extrêmes. / We present and study unsupervised learning methods for multivariate extreme phenomena in high dimension. Considering a random vector whose marginals are all heavy-tailed, the study of its behavior in extreme regions is no longer possible via the usual methods, which involve finite means and variances. Multivariate extreme value theory provides a framework adapted to this study; in particular, it gives a theoretical basis for dimension reduction through the angular measure. The thesis is divided into two main parts:
- Reduce the dimension by finding a simplified dependence structure in extreme regions. This step aims at recovering the subgroups of features that are likely to exceed large thresholds simultaneously.
- Model the angular measure with a mixture distribution that follows a predefined dependence structure.
These steps make it possible to develop new clustering methods for extreme points in high dimension.
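The dimension-reduction step above can be sketched empirically: transform each margin to a standard heavy-tailed scale via ranks, keep only observations with a large radius, and project them onto the simplex to obtain angular components. Coordinates that carry angular mass together indicate features likely to be simultaneously extreme. The data below are synthetic, with a dependence structure planted by construction; this is a generic illustration, not the thesis's algorithms.

```python
# Sketch of the empirical angular measure for detecting groups of
# features that are extreme together. Synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n, d = 10_000, 3

# Coordinates 0 and 1 share a common heavy-tailed factor; coordinate 2
# is independent -> extremal dependence is expected within {0, 1}.
common = rng.pareto(2.0, size=n)
X = np.column_stack([common + rng.pareto(2.0, n),
                     common + rng.pareto(2.0, n),
                     rng.pareto(2.0, n)])

# Rank-transform each margin to (approximately) the unit-Pareto scale.
ranks = X.argsort(axis=0).argsort(axis=0) + 1
V = 1.0 / (1.0 - ranks / (n + 1.0))

# Keep extreme points (large L1 radius) and compute their angles.
radius = V.sum(axis=1)
threshold = np.quantile(radius, 0.99)
angles = V[radius > threshold] / radius[radius > threshold, None]

# Empirical angular mass per coordinate: {0, 1} should dominate.
print(angles.mean(axis=0))
```

Each retained angle lies on the simplex (its components sum to one), and the mean angular mass concentrates on the jointly extreme subgroup, which is what the subgroup-recovery step exploits.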
49

Optimization under Uncertainty with Applications in Data-driven Stochastic Simulation and Rare-event Estimation

Zhang, Xinyu January 2022 (has links)
For many real-world problems, optimization could only be formulated with partial information or subject to uncertainty due to reasons such as data measurement error, model misspecification, or that the formulation depends on the non-stationary future. It thus often requires one to make decisions without knowing the problem's full picture. This dissertation considers the robust optimization framework—a worst-case perspective—to characterize uncertainty as feasible regions and optimize over the worst possible scenarios. Two applications in this worst-case perspective are discussed: stochastic estimation and rare-event simulation. Chapters 2 and 3 discuss a min-max framework to enhance existing estimators for simulation problems that involve a bias-variance tradeoff. Biased stochastic estimators, such as finite-differences for noisy gradient estimation, often contain parameters that need to be properly chosen to balance impacts from the bias and the variance. While the optimal order of these parameters in terms of the simulation budget can be readily established, the precise best values depend on model characteristics that are typically unknown in advance. We introduce a framework to construct new classes of estimators, based on judicious combinations of simulation runs on sequences of tuning parameter values, such that the estimators consistently outperform a given tuning parameter choice in the conventional approach, regardless of the unknown model characteristics. We argue the outperformance via what we call the asymptotic minimax risk ratio, obtained by minimizing the worst-case asymptotic ratio between the mean square errors of our estimators and the conventional one, where the worst case is over any possible values of the model unknowns. In particular, when the minimax ratio is less than 1, the calibrated estimator is guaranteed to perform better asymptotically. 
We identify this minimax ratio for general classes of weighted estimators and the regimes where this ratio is less than 1. Moreover, we show that the best weighting scheme is characterized by a sum of two components with distinct decay rates. We explain how this arises from bias-variance balancing that combats the adversarial selection of the model constants, which can be analyzed via a tractable reformulation of a non-convex optimization problem. Chapters 4 and 5 discuss extreme event estimation using a distributionally robust optimization framework. Conventional methods for extreme event estimation rely on well-chosen parametric models asymptotically justified by extreme value theory (EVT). These methods, while powerful and theoretically grounded, can encounter difficult bias-variance tradeoffs that are exacerbated especially when the data size is too small, deteriorating the reliability of the tail estimation. These chapters study a framework based on the recently surging literature on distributionally robust optimization. This approach can be viewed as a nonparametric alternative to conventional EVT, imposing a general shape belief on the tail instead of a parametric assumption and using worst-case optimization as a resolution to handle the nonparametric uncertainty. We explain how this approach bypasses the bias-variance tradeoff in EVT. On the other hand, we face a conservativeness-variance tradeoff, which we describe how to tackle. We also demonstrate computational tools for the involved optimization problems and compare our performance with conventional EVT across a range of numerical examples.
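The bias-variance tradeoff in tuning-parameter choice that Chapters 2 and 3 address can be illustrated with the canonical example the abstract names, finite-difference gradient estimation from noisy evaluations: a small step size inflates the variance (noise divided by the step), a large one inflates the O(h²) truncation bias, and the best step balances the two. The function, noise level, and step sizes below are illustrative, and the minimax-weighting scheme of the dissertation itself is not reproduced.

```python
# Illustration of the bias-variance tradeoff in noisy central-difference
# gradient estimation. All constants are illustrative.
import numpy as np

rng = np.random.default_rng(1)
f, df = np.sin, np.cos          # true function and its derivative
x, sigma, reps = 1.0, 1e-3, 2000

def fd_mse(h):
    """Monte Carlo MSE of the noisy central-difference estimator of f'(x)."""
    noise = rng.normal(0.0, sigma, size=(reps, 2))
    est = (f(x + h) + noise[:, 0] - f(x - h) - noise[:, 1]) / (2 * h)
    return np.mean((est - df(x)) ** 2)

for h in (1e-4, 1e-2, 1e-1, 1.0):
    print(f"h={h:g}  MSE={fd_mse(h):.3e}")
```

The MSE is large at both extremes (noise-dominated for tiny h, bias-dominated for large h) and smallest in between; the dissertation's point is that the precise optimum depends on unknown model constants, motivating the worst-case weighting framework.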
50

Modeling and Inference for Multivariate Time Series, with Applications to Integer-Valued Processes and Nonstationary Extreme Data

Guerrero, Matheus B. 04 1900 (has links)
This dissertation proposes new statistical methods for modeling and inference for two specific types of time series: integer-valued data and multivariate nonstationary extreme data. We rely on the class of integer-valued autoregressive (INAR) processes for the former, proposing a novel, flexible and elegant way of modeling count phenomena. As for the latter, we are interested in the human brain and its multi-channel electroencephalogram (EEG) recordings, a natural source of extreme events. Thus, we develop new extreme value theory methods for analyzing such data, whether in modeling the conditional extremal dependence for brain connectivity or clustering extreme brain communities of EEG channels. Regarding integer-valued time series, INAR processes are generally defined by specifying the thinning operator and either the innovations or the marginal distributions. The major limitations of such processes include difficulties deriving the marginal properties and justifying the choice of the thinning operator. To overcome these drawbacks, this dissertation proposes a novel approach for building an INAR model that offers the flexibility to prespecify both marginal and innovation distributions. Thus, the thinning operator is no longer subjectively selected but is rather a direct consequence of the marginal and innovation distributions specified by the modeler. Novel INAR processes are introduced following this perspective; these processes include a model with geometric marginal and innovation distributions (Geo-INAR) and models with bounded innovations. We explore the Geo-INAR model, which is a natural alternative to the classical Poisson INAR model. The Geo-INAR process has interesting stochastic properties, such as MA($\infty$) representation, time reversibility, and closed forms for the $h$-th-order transition probabilities, which enables a natural framework to perform coherent forecasting. 
On the front of multivariate nonstationary extreme data, the focus lies on multi-channel epilepsy data. Epilepsy is a chronic neurological disorder affecting more than 50 million people globally. An epileptic seizure acts like a temporary shock to the neuronal system, disrupting normal electrical activity in the brain. Epilepsy is frequently diagnosed with EEGs. Current statistical approaches for analyzing EEGs use spectral and coherence analysis, which do not focus on extreme behavior in EEGs (such as bursts in amplitude), neglecting that neuronal oscillations exhibit non-Gaussian heavy-tailed probability distributions. To overcome this limitation, this dissertation proposes new approaches to characterize brain connectivity based on extremal features of EEG signals. Two extreme-value methods to study alterations in the brain network are proposed. One method is Conex-Connect, a pioneering approach linking the extreme amplitudes of a reference EEG channel with the other channels in the brain network. The other method is Club Exco, which clusters multi-channel EEG data based on a spherical $k$-means procedure applied to the "pseudo-angles," derived from extreme amplitudes of EEG signals. Both methods provide new insights into how the brain network organizes itself during an extreme event, such as an epileptic seizure, in contrast to a baseline state.
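The INAR(1) recursion underlying the count-data part of this dissertation is X_t = α ∘ X_{t-1} + ε_t, where "∘" is binomial thinning (each of the X_{t-1} counts survives independently with probability α) and ε_t are i.i.d. innovations. A minimal simulation sketch is below; geometric innovations are used purely as an illustration, and this is the textbook thinning-based INAR(1), not the dissertation's Geo-INAR construction.

```python
# Sketch of an INAR(1) process via binomial thinning.
import numpy as np

rng = np.random.default_rng(3)

def simulate_inar1(alpha, n, x0=0):
    """Simulate an INAR(1) path with binomial thinning and geometric innovations."""
    x = np.empty(n, dtype=int)
    x[0] = x0
    for t in range(1, n):
        survivors = rng.binomial(x[t - 1], alpha)   # alpha ∘ X_{t-1}
        innovation = rng.geometric(0.5) - 1         # support {0, 1, 2, ...}
        x[t] = survivors + innovation
    return x

path = simulate_inar1(alpha=0.6, n=5000)
# The lag-1 autocorrelation of an INAR(1) process equals alpha.
acf1 = np.corrcoef(path[:-1], path[1:])[0, 1]
print(f"mean={path.mean():.2f}  lag-1 ACF={acf1:.2f} (theory: 0.60)")
```

The path stays integer-valued and non-negative by construction, and its empirical lag-1 autocorrelation tracks the thinning probability α, the key structural property these models share with the Gaussian AR(1).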
