1. Incorporating geologic information into hydraulic tomography: A general framework based on geostatistical approach. Zha, Yuanyuan; Yeh, Tian-Chyi J.; Illman, Walter A.; Onoe, Hironori; Mok, Chin Man W.; Wen, Jet-Chau; Huang, Shao-Yang; Wang, Wenke.
Hydraulic tomography (HT) has become a mature aquifer test technology over the last two decades. It collects non-redundant information about aquifer heterogeneity by sequentially stressing the aquifer at different wells and collecting aquifer responses at other wells during each stress. The collected information is then interpreted by inverse models. Among these models, the geostatistical approaches, built upon the Bayesian framework, first conceptualize the hydraulic properties to be estimated as random fields characterized by means and covariance functions. They then use these spatial statistics as prior information, together with the aquifer response data, to estimate the spatial distribution of the hydraulic properties at a site. Since the spatial statistics describe the generic spatial structures of the geologic media at the site rather than site-specific ones (e.g., known spatial distributions of facies, faults, or paleochannels), the estimates are often not optimal. To improve the estimates, we introduce a general statistical framework that allows the inclusion of site-specific spatial patterns of geologic features. We then test this approach with synthetic numerical experiments. Results show that this approach, using a conditional mean and covariance that reflect site-specific large-scale geologic features, indeed improves the HT estimates. Afterward, the approach is applied to HT surveys at a kilometer-scale fractured granite field site with a distinct fault zone. We find that including fault information from outcrops and boreholes in the HT analysis improves the estimated hydraulic properties. The improved estimates in turn lead to better prediction of flow during a different pumping test at the site.
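As an illustration of the estimation step these geostatistical approaches share, the sketch below performs a linearized Bayesian (conditional mean and covariance) update in Python. Everything in it is a hypothetical stand-in: the 1D log-conductivity grid, the exponential prior covariance, the random sensitivity matrix H in place of a real groundwater forward model, and the noise level.

```python
import numpy as np

# Minimal sketch of the linearized geostatistical (Bayesian) update used in
# hydraulic tomography, under illustrative assumptions: a 1D log-K field with
# an exponential prior covariance, a linear(ized) forward map H, and Gaussian
# noise. All names (n, m, H, sigma2, corr_len, ...) are hypothetical.

rng = np.random.default_rng(0)
n, m = 100, 8                      # unknowns (log-K cells), observations (heads)
x = np.linspace(0.0, 1.0, n)

# Prior: mean and exponential covariance; a site-specific geologic model would
# replace these with a conditional mean/covariance reflecting known facies or faults.
mu = np.zeros(n)
corr_len, var = 0.2, 1.0
Q = var * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

H = rng.standard_normal((m, n)) / np.sqrt(n)   # stand-in for model sensitivities
sigma2 = 1e-3                                   # observation-error variance
R = sigma2 * np.eye(m)

y = H @ rng.multivariate_normal(mu, Q) + rng.normal(0, np.sqrt(sigma2), m)

# Conditional (posterior) mean and covariance: the "estimate" HT reports.
K = Q @ H.T @ np.linalg.inv(H @ Q @ H.T + R)    # Kalman-type gain
mu_post = mu + K @ (y - H @ mu)
Q_post = Q - K @ H @ Q
print(mu_post[:5], np.diag(Q_post)[:5])
```

In this framing, the site-specific geologic features the paper advocates would enter through the prior mean mu and covariance Q, rather than the generic exponential model used here.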
2. An empirical comparison of extreme value modelling procedures for the estimation of high quantiles. Engberg, Alexander. January 2016.
The peaks over threshold (POT) method provides an attractive framework for estimating the risk of extreme events such as severe storms or large insurance claims. However, the conventional POT procedure, in which the threshold excesses are modelled by a generalized Pareto distribution, suffers from small tail samples and subjective threshold selection. In recent years, two alternative approaches have been proposed: mixture models that estimate the threshold, and a folding procedure that generates larger tail samples. In this paper, the empirical performances of the conventional POT procedure, the folding procedure and a mixture model are compared by modelling data sets on fire insurance claims and hurricane damage costs. The results show that the folding procedure gives smaller standard errors of the parameter estimates and, in some cases, more stable quantile estimates than the conventional POT procedure. The mixture model estimates depend on the starting values in the numerical maximum likelihood estimation and are therefore difficult to compare with those from the other procedures. The conclusion is that none of the procedures is uniformly better than the others, but there are situations where one method may be preferred.
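For concreteness, here is a minimal sketch of the conventional POT procedure in Python, assuming synthetic heavy-tailed "claims" and a subjective 95th-percentile threshold (both illustrative assumptions): fit a generalized Pareto distribution to the threshold excesses and invert it for a high quantile.

```python
import numpy as np
from scipy import stats

# Conventional POT sketch: threshold u, GPD fit to excesses, high quantile.
rng = np.random.default_rng(1)
claims = stats.lomax.rvs(c=2.5, scale=1.0, size=5000, random_state=rng)  # toy heavy-tailed losses

u = np.quantile(claims, 0.95)              # subjective threshold choice
excesses = claims[claims > u] - u
shape, loc, scale = stats.genpareto.fit(excesses, floc=0.0)

# Quantile at probability p via the standard POT formula:
# x_p = u + (scale/shape) * (((n/n_u) * (1-p))**(-shape) - 1)
p = 0.999
n, n_u = len(claims), len(excesses)
x_p = u + (scale / shape) * (((n / n_u) * (1.0 - p)) ** (-shape) - 1.0)
print(f"threshold={u:.3f}, xi={shape:.3f}, sigma={scale:.3f}, q{p}={x_p:.3f}")
```

Both the small excess sample (here about 250 points) and the sensitivity of the fit to the choice of u illustrate exactly the two weaknesses the paper's alternative procedures target.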
3. Bivariate extreme value analysis of commodity prices. Joyce, Matthew. 21 April 2017.
The crude oil, natural gas, and electricity markets are among the most widely traded and discussed commodity markets in the world. Over the past two decades, each commodity has seen price volatility driven by political, economic, social, and technological factors. With that comes a significant amount of risk that both corporations and governments must account for to ensure expected cash flows and to minimize losses. This thesis analyzes the portfolio risk of the major US commodity hubs for crude oil, natural gas and electricity by applying Extreme Value Theory to historical daily price returns between 2003 and 2013. The risk measures used are Value-at-Risk and Expected Shortfall, estimated by fitting the Generalized Pareto Distribution to the data using the peaks-over-threshold method. We consider both the univariate and bivariate cases in order to determine the effects that price shocks within and across commodities have in a mixed portfolio. The results show that electricity is the most volatile, and therefore riskiest, of the three commodities considered, for both positive and negative returns. In addition, we find that the univariate and bivariate results are statistically indistinguishable, leading to the conclusion that for the three markets analyzed during this period, price shocks in one commodity do not directly impact the volatility of another commodity's price.
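A hedged sketch of how these two risk measures are typically computed from a GPD fitted to the loss tail; the return series, threshold choice and confidence level below are assumptions for illustration, not the thesis's data.

```python
import numpy as np
from scipy import stats

# Value-at-Risk (VaR) and Expected Shortfall (ES) from a GPD fitted to loss
# exceedances (POT method). The loss series is synthetic (Student-t returns).
rng = np.random.default_rng(2)
losses = -stats.t.rvs(df=3, size=2500, random_state=rng)   # toy daily losses
losses = losses[losses > 0]                                 # keep loss tail only

u = np.quantile(losses, 0.90)
exc = losses[losses > u] - u
xi, _, sigma = stats.genpareto.fit(exc, floc=0.0)

alpha = 0.99
n, n_u = len(losses), len(exc)
var_a = u + (sigma / xi) * (((n / n_u) * (1 - alpha)) ** (-xi) - 1)   # VaR_alpha
es_a = (var_a + sigma - xi * u) / (1 - xi)                            # ES_alpha, valid for xi < 1
print(f"VaR_{alpha}={var_a:.3f}  ES_{alpha}={es_a:.3f}")
```

ES is always at least as large as VaR because it averages the losses beyond the VaR level, which is why the thesis reports both for tail-heavy markets such as electricity.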
4. Discrete Parameter Estimation for Rare Events: From Binomial to Extreme Value Distributions. Schneider, Laura Fee. 26 April 2019.
No description available.
5. Scaling and Extreme Value Statistics of Sub-Gaussian Fields with Application to Neutron Porosity Data. Nan, Tongchao. January 2014.
My dissertation is based on a unified self-consistent scaling framework that is consistent with key behavior exhibited by many spatially or temporally varying earth, environmental and other variables. This behavior includes the tendency of increments to have symmetric, non-Gaussian frequency distributions characterized by heavy tails that often decay with lag; power-law scaling of sample structure functions (statistical moments of absolute increments) in midranges of lags, with breakdown in power-law scaling at small and/or large lags; linear relationships between log structure functions of successive orders at all lags, also known as extended self-similarity; and nonlinear scaling of structure-function power-law exponents with function order. The major question we attempt to answer is: given data measured on a given support scale at various points throughout a 1D/2D/3D sampling domain, which appear to be statistically distributed and to scale in a manner consistent with this framework, what can be said about the spatial statistics and scaling of their extreme values, on arbitrary separation or domain scales? To do so, we limit our investigation to a 1D domain for simplicity and generate synthetic signals as samples from 1D sub-Gaussian random fields subordinated to truncated monofractal fractional Brownian motion (tfBm) or truncated fractional Gaussian noise (tfGn). Such sub-Gaussian fields are scale mixtures of stationary Gaussian fields with random variances that we model as being log-normal or Lévy α/2-stable. This novel interpretation of the data allows us to obtain maximum likelihood estimates of all parameters characterizing the underlying truncated sub-Gaussian fields. Based on synthetic data, we find these samples conform to the aforementioned scaling framework, confirming the effectiveness of the generation schemes. We then numerically investigate the manner in which variables that scale according to this framework behave at the tails of their distributions. Ours is the first study to explore the statistical scaling of extreme values, specifically peaks over thresholds (POTs), associated with such families of sub-Gaussian fields. We close by applying and verifying our analysis on the scaling of statistics characterizing vertical increments in neutron porosity data, and POTs in absolute increments, from six deep boreholes in three different depositional environments.
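The structure-function diagnostics listed above are straightforward to compute. The sketch below estimates S_q(s) = mean(|X(t+s) - X(t)|^q) for a toy Brownian-motion-like signal and fits power-law exponents over a midrange of lags; the signal generator and lag range are illustrative assumptions, not the dissertation's tfBm/tfGn machinery.

```python
import numpy as np

# Sample structure functions S_q(s) and their power-law exponents zeta(q).
# Power-law midranges appear as straight lines in log-log space; extended
# self-similarity is the linear relation between log S_q of successive orders.
rng = np.random.default_rng(3)
signal = np.cumsum(rng.standard_normal(2**14))    # Brownian-motion-like toy path

lags = np.unique(np.logspace(0, 3, 20).astype(int))
orders = [1, 2, 3]
S = {q: np.array([np.mean(np.abs(signal[s:] - signal[:-s]) ** q) for s in lags])
     for q in orders}

# Scaling exponents zeta(q) from a log-log fit over a midrange of lags.
mid = (lags >= 4) & (lags <= 256)
zeta = {q: np.polyfit(np.log(lags[mid]), np.log(S[q][mid]), 1)[0] for q in orders}
print(zeta)   # for Brownian motion, zeta(q) ~ q/2 (linear in q, i.e., monofractal)
```

For the sub-Gaussian fields studied in the dissertation, zeta(q) would instead bend nonlinearly with q, which is the multifractal-like signature the framework is built to capture.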
6. Updating Rainfall Intensity-Duration-Frequency Curves in Sweden Accounting for the Observed Increase in Rainfall Extremes / Uppdatering av Intensitets-Varaktighetskurvor i Sverige med hänsyn till observerade ökande trender av extrem nederbörd. Eckersten, Sofia. January 2016.
Increased extreme precipitation has been documented in many regions around the world, including central and northern Europe. Global warming increases average temperature, which in turn enhances the atmosphere's water-holding capacity. These changes are believed to increase the frequency and/or intensity of extreme precipitation events. In determining the design storm, or worst probable storm, for infrastructure design and failure risk assessment, experts commonly assume that the statistics of extreme precipitation do not change significantly over time. This so-called assumption of stationarity holds that the statistics of future extreme precipitation events will be similar to those of historical observations. This study investigates the consequences of the stationary assumption as well as the alternative: a non-stationary framework that considers temporal changes in the statistics of extremes. Here we evaluate stationary and non-stationary return levels for 10-year to 50-year extreme precipitation events of different durations (1-day, 2-day, ..., 7-day events), based on observed daily precipitation from Sweden. Non-stationary frequency analysis is only considered for stations with statistically significant trends over the past 50 years at 95% confidence (15 to 39% of 139 stations, depending on duration). We estimate non-stationary return levels using the Generalized Extreme Value distribution with time-dependent parameters, inferred using a Bayesian approach. The estimated return levels are then compared in terms of duration, recurrence interval and location. The results indicate that a stationary assumption might, when a significant trend exists, underestimate extreme precipitation return levels by up to 40% in Sweden. This report highlights the importance of better methods for estimating the recurrence interval of extreme events in a changing climate, which is particularly important for infrastructure design and risk reduction. / Increased extreme precipitation has been documented globally, including in central and northern Europe. Global warming raises mean temperature, which in turn increases evaporation from surfaces and the atmosphere's capacity to hold water. These changes are believed to increase and intensify precipitation. When determining design rainfall intensities for construction projects, it is currently assumed that the frequency and magnitude of extreme precipitation will not change in the future (stationarity), which in practice means an unchanging climate. This study investigates the effect of a non-stationary assumption when estimating design rainfall intensities. Non-stationary and stationary rainfall intensities for return periods between 10 and 100 years were determined from daily and multi-day Swedish precipitation data. The intensities were estimated by extreme value analysis in the software NEVA, in which the generalized extreme value distribution was fitted to annual maximum precipitation at locations in Sweden showing an increasing trend over the past 50 years (15 to 39% of 139 stations, depending on duration). The design rainfall intensities were then compared with respect to duration, return period and location. The results indicated that a stationary assumption risks underestimating design rainfall intensities for a given return period by up to 40%. This indicates that the non-stationarity assumption matters more at some locations in Sweden than at others, which could provide important information when determining design rainfall intensities.
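A minimal sketch of the non-stationary ingredient, assuming a GEV whose location parameter drifts linearly in time and fitting it by maximum likelihood; the study itself infers the time-dependent parameters in a Bayesian setting (the NEVA software mentioned above), and the synthetic annual maxima and starting values here are assumptions.

```python
import numpy as np
from scipy import stats, optimize

# Non-stationary GEV: location mu(t) = mu0 + mu1 * t, fitted to annual maxima.
rng = np.random.default_rng(4)
years = np.arange(1966, 2016)
t = (years - years[0]) / len(years)
annmax = stats.genextreme.rvs(c=-0.1, loc=30 + 8 * t, scale=5, random_state=rng)

def nll(theta):
    mu0, mu1, sigma, xi = theta
    if sigma <= 0:
        return np.inf
    # scipy's genextreme uses shape c = -xi relative to the usual GEV convention
    return -np.sum(stats.genextreme.logpdf(annmax, c=-xi, loc=mu0 + mu1 * t, scale=sigma))

res = optimize.minimize(nll, x0=[30.0, 0.0, 5.0, 0.1], method="Nelder-Mead")
mu0, mu1, sigma, xi = res.x

# Effective 10-year return level at the end of the record.
T = 10.0
z_T = (mu0 + mu1 * t[-1]) + sigma / xi * ((-np.log(1 - 1 / T)) ** (-xi) - 1)
print(f"location trend: {mu1:.2f} per record length; 10-yr level in 2015: {z_T:.1f}")
```

Comparing z_T with the return level from a stationary fit to the same record reproduces, in miniature, the kind of underestimation the study quantifies.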
7. Utilising probabilistic techniques in the assessment of extreme coastal flooding frequency-magnitude relationships using a case study from south-west England. Whitworth, Michael Robert Zordan. January 2015.
Recent events such as the New Orleans floods and the Japanese tsunami of 2011 have highlighted the uncertainty in quantifying the magnitude of natural hazards. The research undertaken here focusses on the uncertainty in evaluating storm surge magnitudes using a range of statistical techniques, including the Generalised Extreme Value distribution, Joint Probability and Monte Carlo simulations. To support the evaluation of storm surge frequency-magnitude relationships, a unique hard-copy observed sea level data set, recording hourly observations, was acquired and digitised for Devonport, Plymouth, creating a 40-year data set. In conjunction with the Devonport data, Newlyn (1915-2012) tide gauge records were analysed, creating a data set of 2 million data points. The different statistical techniques led to an uncertainty range of 0.4 m for a 1-in-250-year storm surge event, and 0.7 m for a 1-in-1000-year storm surge event. This compares to a 0.5 m uncertainty range between the low and high predictions for sea level rise by 2100. Geographical Information System modelling of the uncertainty indicated that for a 1-in-1000-year event, the level uncertainty (0.7 m) led to an increase of 100% in the number of buildings and 50% in the total land area affected. Within the study area of south-west England there are several critical structures, including a nuclear licensed site. Incorporating the uncertainty in storm surge and wave height predictions indicated that the site would potentially be affected today by a 1-in-1000-year storm surge event coincident with a 1-in-1000-year wave. In addition to the evaluation of frequency-magnitude relations, this study identified several trends in the data set. Over the data period, sea level rise is modelled as exponential growth (0.0001 mm/yr²), indicating that the modelled sea level rise of 1.9 mm/yr and 2.2 mm/yr for Newlyn and Devonport will potentially increase over the next century, by a minimum of 0.2 m by 2100. The increase in storm frequency identified in this analysis is attributed to the rise in sea level rather than an increase in the severity of storms, with decadal variations in the observed frequency potentially linked to the North Atlantic Oscillation. The identification of significant uncertainty in the evaluation of storm surge frequency-magnitude relationships has global significance for the evaluation of natural hazards. Guidance on the evaluation of external hazards currently does not adequately consider the effect of uncertainty; an uncertainty of 0.7 m, as identified in this study, could potentially affect in the region of 500 million people worldwide living close to the coast.
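One simple way to make the kind of uncertainty range reported above explicit is to bootstrap a GEV return level, as sketched below; the synthetic 40-year annual-maximum record mirrors the length of the Devonport series, but its values, and the 250-year target, are assumptions.

```python
import numpy as np
from scipy import stats

# Bootstrap confidence interval for a 1-in-250-year surge from a GEV fit.
rng = np.random.default_rng(5)
ann_max_surge = stats.genextreme.rvs(c=-0.05, loc=0.6, scale=0.15, size=40,
                                     random_state=rng)   # toy annual maxima (m)

def return_level(sample, T):
    c, loc, scale = stats.genextreme.fit(sample)
    return stats.genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)

T = 250.0
levels = np.array([return_level(rng.choice(ann_max_surge, size=40, replace=True), T)
                   for _ in range(500)])
lo, hi = np.percentile(levels, [2.5, 97.5])
print(f"1-in-{T:.0f}-yr surge: {return_level(ann_max_surge, T):.2f} m "
      f"(95% bootstrap interval {lo:.2f}-{hi:.2f} m)")
```

With only 40 annual maxima, the interval for a 250-year event is wide, which is the same sampling limitation driving the 0.4 m and 0.7 m uncertainty ranges the study reports.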
8. Analýza nárazů větru na území České republiky / Analysis of wind gusts over the area of the Czech Republic. Pop, Lukáš. January 2015.
This Ph.D. thesis deals with extreme wind gust analysis over the area of the Czech Republic. The first part deals with the processing of wind measurements, in particular maximum wind gust measurements, and includes an analysis of high-frequency wind measurements using a 3-D sonic anemometer at the Kopisty station. The highest daily wind gusts were homogenized, and a descriptive statistical analysis of the measured gust values was performed. The following part describes the statistical theory of extreme values and discusses its applicability to wind gust data; several theoretical findings were obtained, and numerous numerical experiments were performed to evaluate the proposed method. In the last part, station measurements were processed using the proposed methods and a model of the dependence between extreme and mean wind climate was derived. The model was applied to a map of mean wind climate calculated earlier at the Institute of Atmospheric Physics, yielding a map of extreme wind climate whose accuracy was estimated. The map was compared with maps of extreme wind calculated earlier by other authors.
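As a hedged illustration of one basic building block of such an analysis, the sketch below fits a Gumbel distribution to synthetic annual maximum gusts at a single station and reads off the 50-year gust; the thesis's actual model linking extreme and mean wind climate is more involved and is not reproduced here.

```python
import numpy as np
from scipy import stats

# Gumbel fit to annual maximum gusts and the 50-year return gust.
rng = np.random.default_rng(6)
annual_max_gust = stats.gumbel_r.rvs(loc=25.0, scale=3.0, size=35,
                                     random_state=rng)   # toy station record, m/s

loc, scale = stats.gumbel_r.fit(annual_max_gust)
T = 50.0
gust_T = stats.gumbel_r.ppf(1 - 1 / T, loc=loc, scale=scale)
print(f"Gumbel fit: mu={loc:.1f} m/s, beta={scale:.1f} m/s; 50-yr gust ~ {gust_T:.1f} m/s")
```

Repeating such a fit station by station, and then regressing the fitted parameters on the local mean wind climate, is one plausible route to the kind of extreme-to-mean-climate map transfer the thesis describes.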
9. Utilisation des données historiques dans l'analyse régionale des aléas maritimes extrêmes : la méthode FAB / Using historical data in the Regional Analysis of extreme coastal events: the FAB method. Frau, Roberto. 13 November 2018.
The protection of coastal areas against the risk of flooding is necessary to safeguard all types of waterside structures and, in particular, nuclear power plants. The prevention of flooding is ensured by coastal protections that are commonly designed and verified using the concept of the return level of a particular extreme event. Return levels linked to very high return periods (1000 years or more) are estimated through statistical methods based on Extreme Value Theory (EVT). These statistical approaches are applied to time series of an observed extreme variable and enable the computation of its occurrence probability. In the past, return levels of extreme coastal events were frequently estimated by applying statistical methods to series of local observations. Local series of sea levels are typically observed over too short a period (about 50 years) to compute reliable estimates linked to high return periods. For this reason, several approaches are used to enlarge the extreme data samples and to reduce the uncertainties of their estimates. Currently, one of the most widely used methods in coastal engineering is Regional Analysis, identified by Weiss (2014) as an effective means of reducing uncertainties in the estimation of extreme events. The main idea of this method is to take advantage of the wide spatial availability of observed data at different locations in order to form homogeneous regions. This enables the estimation of statistical distributions on enlarged regional samples that pool all the extreme events that have struck one or more sites of the region. Recent investigations have highlighted the importance of using past events when estimating extremes: when historical data are available, they cannot be neglected if reliable estimates are to be computed. Historical data are collected from different sources and are identified as data that do not come from time series; in most cases, no information is available about other extreme events occurring before and after a historical observation. This, together with the particular nature of each historical datum, prevents their use in a classical Regional Analysis, so a statistical methodology enabling the use of historical data in a regional context is needed in order to estimate reliable return levels and to reduce their associated uncertainties. In this manuscript, a statistical method called FAB is developed that enables a Regional Analysis using historical data. The method is formulated for POT (Peaks Over Threshold) data and is based on a new definition of the duration of the local and regional observation period (the credible duration); it is able to take into account the three typical kinds of historical data (exact values, ranges, and values above a lower bound). In addition, an approach to identify an optimal sampling threshold is defined, which yields better estimates by using the optimal extreme data sample in the FAB method. FAB is a flexible approach that enables the estimation of return levels in both frequentist and Bayesian contexts. An application of the method is carried out for a database of recorded skew surges (systematic data) and for 14 historical skew surges recovered from different sites located on the French, British, Belgian and Spanish coasts of the Atlantic Ocean, the English Channel and the North Sea. Frequentist and Bayesian estimates of skew surges are computed for each homogeneous region and for every site. Finally, this manuscript explores the issues surrounding the discovery and validation of historical data.
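The FAB method itself is not reproduced here, but the general principle of letting historical data inform a POT fit can be sketched with a censored likelihood: systematic excesses enter through the GPD density, while a historical event known only to exceed a lower bound enters through the survival function. All numbers below are assumptions.

```python
import numpy as np
from scipy import stats, optimize

# Censored-likelihood sketch: systematic POT excesses plus "lower bound only"
# historical observations. The FAB credible-duration construction is omitted.
rng = np.random.default_rng(7)
sys_exc = stats.genpareto.rvs(c=0.15, scale=0.4, size=60, random_state=rng)  # systematic excesses
hist_lb = np.array([1.2, 0.9])   # historical excesses known only to exceed these bounds

def nll(theta):
    xi, sigma = theta
    if sigma <= 0:
        return np.inf
    ll = np.sum(stats.genpareto.logpdf(sys_exc, c=xi, scale=sigma))     # exact systematic data
    ll += np.sum(stats.genpareto.logsf(hist_lb, c=xi, scale=sigma))     # lower-bound historical data
    return -ll

res = optimize.minimize(nll, x0=[0.1, 0.5], method="Nelder-Mead")
xi_hat, sigma_hat = res.x
print(f"xi={xi_hat:.3f}, sigma={sigma_hat:.3f}")
```

Exact historical values and interval-valued ones, the other two data types FAB handles, would enter analogously through the density and through differences of the distribution function.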
10. Climate variability and change impacts on coastal environmental variables in British Columbia, Canada. Abeysirigunawardena, Dilumie Saumedaka. 29 April 2010.
The research presented in this dissertation attempts to determine whether climate variability is critical to sea level changes in coastal BC. To that end, a number of statistical models were proposed to clarify the relationships between five climate variability indices, representing large-scale atmospheric circulation regimes, and sea levels, storm surges, extreme winds and storm track variability in coastal BC. The findings demonstrate that decadal to interdecadal climatic variability is fundamental to explaining the changing frequency and intensity of extreme atmospheric and oceanic environmental variables in coastal BC. The trends revealed by these analyses suggest that coastal flooding risks are certain to increase in this region over the next few decades, especially if global sea levels continue to rise as predicted. The outcome of this study emphasizes the need to look beyond climatic means when completing climate impact assessments, by clearly showing that climate extremes currently cause the majority of weather-related damage along coastal BC. The findings highlight the need for knowledge of climate variability and change effects at regional to local scales to enable useful adaptation strategies. The major findings of this research resulted in five independent manuscripts: (i) Sea level responses to climatic variability and change in Northern BC, published in Atmosphere-Ocean (AO 46(3), 277-296); (ii) Extreme sea-level recurrences in the south coast of BC with climate considerations, in review with the Asia Pacific Journal of Climate Change (APJCC); (iii) Extreme sea-surge responses to climate variability in coastal BC, in review with the Annals of the AAG (AN-2009-0098); (iv) Extreme wind regime responses to climate variability and change in the inner south coast of BC, published in Atmosphere-Ocean (AO 47(1), 41-62); (v) Sensitivity of winter storm track characteristics in the North-eastern Pacific to climate variability, in review with Atmosphere-Ocean (AO (1113)). The findings of this research program made key contributions to the following regional sea level rise impact assessment studies in BC: (i) an examination of the factors affecting relative and absolute sea level in coastal BC (Thomson et al., 2008); (ii) coastal vulnerability to climate change and sea level rise, Northeast Graham Island, Haida Gwaii (formerly the Queen Charlotte Islands), BC (Walker et al., 2007); (iii) Storm Surge: Atmospheric Hazards, Canadian Atmospheric Hazards Network - Pacific and Yukon Region, c/o Bill Taylor.