1. INCREASES IN THE NUMBER OF FROST-FREE DAYS ACROSS CANADA DURING WINTER PERIOD
Manzoor, Shaukat
Data from 11 weather stations across Canada were analyzed for temporal changes in the number of frost-free days (FFDs) per winter. From 1940 to 2009, Canada experienced an increase in the number of FFDs at an average rate of 12 days per 100 years. The number of FFDs per winter increased exponentially as the mean WDMT rose steadily, and the best-fit normal distribution of WDMT in recent decades has shifted toward the warmer part of the curve.
The number of FFDs per winter predicted from the exceedance probability (EP) likewise increased exponentially with a steady rise in mean WDMT. The EP model showed close agreement between the predicted and measured numbers of FFDs, and it provides a useful tool for predicting the increase in the number of frost-free days at any single station, given the rate of rise in minimum temperatures.
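To make the EP idea concrete, here is a minimal sketch, not the thesis's code: assuming winter daily minimum temperatures are normally distributed, the expected number of FFDs is the number of winter days multiplied by the probability that the daily minimum exceeds the frost threshold. The winter length, distribution parameters, and 0 °C threshold are assumptions for illustration.

```python
# Illustrative exceedance-probability (EP) estimate of frost-free days (FFDs);
# the winter length, fitted parameters, and 0 degC threshold are assumptions.
import numpy as np
from scipy.stats import norm

winter_days = 120            # assumed length of the winter period (days)
mu, sigma = -8.0, 6.0        # assumed mean/std of winter daily minimum temp (degC)

# EP: probability that a winter day's minimum temperature exceeds 0 degC
ep = 1.0 - norm.cdf(0.0, loc=mu, scale=sigma)
print(f"P(Tmin > 0 degC) = {ep:.3f}, expected FFDs = {winter_days * ep:.1f}")

# A steady rise in the mean shifts the distribution toward the warm end and
# increases the predicted FFD count faster than linearly (tail behaviour).
for warming in (1.0, 2.0, 3.0):
    ep_w = 1.0 - norm.cdf(0.0, loc=mu + warming, scale=sigma)
    print(f"+{warming:.0f} degC warming: {winter_days * ep_w:.1f} FFDs")
```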
2. Annual Exceedance Probability Analysis
Gardner, Masako Amai (14 July 2005)
Annual Exceedance Probability (AEP) is the method used by the U.S. Army Corps of Engineers (USACE) to determine the probability of flooding caused by the failure of a levee or other flood control structure. This method gives the probability of flooding at only one particular location at a time. To overcome this limitation, a new way of studying flood probability, called an AEP map, was presented. Using hydrologic and hydraulic modeling software, an AEP map can be created to determine and visualize the spatial distribution of the probability of flooding. An AEP map represents a continuous solution of the probability of flooding and can be used to derive not only the limits of the typical 100-year inundation but those of any other return period, including the 20-year, 50-year, and 500-year floods. The AEP map can be more useful than traditional flood hazard maps, since it makes it possible to evaluate the probability of flooding at any location within the floodplain. In the process of creating an AEP map, it is necessary to run a number of simulations in order to accurately represent the probability distribution of flooding. The objective of this research is to demonstrate, on a desktop computer of today's capacity, that AEP maps converge after a reasonable number of simulations, so that users have a guideline for deciding how many simulations are necessary. The Virgin River, UT is the primary study area for this research, with the Gila River, AZ also used to support the results. This research demonstrates the convergence of AEP maps by illustrating the convergence of the water surface elevations computed in the hydraulic simulations leading up to the floodplain delineation model: if the average water surface elevations converge, then the resulting floodplain delineations (AEP maps) should also converge. The results show that AEP maps do converge within a reasonable number of simulations; the convergence of floodplain areas is also shown to confirm the convergence of AEP maps.
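As a minimal sketch of the convergence check, with a synthetic stand-in for the hydraulic model rather than the USACE tool chain, one can track the running AEP estimate at a single floodplain cell across batches of simulations and stop once it stabilizes. The elevation distribution, ground level, and tolerance are assumptions.

```python
# Toy Monte Carlo convergence check for a single-cell AEP estimate; the
# synthetic elevations, ground level, and tolerance are assumptions.
import numpy as np

rng = np.random.default_rng(42)
ground_elev = 3.5                  # assumed ground elevation of the cell (m)
tol, batch, max_sims = 0.002, 200, 20_000

flooded, n, prev_p = 0, 0, None
while n < max_sims:
    # stand-in for `batch` hydraulic simulations' water surface elevations
    wse = rng.lognormal(mean=1.0, sigma=0.3, size=batch)
    flooded += int(np.sum(wse > ground_elev))
    n += batch
    p = flooded / n                # running AEP estimate for this cell
    if prev_p is not None and abs(p - prev_p) < tol:
        break                      # estimate has stabilized
    prev_p = p

print(f"AEP estimate {p:.3f} after {n} simulations")
```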
3. Modelling the rejection probability of a quality test consisting of multiple measurements
Tamminen, S. (Satu) (02 September 2014)
Abstract
Quality control is an essential part of manufacturing, and the different properties of a product can be tested with standardized methods. If the qualification decision is based on only one test specimen representing a batch of products, the testing procedure is straightforward. However, when the measured property varies strongly within the product, as is typically the case, several test specimens are needed for quality verification.
When a quality property is predicted, the response value of the model that most effectively finds the critical observations should naturally be selected. This thesis shows that the LIB (Larger Is Better) transformation is a suitable response for multiple test samples, because it effectively recognizes, in particular, situations where one of the measurements is very low.
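The abstract does not give the formula, but a classical larger-is-better transform is the Taguchi signal-to-noise ratio SN = -10 log10((1/n) Σ 1/y_i^2); taking this as the LIB form used in the thesis is an assumption. Because of the squared reciprocals, a single very low measurement dominates the statistic, as the sketch below shows.

```python
# Taguchi larger-is-better (LIB) signal-to-noise ratio; using this exact form
# for the thesis's LIB transformation is an assumption.
import numpy as np

def sn_larger_is_better(y) -> float:
    """SN = -10*log10(mean(1/y^2)); dominated by the smallest measurement."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

print(sn_larger_is_better([50.0, 52.0, 51.0]))  # ~34.1: all specimens adequate
print(sn_larger_is_better([50.0, 52.0, 10.0]))  # ~24.4: one low value drags SN down
```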
The main contribution of this thesis is to show how to model the quality of phenomena in which each observation consists of several measurement samples. The process involves several steps, beginning with the selection of the model type. Predicting the exceedance probability provides more information for decision making than predicting the mean, and it is the more natural approach for the selected application, where the quality property has no optimal value and the interest is only in a sufficiently high value.
With industrial applications, the assumption of constant variance should be examined critically. This thesis shows that exceedance probability modelling can benefit from using an additional variance model together with the mean model in prediction, and modelling the distribution shape improves the model further when the response variable may not be Gaussian. As the proposed methods are fundamentally different, the model selection criterion has to be chosen with care. Different methods for model selection were considered and discussed, and the EPS (Exceedance Probability Score) was chosen because it is the most suitable for probability predictors.
This thesis demonstrates that especially a process with high diversity in its production and a more challenging distribution shape benefits from deviation modelling, and that the results can be improved further with distribution shape modelling.
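A minimal sketch of exceedance probability prediction from a mean model paired with a variance model, assuming a Gaussian response and hypothetical fitted values mu(x) and sigma(x): P(Y > t | x) = 1 - Φ((t - mu(x)) / sigma(x)), so an observation-specific deviation changes the predicted risk even when the predicted mean is unchanged.

```python
# Exceedance probability from a mean model plus a variance model, assuming a
# Gaussian response; mu and sigma stand for hypothetical fitted model outputs.
from scipy.stats import norm

def exceedance_probability(mu: float, sigma: float, t: float) -> float:
    """P(Y > t) for Y ~ N(mu, sigma^2)."""
    return 1.0 - norm.cdf(t, loc=mu, scale=sigma)

t = 27.0                                                # assumed required level
print(exceedance_probability(mu=30.0, sigma=1.0, t=t))  # ~0.999: low variance
print(exceedance_probability(mu=30.0, sigma=4.0, t=t))  # ~0.773: same mean, higher risk
```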
4. Tail Risk Protection via reproducible data-adaptive strategies
Spilak, Bruno (15 February 2024)
This dissertation shows the potential of machine learning methods for managing tail risk in a non-stationary, high-dimensional setting. For this, we compare in a robust manner data-dependent approaches from parametric or non-parametric statistics with data-adaptive methods. As these methods need to be reproducible to ensure trust and transparency, we start by proposing a new platform called Quantinar, which aims to set a new standard for academic publications. In the second chapter, we turn to the core subject of this thesis and compare various parametric, local parametric, and non-parametric methods to create a dynamic trading strategy that protects against tail risk in the Bitcoin cryptocurrency. In the third chapter, we propose a new portfolio allocation method, called NMFRB, that deals with high dimensions thanks to a dimension reduction technique, convex Non-negative Matrix Factorization. This technique finds latent, interpretable portfolios that are diversified out-of-sample. We show in two universes that the proposed method outperforms classical machine-learning-based methods such as Hierarchical Risk Parity (HRP) in terms of risk-adjusted returns, and we test the robustness of the results via Monte Carlo simulation. Finally, the last chapter combines the previous approaches into a tail-risk protection strategy for portfolios: we extend NMFRB to tail-risk measures, address the non-linear relationships between assets during tail events with a dedicated non-linear latent factor model, and develop a dynamic tail-risk protection strategy that handles the non-stationarity of asset returns using classical econometric models. We show that this strategy successfully reduces large drawdowns and outperforms other modern tail-risk protection strategies such as the Value-at-Risk-spread strategy. The findings are verified by various data snooping tests.
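The abstract does not detail the NMFRB algorithm, so the sketch below is only a loose illustration of the general idea, reducing many assets to a few non-negative latent portfolios and allocating across them, with standard NMF and inverse-volatility weighting standing in for convex NMF and proper risk budgeting.

```python
# Loose illustration of NMF-based latent portfolios; this is NOT the
# dissertation's NMFRB (which uses convex NMF and risk budgeting), only the
# general idea of non-negative latent factors read as long-only portfolios.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
T, n_assets, k = 500, 20, 4
returns = rng.normal(0.0005, 0.01, size=(T, n_assets))  # synthetic daily returns

X = returns - returns.min()          # shift to satisfy NMF's non-negativity
model = NMF(n_components=k, init="nndsvda", max_iter=500, random_state=0)
model.fit(X)

# Each NMF component is non-negative across assets, i.e. a long-only portfolio
weights = model.components_ / model.components_.sum(axis=1, keepdims=True)

# Inverse-volatility allocation across the k latent portfolios, a crude
# stand-in for a proper risk-budgeting step
latent_rets = returns @ weights.T
budget = 1.0 / latent_rets.std(axis=0)
budget /= budget.sum()
final_weights = (weights * budget[:, None]).sum(axis=0)
print("final asset weights:", np.round(final_weights, 3))
```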
5. The Simulation & Evaluation of Surge Hazard Using a Response Surface Method in the New York Bight
Bredesen, Michael H (01 January 2015)
Atmospheric features such as tropical cyclones act as a driving mechanism for many of the major hazards affecting coastal areas around the world. Accurate and efficient quantification of tropical cyclone surge hazard is essential to the development of resilient coastal communities, particularly given continued sea level trend concerns. Recent major tropical cyclones impacting the northeastern United States have caused devastating flooding in New York City, the most densely populated city in the US. As part of a national effort to re-evaluate coastal inundation hazards, the Federal Emergency Management Agency used the Joint Probability Method to re-evaluate surge hazard probabilities for Flood Insurance Rate Maps in the New York – New Jersey coastal area, also termed the New York Bight. As originally developed, this method requires many combinations of storm parameters to statistically characterize the local climatology for numerical model simulation. Even though high-performance computing efficiency has vastly improved in recent years, researchers have used various "Optimal Sampling" techniques to reduce the number of storm simulations needed in the traditional Joint Probability Method. This manuscript presents results from the simulation of over 350 synthetic tropical cyclones designed to produce significant surge in the New York Bight using the hydrodynamic Advanced Circulation (ADCIRC) numerical model, bypassing the need for Optimal Sampling schemes. This data set allowed a careful assessment of the joint probability distributions used for this area and of the impacts of current assumptions in deriving new flood-risk maps for the New York City area.
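For orientation, here is a toy version of the Joint Probability Method bookkeeping, with fabricated storm rates and surge responses rather than the thesis's data: each synthetic storm carries an annual rate from the discretized joint distribution of its parameters, and the annual exceedance rate of a surge level is the rate-weighted count of storms exceeding it.

```python
# Toy Joint Probability Method (JPM) surge-hazard calculation; the annual
# rates and surge responses are fabricated for illustration.
import numpy as np

# Each row is (annual rate in 1/yr, peak surge in m); in practice the surge
# would come from a hydrodynamic simulation such as ADCIRC.
storms = np.array([
    [0.050, 1.2],
    [0.020, 2.1],
    [0.008, 2.9],
    [0.003, 3.8],
    [0.001, 4.6],
])

def annual_exceedance_rate(level: float) -> float:
    """Sum of annual rates of all storms whose surge exceeds `level`."""
    rates, surges = storms[:, 0], storms[:, 1]
    return float(rates[surges > level].sum())

for level in (1.0, 2.0, 3.0, 4.0):
    lam = annual_exceedance_rate(level)
    aep = 1.0 - np.exp(-lam)     # Poisson arrivals give the annual probability
    print(f"surge > {level:.1f} m: rate {lam:.4f}/yr, AEP {aep:.4f}")
```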