1

Nákaza na finančních trzích v zemích s možností přistoupení do Evropské unie / Coexceedance in financial markets of countries trying to join the European Union

Baranová, Zuzana January 2018 (has links)
This thesis analyses financial contagion between a reference EU market, Germany, and the markets of five countries actively seeking to join the European Union (Montenegro, Serbia, Turkey, Bosnia and Macedonia) over the period March 2006 to March 2018. We apply a quantile regression framework to analyse contagion, which we base on the occurrence and degree of coexceedances between the reference market and each analysed market. The results indicate that contagion between the stock markets exists, though to a different degree for each of the analysed markets. In addition, we apply the regression framework specifically to the 2008 financial crisis period to demonstrate that contagion is stronger during turbulent market periods. JEL Classification: G01, G14, G15. Keywords: coexceedance, quantile regression, contagion, stock markets. Author's e-mail: 80605682@fsv.cuni.cz. Supervisor's e-mail: roman.horvath@fsv.cuni.cz
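To make the coexceedance framework concrete, here is a minimal sketch in Python. It is an illustration under stated assumptions, not the thesis's actual specification: the return series are synthetic, the 5% tail cutoff and the rolling-volatility covariate are choices made for the example, and the coexceedance definition follows the Bae–Karolyi–Stulz convention of taking the less extreme of two joint tail returns.

```python
# Hedged sketch: synthetic returns, 5% tail cutoff and volatility covariate
# are assumptions for illustration, not the thesis's specification.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 3000
common = rng.standard_normal(n)                      # shared shock
r_de = 0.8 * common + 0.6 * rng.standard_normal(n)   # reference market (e.g. Germany)
r_rs = 0.5 * common + 0.9 * rng.standard_normal(n)   # analysed market

# Negative coexceedance: both returns fall below their own 5% quantile; its
# degree is the less extreme of the two returns on that day, zero otherwise.
both_low = (r_de < np.quantile(r_de, 0.05)) & (r_rs < np.quantile(r_rs, 0.05))
coex = np.where(both_low, np.maximum(r_de, r_rs), 0.0)

df = pd.DataFrame({"coex": coex,
                   "vol": pd.Series(r_de).rolling(20).std()}).dropna()

# Quantile regression of the coexceedance on a conditioning covariate,
# evaluated in the lower tail of the coexceedance distribution.
for tau in (0.01, 0.05, 0.10):
    fit = smf.quantreg("coex ~ vol", df).fit(q=tau)
    print(f"tau={tau:.2f}  vol coefficient: {fit.params['vol']:+.4f}")
```

A covariate coefficient that grows in magnitude at lower quantiles is the kind of evidence the abstract describes as contagion strengthening in turbulent periods.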
2

Development of a Vapor Cloud Explosion Risk Analysis Tool Using Exceedance Methodology

Alghamdi, Salem August 2011 (has links)
In development projects, designers should take the possibility of a vapor cloud explosion into consideration in the siting and design of a process plant from day one. The most important decisions pertinent to the location of different process areas, the separation between them, the location of occupied buildings and the overall layout may be made at the conceptual stage of the project. During the detailed engineering stage, the final calculation of gas explosion loads is an important activity; however, changes to the layout and the location of occupied buildings at this stage can be very costly. Therefore, at the conceptual phase of a development project for a hydrocarbon facility, it is helpful to get a picture of possible vapor cloud explosion loads for use in studying various options. This thesis presents the analytical parameters used in vapor cloud explosion risk analysis and proposes a model structure for analysing vapor cloud explosion risks to buildings based on exceedance methodology. The methodology was implemented in a computer program developed to support this thesis. The proposed model covers the space of possible gas release scenarios through Monte Carlo simulation, and the risk of vapor cloud explosions is displayed using exceedance curves. The resulting model provides a predictive tool for vapor cloud explosion problems at the early stages of development projects, particularly for siting occupied buildings in onshore hydrocarbon facilities. It can also be used as a quick analytical tool for investigating various aspects of vapor cloud explosions. The model was applied to a case study, a debutanizer process unit, and used to explore different alternatives for locating a building near the facility. The results were compared against existing software to assess the model's validity, and they show that the model can effectively examine the risk of vapor cloud explosions.
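The exceedance-curve logic lends itself to a compact sketch. The snippet below is a toy illustration, not the thesis's validated tool: the blast relation, leak-rate distribution, ignition probability and leak frequency are all invented for the example, whereas a real analysis would use a validated blast model such as the TNO multi-energy method.

```python
# Hedged sketch: toy Monte Carlo exceedance analysis of overpressure at a
# building; every numeric assumption below is illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
leak_rate = rng.lognormal(mean=0.0, sigma=1.0, size=n)  # release rate, kg/s (hypothetical)
distance = rng.uniform(50.0, 200.0, size=n)             # building distance, m
ignited = rng.random(n) < 0.1                           # assumed ignition probability

# Toy blast relation: overpressure grows with release size, decays with distance.
overpressure = np.where(ignited, 30.0 * np.sqrt(leak_rate) / (distance / 50.0), 0.0)

leak_freq = 1e-3  # assumed leak frequency, events/year
for level in (5.0, 10.0, 20.0, 35.0):  # kPa design levels
    annual_freq = leak_freq * (overpressure > level).mean()
    print(f"P > {level:>4.0f} kPa exceeded at {annual_freq:.2e} /yr")
```

Plotting the exceedance frequency against the overpressure level yields the exceedance curve the abstract describes, from which a design load at a target annual frequency can be read off.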
3

Nákaza mezi akciemi a dluhopisy na finančních trzích jihovýchodní a střední Evropy / Co-exceedances in stocks and bonds between Southern European Countries and CEE Countries - Analysis of contagion

Pjontek, Matej January 2017 (has links)
In this thesis, we analyse financial contagion between Southern European (Greek, Italian, Portuguese and Spanish) and Central Eastern European (Czech, Polish and Hungarian) stock markets and sovereign bond markets, respectively, in the period from January 2001 to June 2016. A quantile regression framework is applied to analyse contagion based on the occurrence and degree of co-exceedances. We use the conditional variance (volatility) of the analysed markets to determine the direction of contagion. Our results show that contagion between the stock markets exists during the analysed period and is stronger during the financial and sovereign debt crises. The direction of contagion is from the Southern European to the Central Eastern European countries. We do not find evidence of contagion between the Southern European and Central Eastern European sovereign bond markets. Our results show a "flight to quality", but not a "flight from quality".
4

INCREASES IN THE NUMBER OF FROST-FREE DAYS ACROSS CANADA DURING WINTER PERIOD

Manzoor, Shaukat 08 1900 (has links)
Data from 11 weather stations across Canada were analysed for temporal changes in the number of frost-free days (FFDs) per winter. From 1940 to 2009, Canada experienced an increase in the number of FFDs, averaging 12 days per 100 years. The number of FFDs per winter increased exponentially as the mean WDMT rose steadily, and the best-fit normal distribution of WDMT in recent decades has shifted toward the warmer part of the curve. The number of FFDs per winter predicted from the exceedance probability (EP) likewise increased exponentially with a steady rise in mean WDMT, and the EP model showed close agreement between the predicted and the measured number of FFDs. The EP model also provides a useful tool for predicting the increase in the number of frost-free days at any single station, given the rate of rise in minimum temperatures.
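The exceedance-probability idea can be sketched directly. Assuming WDMT denotes the winter daily minimum temperature and that it is roughly normally distributed (both readings of the abstract, not statements from the thesis), the chance that a day is frost-free is the probability of exceeding 0 °C, and the expected FFD count follows; the parameter values below are illustrative only.

```python
# Hedged sketch: normal WDMT model with illustrative parameters; "frost-free"
# is read as the daily minimum temperature exceeding 0 degrees Celsius.
from scipy.stats import norm

def expected_ffd(mean_wdmt: float, sd_wdmt: float, winter_days: int = 120) -> float:
    """Expected frost-free days per winter under a normal WDMT model."""
    p_exceed = norm.sf(0.0, loc=mean_wdmt, scale=sd_wdmt)  # P(T_min > 0 C)
    return winter_days * p_exceed

# A steady rise in the mean shifts the distribution toward the warmer side
# and raises the expected FFD count faster than linearly in this tail region.
for mean_t in (-12.0, -10.0, -8.0):
    print(f"mean WDMT {mean_t:+.0f} C -> ~{expected_ffd(mean_t, 8.0):.1f} FFDs/winter")
```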
5

Annual Exceedance Probability Analysis

Gardner, Masako Amai 14 July 2005 (has links) (PDF)
Annual Exceedance Probability (AEP) is the method used by the U.S. Army Corps of Engineers (USACE) to determine the probability of flooding caused by the failure of a levee or other flood control structure. This method gives the probability of flooding at only one location at a time. To overcome this limitation of AEP, a new method of studying flood probability, called an AEP map, was presented. Using hydrologic and hydraulic modeling software, an AEP map can be created to determine and visualize the spatial distribution of the probability of flooding. An AEP map represents a continuous solution of the probability of flooding and can be used to derive not only the limits of the typical 100-year inundation but those of any other return period, including the 20-year, 50-year and 500-year floods. The AEP map can be more useful than traditional flood hazard maps, since it makes it possible to evaluate the probability of flooding at any location within the floodplain. In the process of creating an AEP map, it is necessary to run a number of simulations in order to accurately represent the probability distribution of flooding. The objective of this research is to demonstrate, given a desktop computer of today's capacity, the convergence of AEP maps after a reasonable number of simulations, so that users have a guideline for deciding how many simulations are necessary. The Virgin River, UT is the primary study area for this research, with the Gila River, AZ also used to support the results. The research demonstrates the convergence of AEP maps by illustrating the convergence of the water surface elevations computed as part of the hydraulic simulation leading up to the floodplain delineation model: if the average water surface elevations converge, then the resulting floodplain delineations (AEP maps) should also converge. The results show that AEP maps do converge with a reasonable number of simulations, and the convergence of floodplain areas is also shown to demonstrate the convergence of AEP maps.
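The convergence argument can be illustrated with a small sketch. Synthetic water surface elevations stand in for the hydraulic-model output here, and the tolerance and window are arbitrary choices for the example; the thesis itself works with real simulations of the Virgin and Gila Rivers.

```python
# Hedged sketch: synthetic water surface elevations; tolerance and window
# are arbitrary choices for the example.
import numpy as np

rng = np.random.default_rng(2)
wse = 100.0 + rng.gumbel(loc=0.0, scale=1.5, size=2000)   # simulated elevations, m

running_mean = np.cumsum(wse) / np.arange(1, wse.size + 1)

# Declare convergence once the running mean drifts less than `tol` metres
# over a window of `k` additional simulations.
k, tol = 100, 0.01
drift = np.abs(running_mean[k:] - running_mean[:-k])
first_stable = int(np.argmax(drift < tol)) + k
print(f"running mean stable to within {tol} m after ~{first_stable} simulations")
```

If the average elevation at every grid cell passes such a check, the derived AEP map has effectively stopped changing, which is the stopping criterion the abstract argues for.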
6

Modelling heavy rainfall over time and space

Khuluse, Sibusisiwe Audrey 06 June 2011 (has links)
Extreme Value Theory finds application in problems concerning low-probability but high-consequence events. In hydrology, the study of heavy rainfall is important in regional flood risk assessment. In particular, the N-year return level is a key output of an extreme value analysis, hence care needs to be taken to ensure that the model is accurate and that the level of imprecision in the parameter estimates is made explicit. Rainfall is a process that evolves over time and space; therefore, it is anticipated that at extreme levels the process would continue to show temporal and spatial correlation. In this study the interest is in whether any trends in heavy rainfall can be detected for the Western Cape. The focus is on obtaining the 50-year daily winter rainfall return level and investigating whether this quantity is homogeneous over the study area. The study is carried out in two stages. In the first stage, the point process approach to extreme value theory is applied to arrive at return level estimates at each of the fifteen sites. Stationarity is assumed for the series at each station, so an issue to deal with is the short-range temporal correlation of threshold exceedances. The proportion of exceedances is found to be smaller (approximately 0.01) for stations towards the east, such as Jonkersberg, Plettenbergbay and Tygerhoek. This can be attributed to rainfall values being mostly low, with few instances where large amounts of rainfall were observed. Looking at the parameters of the point process extreme value model, the location parameter estimate appears stable over the region, in contrast to the scale parameter estimate, which increases in a south-easterly direction. While the model is shown to fit the exceedances at each station adequately, the degree of uncertainty is large for stations such as Tygerhoek, where the maximum observed rainfall value is approximately twice as large as the other high rainfall values. This situation was also observed at other stations, and in such cases removal of these high rainfall values was avoided to minimize the risk of obtaining inaccurate return level estimates. The key result is an N-year rainfall return level estimate at each site. The interest is in mapping an estimate of the 50-year daily winter rainfall return level; however, to evaluate the adequacy of the model at each site, the 25-year return level is considered, since a 25-year return period is well within the range of the observed data. The 25-year daily winter rainfall return level estimate for Ladismith is the smallest at 22.42 mm, which can be attributed to the station's generally low observed winter rainfall values. In contrast, the return level estimate for Tygerhoek is high, almost six times larger than that of Ladismith at 119.16 mm. Visually, the design values show differences between sites, so it is of interest to investigate whether these differences can be modelled.
The second stage is the geostatistical analysis of the 50-year 24-hour rainfall return level. The aim here is to quantify the degree of spatial variation in the 50-year 24-hour rainfall return level estimates and to use that association to predict values at unobserved sites within the study region. A tool for quantifying spatial variation is the variogram model. Estimation of the parameters of this model requires a sufficiently large sample, which is a challenge in this study since there are only fifteen stations and therefore only fifteen observations for the geostatistical analysis. To address this challenge, observations are expanded in space and time and then standardized to create a larger pool of data from which the variogram is estimated. The obtained estimates are used in ordinary and universal kriging to derive the 50-year 24-hour winter rainfall return level maps. It is shown that the 50-year daily winter design rainfall over most of the Western Cape lies between 40 mm and 80 mm, but rises sharply towards the east coast of the region, largely due to the influence of the large design values obtained for Tygerhoek. In ordinary kriging, prediction uncertainty is lowest around observed values and grows with distance from these points. Overall, the prediction uncertainty maps show that ordinary kriging performs better than universal kriging, where a linear regional trend in design values is included.
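For readers unfamiliar with return levels from threshold exceedances, a minimal peaks-over-threshold sketch follows. The thesis uses the point-process formulation, which yields return levels equivalent to the generalized Pareto form below; the rainfall data, threshold choice and winter length here are synthetic assumptions.

```python
# Hedged sketch: synthetic "daily winter rainfall", a 95% threshold and a
# 120-day winter are assumptions; the thesis's point-process fit is equivalent.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(3)
rain = rng.gamma(shape=0.4, scale=8.0, size=30 * 120)  # 30 winters of daily rain, mm

u = np.quantile(rain, 0.95)                  # threshold
exc = rain[rain > u] - u                     # threshold exceedances
zeta_u = exc.size / rain.size                # probability of exceeding u
xi, _, sigma = genpareto.fit(exc, floc=0.0)  # GPD shape and scale

def return_level(n_years: float, obs_per_year: int = 120) -> float:
    """N-year return level from the fitted peaks-over-threshold model."""
    m = n_years * obs_per_year               # return period in observations
    if abs(xi) < 1e-6:                       # exponential-tail limit as xi -> 0
        return u + sigma * np.log(m * zeta_u)
    return u + (sigma / xi) * ((m * zeta_u) ** xi - 1.0)

print(f"50-year daily winter rainfall return level: {return_level(50.0):.1f} mm")
```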
7

Risk Measures Constituting Risk Metrics for Decision Making in the Chemical Process Industry

Prem, Katherine December 2010 (has links)
Catastrophic incidents in the process industry leave a marked legacy of staggering economic and societal losses incurred by the company, the government and society. The work described herein proposes a novel approach to help predict and prevent potential catastrophes and to understand the stakes at risk for better risk-informed decision making. The methodology includes societal impact as a risk measure, along with the monetization of tangible asset damage. Predicting incidents through leading metrics is pivotal to improving plant processes and to individual and societal safety in the vicinity of the plant (portfolio). From this study it can be concluded that a comprehensive judgment of all the risks and losses entails analysing the overall results of all possible incident scenarios. Value-at-Risk (VaR) is most suitable as an overall measure across many scenarios and for a large number of portfolio assets. FN-curves and F$-curves can be correlated, which is very beneficial for understanding the trends of historical incidents in the U.S. chemical process industry. Analyzing historical databases can provide valuable information on incident occurrences and their consequences as lagging metrics (or lagging indicators) for the mitigation of portfolio risks. This study also concludes that there is a strong statistical relationship between the different consequence tiers of the safety pyramid, and that Heinrich's safety pyramid is comparable to data mined from the HSEES database. Furthermore, a chemical plant operation is robust only when a strategic balance is struck between optimal plant operations and maintaining health and safety and sustaining the environment. The balance emerges from choosing the best option amidst several conflicting parameters, and strategies for normative decision making should be utilized for making choices under uncertainty. Hence, decision theory is used here to lay the framework for choosing the optimum portfolio among several competing portfolios. To understand the strategic interactions of the different contributing representative sets that play a key role in determining the most preferred action for optimum production and safety, the concepts of game theory are utilized, and a framework is provided as a novel application to the chemical process industry.
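As a minimal illustration of VaR as an overall measure across many incident scenarios, consider the sketch below; the loss distribution and confidence level are assumptions for the example, not data from the thesis.

```python
# Hedged sketch: a synthetic monetized-loss distribution, not the thesis's data.
import numpy as np

rng = np.random.default_rng(4)
# Monetized losses (asset damage plus societal impact) per simulated incident
# scenario, in $M -- lognormal purely for illustration.
losses = rng.lognormal(mean=1.0, sigma=1.2, size=50_000)

alpha = 0.99
var_99 = np.quantile(losses, alpha)        # loss not exceeded in 99% of scenarios
cvar_99 = losses[losses > var_99].mean()   # expected loss when VaR is exceeded
print(f"VaR(99%)  = ${var_99:6.1f}M")
print(f"CVaR(99%) = ${cvar_99:6.1f}M")
```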
8

Exceedance Frequency Analysis of Urban Stormwater Quality and Its Relation to Land Use Change, Denton, Texas

Shrestha, Manjul 08 1900 (has links)
Urbanization causes various environmental issues, including water pollution, air pollution and solid waste, and the urbanization of watersheds has a profound influence on the quality of stormwater runoff, which is strongly associated with land use. This study analyzed the exceedance frequency of stormwater quality in five watersheds of Denton over eleven years, along with the relationship between stormwater quality and the land use/cover of each watershed. The results showed that most of the water quality parameters examined in the Lower Pecan watershed exceeded their thresholds most frequently; the higher frequency of exceedance in this watershed can be attributed to the wastewater treatment plant and landfill site. Total suspended solids (TSS) and turbidity thresholds were frequently exceeded in the Hickory and Clear Creek watersheds, while conductivity had the highest percentage of exceedance in the Upper Pecan and Cooper watersheds. Thus, rural watersheds were associated with higher exceedance of TSS and turbidity, whereas urban watersheds were associated with higher exceedance of conductivity.
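The exceedance-frequency computation itself is straightforward, as the sketch below shows; the parameter names, thresholds and sample values are hypothetical, not the study's monitoring data or regulatory standards.

```python
# Hedged sketch: hypothetical thresholds and sample values, not the study's
# monitoring data or regulatory standards.
import pandas as pd

samples = pd.DataFrame({
    "tss_mg_l":           [12, 45, 180, 30, 220, 15, 95],
    "turbidity_ntu":      [4, 25, 60, 10, 75, 5, 40],
    "conductivity_us_cm": [350, 900, 1200, 400, 1500, 300, 800],
})
thresholds = {"tss_mg_l": 100, "turbidity_ntu": 50, "conductivity_us_cm": 1000}

# Percentage of samples exceeding each parameter's threshold.
for param, limit in thresholds.items():
    pct = 100.0 * (samples[param] > limit).mean()
    print(f"{param:>18}: {pct:5.1f}% of samples exceed {limit}")
```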
9

Modelling the rejection probability of a quality test consisting of multiple measurements

Tamminen, S. (Satu) 02 September 2014 (has links)
Abstract Quality control is an essential part of manufacturing, and the different properties of a product can be tested with standardized methods. If the qualification decision is based on only one test specimen representing a batch of products, the testing procedure is quite straightforward. However, when the measured property has high variability within the product, as is usual, several test specimens are needed for quality verification. When a quality property is predicted, the response value of the model that most effectively finds the critical observations should naturally be selected. In this thesis, it is shown that the LIB (Larger Is Better) transformation is a suitable method for multiple test samples, because it effectively recognizes especially the situations where one of the measurements is very low. The main contribution of this thesis is to show how to model the quality of phenomena that comprise several measurement samples per observation. The process contains several steps, beginning with the selection of the model type. Predicting the exceedance probability provides more information for decision making than predicting the mean; this approach is especially natural for the selected application, where the quality property has no optimal value and the interest is instead in an adequately high value. With industrial applications, the assumption of constant variance should be analysed critically, and it is shown in this thesis that exceedance probability modelling can benefit from using an additional variance model together with a mean model in prediction. Distribution shape modelling improves the model further when the response variable may not be Gaussian. As the proposed methods are fundamentally different, the model selection criteria have to be chosen with caution. Different methods for model selection were considered and discussed, and the EPS (Exceedance Probability Score) was chosen because it is the most suitable for probability predictors. This thesis demonstrates that especially a process with high diversity in its production and a more challenging distribution shape gains from the deviation modelling, and the results can be improved further with distribution shape modelling.
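One common larger-is-better transform is the Taguchi-style signal-to-noise ratio sketched below; whether this matches the thesis's exact LIB definition is an assumption, but its reciprocal-square term shows the stated behaviour of letting a single very low measurement dominate the score.

```python
# Hedged sketch: a Taguchi-style larger-is-better (LIB) signal-to-noise ratio;
# whether the thesis uses exactly this form is an assumption.
import numpy as np

def lib_sn_ratio(y) -> float:
    """Larger-is-better SN ratio over repeated measurements of one product."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

print(f"uniform batch:  {lib_sn_ratio([40.0, 42.0, 41.0]):.1f} dB")
# One very low test result drags the score down despite the higher mean.
print(f"one weak value: {lib_sn_ratio([60.0, 62.0, 8.0]):.1f} dB")
```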
10

Tail Risk Protection via reproducible data-adaptive strategies

Spilak, Bruno 15 February 2024 (has links)
This dissertation shows the potential of machine learning methods for managing tail risk in a non-stationary and high-dimensional setting. For this, we compare in a robust manner data-dependent approaches from parametric or non-parametric statistics with data-adaptive methods. As these methods need to be reproducible to ensure trust and transparency, we start by proposing a new platform called Quantinar, which aims to set a new standard for academic publications. In the second chapter, we dive into the core subject of this thesis, comparing various parametric, local parametric and non-parametric methods to create a dynamic trading strategy that protects against tail risk in the Bitcoin cryptocurrency. In the third chapter, we propose a new portfolio allocation method, called NMFRB, that deals with high dimensions thanks to a dimension reduction technique, convex Non-negative Matrix Factorization. This technique allows us to find latent interpretable portfolios that are diversified out-of-sample. We show in two universes that the proposed method outperforms other classical machine-learning-based methods, such as Hierarchical Risk Parity (HRP), in terms of risk-adjusted returns. We also test the robustness of our results via Monte Carlo simulation. Finally, the last chapter combines our previous approaches to develop a tail-risk protection strategy for portfolios: we extend NMFRB to tail-risk measures, address the non-linear relationships between assets during tail events by developing a specific non-linear latent factor model, and develop a dynamic tail-risk protection strategy that deals with the non-stationarity of asset returns using classical econometric models. We show that our strategy successfully reduces large drawdowns and outperforms other modern tail-risk protection strategies, such as the Value-at-Risk-spread strategy. We verify our findings by performing various data snooping tests.
