  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

The normal-generalized extreme value distribution for the modeling of data restricted in the unit interval (0,1)

Benites, Yury Rojas 28 June 2019 (has links)
In this research a new statistical model is introduced to model data restricted to the continuous interval (0,1). The proposed model is constructed through a transformation of variables, in which the transformed variable results from combining a standard normal variable with the cumulative distribution function of the generalized extreme value distribution. The structural properties of the new model are studied. The family is then extended to regression models, in which the model is reparametrized in terms of the median of the response variable; the median and the dispersion parameter are related to covariates through link functions. Inferential procedures are developed from both classical and Bayesian perspectives: classical inference is based on maximum likelihood theory, and Bayesian inference on the Markov chain Monte Carlo method. Simulation studies were performed to evaluate the performance of the classical and Bayesian estimates of the model parameters. Finally, a colorectal cancer data set is considered to illustrate the applicability of the model.
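The construction described in the abstract can be sketched in a few lines. This is one reading of the text, not the authors' code, and the GEV parameters below (the Gumbel case, mu=0, sigma=1) are illustrative: a standard normal draw is pushed through a GEV distribution function, which confines it to the open unit interval.

```python
import numpy as np
from scipy.stats import norm, genextreme

rng = np.random.default_rng(42)

# Hypothetical GEV parameters; xi = 0 gives the Gumbel case, whose CDF is
# strictly between 0 and 1 on the whole real line.
mu, sigma, xi = 0.0, 1.0, 0.0

z = norm.rvs(size=10_000, random_state=rng)
# scipy parameterizes the GEV shape as c = -xi
x = genextreme.cdf(z, c=-xi, loc=mu, scale=sigma)

assert x.min() > 0.0 and x.max() < 1.0  # data confined to the unit interval
print(np.median(x))
```

With a nonzero shape the GEV has bounded support and the CDF can hit exactly 0 or 1; the Gumbel case avoids that, which is why it is used for this illustration.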
2

Models and statistical analysis of record processes

Tůmová, Alena January 2011 (has links)
In this work we model the historical development of best performances in the men's 100, 200, 400 and 800 m running events. We assume that the yearly best performances are independent random variables following a generalized extreme value distribution for minima, with a decreasing trend in the location parameter. Model parameters are estimated by maximum likelihood. The yearly best performance is missing for some years; we treat these as right-censored observations, censored at the value of the world record standing at the time. Graphical model-diagnostic tools are adjusted for the censoring. The fitted models are used to estimate the ultimate records and to predict new records in coming years. Finally, we estimate several models describing the historical development of yearly best performances for several events jointly.
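The censored-likelihood idea can be sketched as follows. The times and censoring bounds are invented for illustration (not the thesis data), and the trend in location is omitted for brevity: observed yearly bests contribute density terms, censored years contribute P(X > bound), and the GEV for minima is handled by negating the data.

```python
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

# Illustrative yearly best 100 m times (s); some years are missing and treated
# as right-censored at the world record standing that year.
obs = np.array([10.06, 10.03, 9.95, 10.02, 9.93, 9.92, 9.90, 9.86, 9.84, 9.79])
cens = np.array([9.95, 9.90, 9.86])   # censoring bounds for the missing years

def negloglik(theta):
    c, loc, scale = theta
    if scale <= 0:
        return np.inf
    # GEV for minima: if X is a minimum, -X follows the GEV for maxima.
    ll = genextreme.logpdf(-obs, c, loc=-loc, scale=scale).sum()
    # A right-censored year contributes P(X > bound) = F_max(-bound).
    ll += genextreme.logcdf(-cens, c, loc=-loc, scale=scale).sum()
    return -ll

fit = minimize(negloglik, x0=[0.1, 10.0, 0.1], method="Nelder-Mead")
print(fit.x)  # shape, location, scale of the fitted minima-GEV
```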
3

Methods of modelling and statistical analysis of an extremal value process

Jelenová, Klára January 2012 (has links)
In the present work we deal with extreme values of time series, especially maxima. We study the times and values of maxima via a point-process approach and model the distribution of extreme values by statistical methods. Distribution parameters are estimated by several methods, including graphical data-analysis methods, and the estimated distributions are then assessed with goodness-of-fit tests. We study the stationary case as well as cases with a trend. In connection with the distribution of excesses and exceedances over a threshold, we work with the generalized Pareto distribution.
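The last point can be sketched as a minimal peaks-over-threshold analysis on simulated data (the 95th-percentile threshold is an illustrative choice): fit a generalized Pareto distribution to the excesses and check it with a goodness-of-fit test.

```python
import numpy as np
from scipy.stats import genpareto, kstest

rng = np.random.default_rng(7)

# Simulated series; exponential data make the excesses exactly GPD with
# shape 0, so the fit should recover a near-zero shape.
x = rng.exponential(scale=2.0, size=5_000)
u = np.quantile(x, 0.95)        # threshold at the 95th percentile
excess = x[x > u] - u           # exceedances over the threshold

shape, loc, scale = genpareto.fit(excess, floc=0.0)
ks = kstest(excess, genpareto(shape, loc=0.0, scale=scale).cdf)
print(shape, scale, ks.pvalue)
```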
4

Extreme Value Theory in Actuarial Sciences

Jamáriková, Zuzana January 2013 (has links)
This thesis focuses on models based on extreme value theory and their practical applications, specifically block maxima models and models based on threshold exceedances. Both methods are described theoretically and complemented with practical calculations on simulated and real data. The block maxima applications address the choice of block size, the suitability of the models for specific data, and the possibilities of extreme-data analysis. The threshold-exceedance applications address the choice of threshold and the suitability of the models. An example is given in which the model is used to calculate the reinsurance premium for extreme claims under non-proportional reinsurance.
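The block maxima method described above can be sketched on simulated data (the 365-day block size and all parameters are illustrative): take per-block maxima, fit a GEV, and read off a design return level.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)

# 40 "years" of simulated daily values; Gumbel data are max-stable, so the
# block maxima are again Gumbel and the fitted shape should be near zero.
daily = rng.gumbel(loc=10.0, scale=2.0, size=365 * 40)
block_max = daily.reshape(40, 365).max(axis=1)

c, loc, scale = genextreme.fit(block_max)
ret_100 = genextreme.isf(1.0 / 100.0, c, loc=loc, scale=scale)  # 100-year level
print(loc, scale, ret_100)
```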
5

Operational risk modelling

Mináriková, Eva January 2013 (has links)
In the present thesis we first introduce the term operational risk and its definition in the Basel II and Solvency II directives, and then the methods these directives set out for calculating capital requirements for operational risk. The second part concentrates on methods for modelling operational loss data. We introduce extreme value theory, which describes approaches to modelling data whose significant values occur infrequently, the typical characteristic of operational risk data. We focus mainly on the threshold-exceedance model, which uses the generalized Pareto distribution to model the distribution of the excesses. The theory and the corresponding modelling are applied to simulated loss data. Finally, we test the ability of the presented methods to model loss-data distributions.
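A hedged sketch of the threshold-exceedance model applied to a high operational-loss quantile. All data are simulated, and the 95% threshold and 99.9% target are illustrative choices, not prescriptions from the directives: the GPD tail model supplies the quantile via P(X > x) = zeta_u * (1 - GPD(x - u)).

```python
import numpy as np
from scipy.stats import genpareto, lognorm

rng = np.random.default_rng(3)

# Simulated operational losses with a heavy-ish lognormal tail.
losses = lognorm.rvs(s=1.0, scale=1_000.0, size=20_000, random_state=rng)
u = np.quantile(losses, 0.95)
excess = losses[losses > u] - u
xi, _, beta = genpareto.fit(excess, floc=0.0)

zeta = excess.size / losses.size   # empirical tail fraction P(X > u)
p = 0.999                          # target quantile
var_p = u + (beta / xi) * (((1 - p) / zeta) ** (-xi) - 1)
print(var_p, np.quantile(losses, p))  # tail-model vs. empirical quantile
```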
6

Construction of the Intensity-Duration-Frequency (IDF) Curves under Climate Change

December 2014 (has links)
Intensity-Duration-Frequency (IDF) curves are among the standard design tools for various engineering applications, such as storm water management systems. The current practice is to use IDF curves based on historical extreme precipitation quantiles. A warming climate, however, might change the extreme precipitation quantiles represented by the IDF curves, emphasizing the need for updating the IDF curves used for the design of urban storm water management systems in different parts of the world, including Canada. This study attempts to construct the future IDF curves for Saskatoon, Canada, under possible climate change scenarios. For this purpose, LARS-WG, a stochastic weather generator, is used to spatially downscale the daily precipitation projected by Global Climate Models (GCMs) from coarse grid resolution to the local point scale. The stochastically downscaled daily precipitation realizations were further disaggregated into ensemble hourly and sub-hourly (as fine as 5-minute) precipitation series, using a disaggregation scheme developed using the K-nearest neighbor (K-NN) technique. This two-stage modeling framework (downscaling to daily, then disaggregating to finer resolutions) is applied to construct the future IDF curves in the city of Saskatoon. The sensitivity of the K-NN disaggregation model to the number of nearest neighbors (i.e. window size) is evaluated during the baseline period (1961-1990). The optimal window size is assigned based on the performance in reproducing the historical IDF curves by the K-NN disaggregation models. Two optimal window sizes are selected for the K-NN hourly and sub-hourly disaggregation models that would be appropriate for the hydrological system of Saskatoon. 
By using the simulated hourly and sub-hourly precipitation series and the Generalized Extreme Value (GEV) distribution, future changes in the IDF curves and associated uncertainties are quantified using a large ensemble of projections obtained for the Canadian and British GCMs (CanESM2 and HadGEM2-ES) based on three Representative Concentration Pathways; RCP2.6, RCP4.5, and RCP8.5 available from CMIP5 – the most recent product of the Intergovernmental Panel on Climate Change (IPCC). The constructed IDF curves are then compared with the ones constructed using another method based on a genetic programming technique. The results show that the sign and the magnitude of future variations in extreme precipitation quantiles are sensitive to the selection of GCMs and/or RCPs, and the variations seem to become intensified towards the end of the 21st century. Generally, the relative change in precipitation intensities with respect to the historical intensities for CMIP5 climate models (e.g., CanESM2: RCP4.5) is less than those for CMIP3 climate models (e.g., CGCM3.1: B1), which may be due to the inclusion of climate policies (i.e., adaptation and mitigation) in CMIP5 climate models. The two-stage downscaling-disaggregation method enables quantification of uncertainty due to natural internal variability of precipitation, various GCMs and RCPs, and downscaling methods. In general, uncertainty in the projections of future extreme precipitation quantiles increases for short durations and for long return periods. The two-stage method adopted in this study and the GP method reconstruct the historical IDF curves quite successfully during the baseline period (1961-1990); this suggests that these methods can be applied to efficiently construct IDF curves at the local scale under future climate scenarios. The most notable precipitation intensification in Saskatoon is projected to occur with shorter storm duration, up to one hour, and longer return periods of more than 25 years.
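The final step from fitted GEV distributions to IDF points can be sketched as follows; the durations, return periods, and simulated annual maxima are illustrative stand-ins for the disaggregated Saskatoon series.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(11)

durations_h = [1, 2, 6, 24]
return_periods = [10, 25, 50, 100]

idf = {}
for d in durations_h:
    # Stand-in annual-maximum intensities (mm/h), decreasing with duration.
    annual_max = rng.gumbel(loc=30.0 / d ** 0.6, scale=6.0 / d ** 0.6, size=50)
    c, loc, scale = genextreme.fit(annual_max)
    # One IDF point per return period: the T-year return level.
    idf[d] = {T: genextreme.isf(1.0 / T, c, loc=loc, scale=scale)
              for T in return_periods}

print(idf[1][100], idf[24][100])  # short durations give higher intensities
```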
7

Stochastic downscaling for climate extremes via spatial interpolation

Carvalho, Daniel Matos de 31 May 2010 (has links)
Conselho Nacional de Desenvolvimento Científico e Tecnológico / Present-day weather forecast models usually cannot provide realistic descriptions of local and particularly extreme weather conditions. However, for lead times of a few days, they provide reliable forecasts of the atmospheric circulation that encompasses the subscale processes leading to extremes. Hence, forecasts of extreme events can only be achieved through a combination of dynamical and statistical analysis methods, where a stable and significant statistical model, based on prior physical reasoning, establishes a statistical-dynamical relation between the local extremes and the large-scale circulation. Here we present the development and application of such a statistical model calibration on the basis of extreme value theory, in order to derive probabilistic forecasts of extreme local temperature. The downscaling is applied to the NCEP/NCAR reanalysis in order to derive estimates of daily temperature at weather stations in the Brazilian northeastern region. / Reanalysis data for air temperature and precipitation from NCEP (National Centers for Environmental Prediction) are refined to produce return levels for extreme events in the capitals of northeastern Brazil: São Luís, Teresina, Fortaleza, Natal, João Pessoa, Recife, Maceió, Aracaju and Salvador. The NCEP grid has a spatial resolution of 2.5° x 2.5°, with historical series available from 1948 to the present; at this resolution the grid covers the region with 72 locations (series). The first step is to fit the generalized extreme value (GEV) and generalized Pareto (GPD) distributions at each grid point. Using the geostatistical method known as kriging, the GEV and GPD parameters are then interpolated spatially, so that return levels for extremes of air temperature and precipitation can be obtained where NCEP provides no relevant information. To validate the results, GEV and GPD models are also fitted to the daily observational temperature and precipitation series of each capital and compared with the results obtained from the spatial interpolation. Finally, quantile regression is used as a more traditional method for comparison.
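The interpolation step can be sketched with inverse-distance weighting as a simple stand-in for kriging (the thesis uses kriging proper; the grid coordinates, station location, and GEV parameters below are synthetic): interpolate the fitted parameters to the station, then compute a return level there.

```python
import numpy as np
from scipy.stats import genextreme

# Hypothetical reanalysis grid points (lat, lon) and fitted GEV parameters
# (xi, location, scale) at each point.
grid_pts = np.array([[-5.0, -35.0], [-5.0, -37.5], [-7.5, -35.0], [-7.5, -37.5]])
gev_params = np.array([
    [-0.10, 31.0, 2.0],
    [-0.05, 30.0, 2.2],
    [-0.12, 32.0, 1.9],
    [-0.08, 30.5, 2.1],
])

station = np.array([-5.8, -35.2])             # illustrative station location
d = np.linalg.norm(grid_pts - station, axis=1)
w = (1.0 / d ** 2) / (1.0 / d ** 2).sum()     # inverse-distance weights, sum to 1
xi, loc, scale = w @ gev_params               # interpolated parameters

# 50-year return level at the station (scipy's shape is c = -xi).
ret_50 = genextreme.isf(1.0 / 50.0, -xi, loc=loc, scale=scale)
print(xi, loc, scale, ret_50)
```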
8

Fitting extreme value distributions to the Zambezi River flood water levels recorded at Katima Mulilo in Namibia (1965-2003)

Kamwi, Innocent Silibelo January 2005 (has links)
Magister Scientiae - MSc / This study sought to identify and fit an appropriate extreme value distribution to flood data using the method of maximum likelihood, to examine the uncertainty of the estimated parameters, and to evaluate the goodness of fit of the identified model. The study revealed that the three-parameter Weibull and the generalised extreme value (GEV) distributions fit the data very well. Standard errors of the estimated parameters were calculated from the empirical information matrix, and an upper limit to the flood levels followed from the fitted distribution.
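The upper-limit point follows directly from a fitted GEV with a bounded upper tail. A sketch on simulated annual maxima (39 values, echoing the 1965-2003 window, but not the Katima Mulilo data):

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(5)

# Simulated annual flood maxima from a GEV with a bounded upper tail
# (scipy's c = -xi, so c > 0 means a finite upper endpoint).
annual_max = genextreme.rvs(c=0.2, loc=5.0, scale=0.5, size=39, random_state=rng)
c, loc, scale = genextreme.fit(annual_max)

if c > 0:
    # Finite upper endpoint of the fitted distribution: loc + scale / c,
    # i.e. loc + scale / |xi| in the usual xi-parameterization.
    upper_limit = loc + scale / c
    assert upper_limit >= annual_max.max()
    print(loc, scale, upper_limit)
```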
9

Modeling Extreme Values

Shykhmanter, Dmytro January 2013 (has links)
Modeling extreme events is a challenging statistical task: there is always a limited number of observations, and consequently little experience with which to back-test the results. One way of estimating high quantiles is to fit a theoretical distribution to the data and extrapolate into the tail; the shortcoming of this approach is that the tail estimate is based on observations in the center of the distribution. An alternative approach is to split the data into two sub-populations and model the body of the distribution separately from the tail. This methodology is applied to non-life insurance losses, where extremes are particularly important for risk management. Nevertheless, even this approach is not a conclusive solution to heavy-tail modeling: in either case, the estimated 99.5% percentiles have such high standard errors that their reliability is very low. On the other hand, the approach is theoretically valid and deserves consideration as one of the possible methods of extreme value analysis.
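The body/tail split can be sketched as follows (simulated lognormal claims and an illustrative 90% split point): the body is handled empirically below the threshold, and a GPD fitted to the excesses supplies the 99.5% quantile via P(X > x) = zeta_u * (1 - GPD(x - u)).

```python
import numpy as np
from scipy.stats import lognorm, genpareto

rng = np.random.default_rng(13)

# Simulated non-life claims with a heavy-ish tail.
claims = lognorm.rvs(s=1.2, scale=5_000.0, size=10_000, random_state=rng)
u = np.quantile(claims, 0.90)        # split point between body and tail
excess = claims[claims > u] - u
xi, _, beta = genpareto.fit(excess, floc=0.0)

# 99.5% quantile from the tail model: solve zeta_u * sf_GPD(q - u) = 0.005.
zeta = excess.size / claims.size
q995 = u + genpareto.ppf(1.0 - 0.005 / zeta, xi, loc=0.0, scale=beta)
print(q995, np.quantile(claims, 0.995))  # tail-model vs. empirical quantile
```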
10

Modelling the Resilience of Offshore Renewable Energy System Using Non-constant Failure Rates

Beyene, Mussie Abraham January 2021 (has links)
Offshore renewable energy systems, such as wave energy converters or offshore wind turbines, must be designed to withstand extremes of the weather environment. For this it is crucial to have a good understanding both of the wave and wind climate at the intended offshore site, and of the system's reaction and possible failures under different weather scenarios. Based on these considerations, the first objective of this thesis was to model and identify the extreme wind speed and significant wave height at an offshore site, based on measured wave and wind data. The extreme wind speeds and wave heights were characterized as return values after 10, 25, 50, and 100 years, using the generalized extreme value method. Based on a literature review, fragility curves for wave and wind energy systems were identified as functions of significant wave height and wind speed. For a wave energy system, a varying failure rate as a function of wave height was obtained from the fragility curves and used to model the resilience of a wave energy farm as a function of the wave climate. The cases of non-constant and constant failure rates were compared, and the non-constant failure rate was found to have a strong impact on the farm's resilience: when a non-constant failure rate as a function of wave height was applied, the number of wave energy converters available in the farm and the energy absorbed by the farm were nearly zero. Comparing the non-constant failure rate with its time-averaged constant counterpart showed that the averaged constant rate yields a considerably more optimistic picture of the farm's resilience. 
Based on these findings, it is recommended to identify and characterize the extreme offshore weather climate, to maintain a high repair rate, and to use repair vessels with a high operational threshold, so that operations can withstand the harsh offshore weather environment.
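The failure-rate comparison can be illustrated with a toy Monte Carlo simulation. The farm size, wave climate, failure-rate curve, and weekly repair schedule are all hypothetical; the sketch only shows the mechanics of the comparison, not the thesis results.

```python
import numpy as np

rng = np.random.default_rng(2)

n_wec, hours = 20, 5_000
# Synthetic significant-wave-height series (m).
hs = np.clip(rng.gamma(shape=2.0, scale=1.2, size=hours), 0.1, None)

def simulate(fail_prob):
    """Fraction of converter-hours available, given per-hour failure probabilities."""
    up = np.ones(n_wec, dtype=bool)
    up_hours = 0
    for t in range(hours):
        if t % 168 == 0:          # weekly repair visit restores all units
            up[:] = True
        up &= rng.random(n_wec) >= fail_prob[t]   # failures this hour
        up_hours += up.sum()
    return up_hours / (n_wec * hours)

p_var = 1e-5 * hs ** 3                   # failure prob rising steeply with Hs
p_const = np.full(hours, p_var.mean())   # time-averaged constant rate

a_var, a_const = simulate(p_var), simulate(p_const)
print(a_var, a_const)
```

With the non-constant rate, failures cluster in storm weeks; the constant rate spreads the same average risk evenly, which is the contrast the abstract describes.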
