  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Fire in the southern U.S.: administrative laws and regulations in the Southeast and wildfire distribution in Mississippi

Tolver, Branden 07 August 2010
Wildfires in the United States present a complex set of problems for private landowners and policy makers. This thesis examines two key issues faced by private and government stakeholders. The first is a lack of knowledge regarding current prescribed fire laws and regulations; it is addressed through a legal review of administrative laws and regulations for prescribed burning in the Southeastern United States, framed in terms of management-based regulation. The review finds that regulation of prescribed burning has shifted toward a management-based regime. The second is an empirical study of wildfire distribution in the state of Mississippi: above a certain threshold, wildfire sizes appear to follow a Pareto distribution throughout the state. Analyzed in conjunction, the two studies could aid lawmakers in projecting the effects of a given policy change on actual wildfire occurrence and distribution.
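As an editorial illustration of the Pareto-tail claim, the sketch below fits a Pareto distribution to fire sizes above a high threshold. The input file, the 90th-percentile threshold choice, and the use of scipy are assumptions, not the thesis's actual analysis.

```python
import numpy as np
from scipy import stats

fire_sizes = np.loadtxt("ms_wildfire_acres.txt")  # hypothetical input file
u = np.quantile(fire_sizes, 0.90)                 # assumed threshold choice
tail = fire_sizes[fire_sizes > u]

# Fit a Pareto to the exceedances; fixing loc=0 and scale=u makes the
# fitted shape b the classical Pareto tail index.
b, loc, scale = stats.pareto.fit(tail, floc=0, fscale=u)
print(f"threshold={u:.1f}, tail index={b:.2f}, n_tail={tail.size}")

# Kolmogorov-Smirnov check of the fitted tail model.
ks = stats.kstest(tail, "pareto", args=(b, loc, scale))
print(f"KS statistic={ks.statistic:.3f}, p-value={ks.pvalue:.3f}")
```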
2

An empirical comparison of extreme value modelling procedures for the estimation of high quantiles

Engberg, Alexander January 2016
The peaks over threshold (POT) method provides an attractive framework for estimating the risk of extreme events such as severe storms or large insurance claims. However, the conventional POT procedure, where the threshold excesses are modelled by a generalized Pareto distribution, suffers from small samples and subjective threshold selection. In recent years, two alternative approaches have been proposed in the form of mixture models that estimate the threshold and a folding procedure that generates larger tail samples. In this paper the empirical performances of the conventional POT procedure, the folding procedure and a mixture model are compared by modelling data sets on fire insurance claims and hurricane damage costs. The results show that the folding procedure gives smaller standard errors of the parameter estimates and in some cases more stable quantile estimates than the conventional POT procedure. The mixture model estimates are dependent on the starting values in the numerical maximum likelihood estimation, and are therefore difficult to compare with those from the other procedures. The conclusion is that none of the procedures is overall better than the others but that there are situations where one method may be preferred.
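For context, the conventional POT procedure that this thesis takes as its baseline can be sketched in a few lines. The claim data file and threshold below are hypothetical, and the quantile formula is the standard POT estimator, valid for xi != 0.

```python
import numpy as np
from scipy import stats

claims = np.loadtxt("fire_claims.txt")   # hypothetical claim severities
u = np.quantile(claims, 0.95)            # the "subjective" threshold choice
excesses = claims[claims > u] - u

# Fit the GPD to excesses; loc is fixed at 0 because excesses start at zero.
xi, _, beta = stats.genpareto.fit(excesses, floc=0)

# Standard POT estimator of the p-quantile (xi != 0):
#   q_p = u + (beta/xi) * ((n/n_u * (1 - p))**(-xi) - 1)
n, n_u, p = claims.size, excesses.size, 0.999
q_p = u + (beta / xi) * ((n / n_u * (1 - p)) ** (-xi) - 1)
print(f"xi={xi:.3f}, beta={beta:.2f}, estimated {p:.1%} quantile={q_p:.2f}")
```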
3

Modelování velkých škod / Large Claims Modeling

Zuzáková, Barbora January 2013
Title: Large claims modeling Author: Barbora Zuzáková Department: Department of Probability and Mathematical Statistics Supervisor: RNDr. Michal Pešta, Ph.D. Abstract: This thesis discusses a statistical modeling approach based on extreme value theory to describe the behaviour of large claims in an insurance portfolio. We focus on threshold models, which analyze exceedances of a high threshold. This approach has gained in popularity in recent years compared with the much older methods based directly on the extreme value distributions. The method is illustrated using the group medical claims database for the periods 1997, 1998 and 1999 maintained by the Society of Actuaries. We aim to demonstrate that the proposed model outperforms classical parametric distributions and thus enables more precise estimation of high quantiles and the probable maximum loss. Keywords: threshold models, generalized Pareto distribution, large claims.
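Threshold choice is the crux of such threshold models, so a minimal sketch of the standard mean excess diagnostic may help: if excesses over u follow a GPD, the mean excess function is linear in u. The input file below is a hypothetical placeholder.

```python
import numpy as np
import matplotlib.pyplot as plt

claims = np.loadtxt("medical_claims.txt")   # hypothetical claim data
thresholds = np.quantile(claims, np.linspace(0.50, 0.99, 50))
mean_excess = [claims[claims > u].mean() - u for u in thresholds]

plt.plot(thresholds, mean_excess, marker=".")
plt.xlabel("threshold u")
plt.ylabel("mean excess e(u)")
plt.title("Mean excess plot: pick u where the plot becomes linear")
plt.show()
```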
4

A distribuição generalizada de Pareto e mistura de distribuições de Gumbel no estudo da vazão e da velocidade máxima do vento em Piracicaba, SP / The generalized Pareto distribution and Gumbel mixture to study flow and maximum wind speed in Piracicaba, SP

Silva, Renato Rodrigues 10 October 2008
Extreme value theory is a branch of probability that describes the asymptotic distribution of order statistics, such as maxima or minima, of a sequence of random variables following a distribution function F that is usually unknown. It also describes the asymptotic distribution of excesses above a threshold among the terms of that sequence. The standard methodologies in this context are therefore the fit of the generalized extreme value distribution to a series of annual maxima, or the fit of the generalized Pareto distribution to a partial duration series containing only the observations that exceed a threshold. However, according to Coles et al. (2003), there is growing dissatisfaction with the performance of these standard models for predicting extreme events, possibly caused by violated assumptions such as independence of the observations, or by the fact that the models are not recommended in certain specific situations, for example when annual maxima arise from two or more independent populations of extreme events, the first describing less frequent events of greater magnitude and the second more frequent events of smaller magnitude. The two articles that make up this work therefore present alternative extreme value analyses for situations in which the standard models are not adequate. In the first, the generalized Pareto and exponential distributions (the latter a particular case of the GP) were fitted, together with a declustering technique, to mean daily flow data from the Artemis station, Piracicaba, SP, Brazil, and the return level estimates for periods of 5, 10, 50 and 100 years were compared. The interval estimates of the return levels obtained by fitting the exponential distribution are more precise than those obtained with the generalized Pareto distribution. The second article presents a methodology for fitting the Gumbel distribution and a mixture of two Gumbel distributions to monthly maximum wind speed data from Piracicaba, SP. The best-fitting distribution was selected by parametric bootstrap hypothesis tests and the AIC and BIC selection criteria. The mixture of two Gumbel distributions fits the maximum wind speed data best for the months of April and May, while the single Gumbel distribution fits best for August and September.
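As a rough illustration of the first article's return-level calculation, the sketch below combines a crude runs declustering with a GPD fit. The data file, threshold, and declustering rule are placeholder assumptions, and the return-level formula is the standard one for xi != 0.

```python
import numpy as np
from scipy import stats

daily_flow = np.loadtxt("artemis_flow.txt")   # hypothetical daily series
u = np.quantile(daily_flow, 0.97)

# Crude runs declustering: keep one peak per run of consecutive exceedances.
peaks, run = [], []
for x in daily_flow:
    if x > u:
        run.append(x)
    elif run:
        peaks.append(max(run))
        run = []
if run:
    peaks.append(max(run))
excesses = np.array(peaks) - u

xi, _, beta = stats.genpareto.fit(excesses, floc=0)
lam = len(peaks) / (daily_flow.size / 365.25)   # mean clusters per year
for m in (5, 10, 50, 100):                      # return periods in years
    z_m = u + (beta / xi) * ((lam * m) ** xi - 1)
    print(f"{m:>3}-year return level: {z_m:.1f}")
```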
5

Robust portfolio optimization with Expected Shortfall / Robust portföljoptimering med ES

Isaksson, Daniel January 2016
This thesis studies robust portfolio optimization with Expected Shortfall, applied to a reference portfolio of Swedish linear assets consisting of stocks and a bond index. Specifically, the classical definition of robust optimization, which focuses on uncertainties in parameters, is extended to also include uncertainty in the log-return distribution. My contribution to the robust optimization community is to study portfolio optimization with Expected Shortfall where log-returns are modeled either by elliptical distributions or by a normal copula with asymmetric marginal distributions. The robust optimization problem is solved with worst-case parameters from box and ellipsoidal uncertainty sets constructed from historical data, and may be used when an investor has a more conservative view of the market than history suggests. With elliptically distributed log-returns, the optimization problem is equivalent to Markowitz mean-variance optimization, connected through the risk aversion coefficient. The results show that the optimal holding vector is almost independent of which elliptical distribution is used to model log-returns, while Expected Shortfall depends strongly on that choice, with fatter distribution tails yielding higher Expected Shortfall. To model the tails of the log-returns asymmetrically, generalized Pareto distributions are used together with a normal copula to capture multivariate dependence. In this case the optimization problem is not equivalent to Markowitz mean-variance optimization, and the advantages of using Expected Shortfall as the risk measure come into play. With the asymmetric log-return model there is a noticeable difference in the optimal holding vector compared to the elliptical model, and the Expected Shortfall increases, which follows from the better-modeled distribution tails. The general conclusion is that portfolio optimization with Expected Shortfall is an important problem, advantageous over Markowitz mean-variance optimization when log-returns are modeled with asymmetric distributions. The major drawback is that it is a simulation-based optimization problem that introduces statistical uncertainty, and if the log-returns are drawn from a copula the simulation involves more steps, which can make the program slower than drawing from an elliptical distribution. Portfolio optimization with Expected Shortfall is therefore appropriate to employ when trades are made on a daily basis.
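To make the simulation-based optimization concrete, the following is a minimal long-only sketch of empirical Expected Shortfall minimization over simulated scenarios. The Student-t scenario generator and the constraint set are assumptions; the thesis's own models and uncertainty sets are considerably richer.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
scenarios = rng.standard_t(df=4, size=(100_000, 5)) * 0.01  # stand-in log-returns
alpha = 0.975                                               # ES confidence level

def expected_shortfall(w):
    losses = -scenarios @ w                 # portfolio loss in each scenario
    var = np.quantile(losses, alpha)        # empirical Value-at-Risk
    return losses[losses >= var].mean()     # average loss beyond VaR

n_assets = scenarios.shape[1]
res = minimize(
    expected_shortfall,
    x0=np.full(n_assets, 1.0 / n_assets),   # start from equal weights
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
    bounds=[(0.0, 1.0)] * n_assets,         # long-only, for the sketch
)
print("optimal weights:", np.round(res.x, 3))
print(f"minimized ES at {alpha:.1%}: {expected_shortfall(res.x):.4f}")
```

In practice this nonsmooth objective is usually reformulated as the linear program of Rockafellar and Uryasev, which behaves better numerically than the direct minimization sketched here.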
6

Applying Peaks-Over-Threshold for Increasing the Speed of Convergence of a Monte Carlo Simulation / Peaks-Over-Threshold tillämpat på en Monte Carlo simulering för ökad konvergenshastighet

Jakobsson, Eric, Åhlgren, Thor January 2022
This thesis investigates applying the semiparametric Peaks-Over-Threshold method to data generated by a Monte Carlo simulation when estimating the financial risk measures Value-at-Risk and Expected Shortfall. The goal is to achieve faster convergence than the Monte Carlo simulation alone when assessing extreme events that represent the worst outcomes of a financial portfolio. Faster convergence allows a reduction in the number of Monte Carlo iterations, giving the portfolio manager a more efficient way of estimating risk measures. The financial portfolio consists of US life insurance policies offered on the secondary market, gathered by our partner RessCapital. The method is evaluated on three portfolios with different defining characteristics. Part I analyzes the selection of an optimal threshold. The accuracy and precision of Peaks-Over-Threshold are compared to a Monte Carlo simulation with 10,000 iterations, using a simulation of 100,000 iterations as the reference value. Depending on the risk measure and the percentile of interest, different optimal thresholds are selected. Part II presents the results with the optimal thresholds from Part I. Peaks-Over-Threshold performed significantly better than the 10,000-iteration Monte Carlo simulation for Value-at-Risk. The results for Expected Shortfall did not show a clear improvement in precision, but did show improvement in accuracy. Value-at-Risk and Expected Shortfall at the 99.5th percentile achieved a greater error reduction than at the 99th, in line with theory: the rarer the event considered, the better the Peaks-Over-Threshold method performed. In conclusion, applying Peaks-Over-Threshold can be useful for reducing the number of iterations, since it does increase the convergence rate of a Monte Carlo simulation. The result is, however, dependent on the rarity of the event of interest and on the level of precision and accuracy required.
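The core idea can be sketched as follows, with a lognormal stand-in for the simulated portfolio losses; the thesis's actual portfolios, thresholds, and levels differ. The VaR and ES expressions are the standard GPD tail formulas, the ES one requiring xi < 1.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
losses = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)  # stand-in losses

u = np.quantile(losses, 0.95)
excesses = losses[losses > u] - u
xi, _, beta = stats.genpareto.fit(excesses, floc=0)

p = 0.995                                   # 99.5th percentile of interest
zeta = excesses.size / losses.size          # empirical P(loss > u)
var_p = u + (beta / xi) * (((1 - p) / zeta) ** (-xi) - 1)
es_p = var_p / (1 - xi) + (beta - xi * u) / (1 - xi)   # valid for xi < 1
print(f"VaR at {p:.1%}: {var_p:.2f}, ES at {p:.1%}: {es_p:.2f}")
```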
7

Bayesian Modeling of Sub-Asymptotic Spatial Extremes

Yadav, Rishikesh 04 1900
In many environmental and climate applications, extreme data are spatial by nature, and hence statistics of spatial extremes is currently an important and active area of research dedicated to developing innovative and flexible statistical models that determine the location, intensity, and magnitude of extreme events. In particular, the development of flexible sub-asymptotic models is a current trend, owing to their flexibility in modeling spatial high-threshold exceedances in higher spatial dimensions with little or no sensitivity to the choice of threshold, something that is difficult to achieve with classical extreme value processes such as Pareto processes. In this thesis, we develop new flexible sub-asymptotic extreme value models for spatial and spatio-temporal extremes, combined with carefully designed gradient-based Markov chain Monte Carlo (MCMC) sampling schemes, which can be exploited to address important scientific questions related to risk assessment in a wide range of environmental applications. The methodological developments are centered on two distinct themes, namely (i) sub-asymptotic Bayesian models for extremes; and (ii) flexible marked point process models with sub-asymptotic marks. In the first part, we develop several types of new flexible models for light-tailed and heavy-tailed data, which extend a hierarchical representation of the classical generalized Pareto (GP) limit for threshold exceedances. Spatial dependence is modeled through latent processes. We study the theoretical properties of our new methodology and demonstrate it by simulation and applications to precipitation extremes in both Germany and Spain. In the second part, we construct new marked point process models, where interest mostly lies in the extremes of the mark distribution. Our proposed joint models exploit intrinsic CAR priors to capture the spatial effects in landslide counts and sizes, while the mark distribution is assumed to take various parametric forms. We demonstrate that having a sub-asymptotic distribution for landslide sizes provides extra flexibility to accurately capture small to large and especially extreme, devastating landslides.
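The thesis pairs its models with carefully designed gradient-based MCMC; as a much simpler, gradient-free analogue, the sketch below runs random-walk Metropolis on (xi, log beta) for simulated GPD excesses under flat priors. Everything here, from the data to the step size, is a toy assumption.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
excesses = stats.genpareto.rvs(c=0.2, scale=1.0, size=500, random_state=rng)

def log_post(xi, log_beta):
    # Flat priors, so the log-posterior is just the GPD log-likelihood.
    ll = stats.genpareto.logpdf(excesses, c=xi, scale=np.exp(log_beta)).sum()
    return ll if np.isfinite(ll) else -np.inf

theta = np.array([0.1, 0.0])                # initial (xi, log beta)
samples, lp = [], log_post(*theta)
for _ in range(5_000):
    prop = theta + rng.normal(scale=0.05, size=2)   # random-walk proposal
    lp_prop = log_post(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:        # Metropolis acceptance
        theta, lp = prop, lp_prop
    samples.append(theta)

xi_draws = np.array(samples)[2_000:, 0]     # discard burn-in
print(f"posterior mean of xi: {xi_draws.mean():.3f}")
```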
8

Velké odchylky a jejich aplikace v pojistné matematice / Large deviations and their applications in insurance mathematics

Fuchsová, Lucia January 2011
Title: Large deviations and their applications in insurance mathematics Author: Lucia Fuchsová Department: Department of Probability and Mathematical Statistics Supervisor: RNDr. Zbyněk Pawlas, Ph.D. Supervisor's e-mail address: Zbynek.Pawlas@mff.cuni.cz Abstract: In the present work we study large deviations theory. We discuss heavy-tailed distributions, which describe the probability of large claim occurrence. We are interested in the use of large deviations theory in insurance. We simulate claim sizes and their arrival times for the Cramér-Lundberg model; first we analyze how the probability of ruin depends on the parameters of the model for Pareto-distributed claim sizes, then we compare the ruin probability for other claim size distributions. For real-life data we model the probability of large claim occurrence by the generalized Pareto distribution.
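A compact sketch of the described ruin simulation follows, with Poisson claim arrivals and Pareto claim sizes; all parameter values (initial capital, premium rate, intensity, tail index, horizon) are hypothetical placeholders.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
u0, c, lam, T = 10.0, 2.0, 1.0, 100.0   # capital, premium rate, intensity, horizon

def ruined(alpha=2.5):
    """Simulate one Cramér-Lundberg path; True if the surplus drops below 0."""
    t, total_claims = 0.0, 0.0
    while True:
        t += rng.exponential(1 / lam)              # next Poisson claim arrival
        if t > T:
            return False
        total_claims += stats.pareto.rvs(alpha, random_state=rng)
        if u0 + c * t < total_claims:              # surplus negative: ruin
            return True

n_sim = 5_000
ruin_prob = sum(ruined() for _ in range(n_sim)) / n_sim
print(f"estimated ruin probability on [0, {T:g}]: {ruin_prob:.3f}")
```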
9

Metody modelování a statistické analýzy procesu extremálních hodnot / Methods of modelling and statistical analysis of an extremal value process

Jelenová, Klára January 2012
In the present work we deal with extremal values of time series, especially maxima. We study the times and values of maxima using a point process approach, and we model the distribution of extremal values by statistical methods. We estimate the parameters of the distribution using different methods, namely graphical methods of data analysis, and subsequently test the estimated distribution with goodness-of-fit tests. We study the stationary case as well as cases with a trend. In connection with the distribution of excesses and exceedances over a threshold, we work with the generalized Pareto distribution.
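As an illustration of the goodness-of-fit step, the sketch below fits a GPD to threshold excesses and compares model quantiles with empirical ones in a QQ plot, followed by a Kolmogorov-Smirnov test; the data file and threshold are placeholders.

```python
import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

series = np.loadtxt("series.txt")       # hypothetical time series
u = np.quantile(series, 0.95)
excesses = np.sort(series[series > u] - u)

xi, _, beta = stats.genpareto.fit(excesses, floc=0)
probs = (np.arange(1, excesses.size + 1) - 0.5) / excesses.size
model_q = stats.genpareto.ppf(probs, c=xi, scale=beta)

plt.scatter(model_q, excesses, s=8)
plt.axline((0, 0), slope=1, color="grey")   # y = x reference line
plt.xlabel("GPD quantiles")
plt.ylabel("empirical excesses")
plt.title("QQ plot for the fitted GPD")
plt.show()

print(stats.kstest(excesses, "genpareto", args=(xi, 0, beta)))
```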
