1

Incorporating geologic information into hydraulic tomography: A general framework based on geostatistical approach

Zha, Yuanyuan, Yeh, Tian-Chyi J., Illman, Walter A., Onoe, Hironori, Mok, Chin Man W., Wen, Jet-Chau, Huang, Shao-Yang, Wang, Wenke 04 1900 (has links)
Hydraulic tomography (HT) has become a mature aquifer test technology over the last two decades. It collects nonredundant information on aquifer heterogeneity by sequentially stressing the aquifer at different wells and collecting aquifer responses at other wells during each stress. The collected information is then interpreted by inverse models. Among these models, the geostatistical approaches, built upon the Bayesian framework, first conceptualize the hydraulic properties to be estimated as random fields, which are characterized by means and covariance functions. They then use the spatial statistics as prior information with the aquifer response data to estimate the spatial distribution of the hydraulic properties at a site. Since the spatial statistics describe the generic spatial structures of the geologic media at the site rather than site-specific ones (e.g., known spatial distributions of facies, faults, or paleochannels), the estimates are often not optimal. To improve the estimates, we introduce a general statistical framework, which allows the inclusion of site-specific spatial patterns of geologic features. Subsequently, we test this approach with synthetic numerical experiments. Results show that this approach, using conditional means and covariances that reflect site-specific large-scale geologic features, indeed improves the HT estimates. Afterward, this approach is applied to HT surveys at a kilometer-scale fractured granite field site with a distinct fault zone. We find that by including fault information from outcrops and boreholes in the HT analysis, the estimated hydraulic properties are improved. The improved estimates subsequently lead to better prediction of flow during a different pumping test at the site.
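The conditional mean and covariance at the heart of such a geostatistical update can be illustrated in the simplest bivariate case. A minimal sketch with entirely hypothetical numbers (not the authors' model or field data), assuming jointly Gaussian log-hydraulic conductivities at two points:

```python
def condition_gaussian(mu1, mu2, s11, s12, s22, obs2):
    """Condition a bivariate Gaussian (property at point 1 on data at point 2).

    Returns the conditional mean and variance of variable 1 given
    variable 2 = obs2 -- the elementary building block of a
    geostatistical (kriging-style) update.
    """
    mean = mu1 + (s12 / s22) * (obs2 - mu2)
    var = s11 - s12 * s12 / s22
    return mean, var

# Hypothetical numbers: prior log-K mean 0 at both points, unit variances,
# covariance 0.8 from an assumed covariance model; an observed fault-zone
# log-K of 2 at point 2 pulls the estimate at point 1 upward.
m, v = condition_gaussian(0.0, 0.0, 1.0, 0.8, 1.0, 2.0)
```

Observing a high value at the conditioning point shifts the mean at the unobserved point upward (here to 1.6) and shrinks its variance (to 0.36); replacing a generic prior with conditional moments that honor mapped geologic features is the idea the abstract describes.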
2

Fire in the southern U.S.: administrative laws and regulations in the Southeast and wildfire distribution in Mississippi

Tolver, Branden 07 August 2010 (has links)
Wildfires in the United States present a complex set of problems for private landowners and policy makers. This thesis examines two key issues faced by private and government stakeholders. The first is a lack of knowledge regarding current prescribed fire laws and regulations; a legal review of administrative laws and regulations for prescribed burning in the Southeastern United States, in the context of management-based regulation, is used to address this issue. It was found that regulation of prescribed burning has shifted to a more management-based regime. The second is an empirical study of wildfire distribution in the state of Mississippi: wildfires appear to follow a Pareto distribution throughout the state above a certain threshold. Analyzed in conjunction, the two studies could aid lawmakers in projecting the effects of a given policy change on actual wildfire occurrences and distribution.
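A Pareto fit above a threshold, as described here, has a closed-form maximum likelihood estimator for the tail index. A minimal sketch on simulated sizes (illustrative only, not the thesis's Mississippi wildfire data):

```python
import math
import random

def pareto_tail_alpha(data, threshold):
    """MLE (Hill-type) of the Pareto tail index from exceedances of a threshold."""
    exceed = [x for x in data if x > threshold]
    if not exceed:
        raise ValueError("no observations above threshold")
    # alpha_hat = n / sum(log(x_i / threshold)) over the n exceedances
    return len(exceed) / sum(math.log(x / threshold) for x in exceed)

# Simulated Pareto(alpha=1.5, scale=10) "fire sizes" via inverse transform:
# if U ~ Uniform(0,1], then scale * U**(-1/alpha) is Pareto distributed.
random.seed(0)
sizes = [10.0 * (1.0 - random.random()) ** (-1.0 / 1.5) for _ in range(50_000)]
alpha_hat = pareto_tail_alpha(sizes, threshold=10.0)  # close to 1.5
```

With 50,000 simulated sizes the estimate recovers the true index to within a few hundredths, which is the kind of fit the thesis's threshold analysis relies on.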
3

Εκτίμηση για την κατανομή Pareto / Estimation for the Pareto distribution

Αγγέλου, Γρηγορία 06 November 2014 (has links)
This master's thesis deals with the study of the Pareto distribution, the estimation and comparison of estimators of its parameters, and the estimation of its survival function, given that the Pareto distribution is used as a model for large incomes. In Chapter 1 we present some basic definitions and theorems of mathematical statistics that are necessary for the development of our work. In Chapter 2 we discuss the Pareto distribution, its general characteristics, and its relation to other well-known distributions. In Chapter 3 we study the estimators of the parameters of the Pareto distribution under squared error loss, making some comparisons among the estimators. In Chapter 4 we study the Bayes estimators of the parameters of the Pareto distribution under the LINEX loss function and compare them with the Bayes estimators under squared error loss. In Chapter 5 we estimate the survival function and study the minimum variance unbiased estimators of the probability density and distribution functions, comparing them with the corresponding maximum likelihood estimators. In Chapter 6 we present an example for a better understanding of our estimates.
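The maximum likelihood estimators that such comparisons start from have closed forms for the two-parameter Pareto. A minimal sketch with illustrative numbers (not the thesis's income data):

```python
import math

def pareto_mle(sample):
    """Closed-form MLEs for the two-parameter Pareto: scale x_m and shape alpha."""
    x_m = min(sample)  # the MLE of the scale is the sample minimum
    alpha = len(sample) / sum(math.log(x / x_m) for x in sample)
    return x_m, alpha

def survival(x, x_m, alpha):
    """Pareto survival function S(x) = (x_m / x)**alpha for x >= x_m."""
    return (x_m / x) ** alpha

x_m, alpha = pareto_mle([2.0, 3.0, 2.5, 8.0, 4.0, 2.2, 6.0, 3.5])
```

Plugging the estimates into S(x) gives the plug-in estimator of the survival function, the natural point of comparison for the unbiased estimators the thesis studies.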
4

On the Invariance of Size Distribution of Establishments

Kamanina, Polina January 2012 (has links)
The thesis examines the establishment size distribution over time and across groups of regions, using data on Swedish establishments during the period 1994-2009. The size distribution of establishments is highly skewed and approximates the Pareto distribution. The shape of the size distribution is invariant over time and across groups of regions. The distribution of the total number of establishments and the incumbent distribution are found to arise from the same distribution. Moreover, the invariance of the establishment size distribution is largely determined by the invariance of the distributions of incumbents, entries, and exits. Larger establishments have a higher chance of survival and a higher probability of remaining in their current size group compared to smaller ones, whereas smaller establishments have higher probabilities of growth.
5

Parameter Estimation for Generalized Pareto Distribution

Lin, Der-Chen 01 May 1988 (has links)
The generalized Pareto distribution was introduced by Pickands (1975). Three methods of estimating the parameters of the generalized Pareto distribution were compared by Hosking and Wallis (1987): maximum likelihood, the method of moments, and probability-weighted moments. An alternative method of estimation for the generalized Pareto distribution, based on least squares regression of expected order statistics (REOS), is developed and evaluated in this thesis. A Monte Carlo comparison is made between this method and the estimating methods considered by Hosking and Wallis (1987). The REOS method is shown to be generally superior to maximum likelihood, the method of moments, and probability-weighted moments.
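Of the methods compared, the method of moments has a particularly simple closed form. A minimal sketch on simulated GPD data (a toy illustration, not the thesis's REOS method or its Monte Carlo design):

```python
import random
import statistics

def gpd_method_of_moments(sample):
    """Method-of-moments estimates (shape xi, scale sigma) for the GPD.

    Uses mean = sigma/(1-xi) and var = sigma**2 / ((1-xi)**2 * (1-2*xi)),
    valid for xi < 1/2.
    """
    m = statistics.fmean(sample)
    v = statistics.variance(sample)
    xi = 0.5 * (1.0 - m * m / v)
    sigma = m * (1.0 - xi)
    return xi, sigma

# Simulated GPD(xi=0.1, sigma=1) draws via the inverse transform
# x = sigma * ((1-u)**(-xi) - 1) / xi
random.seed(1)
xi0, sigma0 = 0.1, 1.0
sample = [sigma0 * ((1.0 - random.random()) ** (-xi0) - 1.0) / xi0
          for _ in range(100_000)]
xi_hat, sigma_hat = gpd_method_of_moments(sample)
```

With a large sample and a moderate shape parameter the moment estimates land close to the true values; the thesis's point is how the competing estimators behave away from this easy regime.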
6

A comprehensive analysis of extreme rainfall

Kagoda, Paulo Abuneeri 13 August 2008 (has links)
No description available.
7

An empirical comparison of extreme value modelling procedures for the estimation of high quantiles

Engberg, Alexander January 2016 (has links)
The peaks over threshold (POT) method provides an attractive framework for estimating the risk of extreme events such as severe storms or large insurance claims. However, the conventional POT procedure, where the threshold excesses are modelled by a generalized Pareto distribution, suffers from small samples and subjective threshold selection. In recent years, two alternative approaches have been proposed in the form of mixture models that estimate the threshold and a folding procedure that generates larger tail samples. In this paper the empirical performances of the conventional POT procedure, the folding procedure and a mixture model are compared by modelling data sets on fire insurance claims and hurricane damage costs. The results show that the folding procedure gives smaller standard errors of the parameter estimates and in some cases more stable quantile estimates than the conventional POT procedure. The mixture model estimates are dependent on the starting values in the numerical maximum likelihood estimation, and are therefore difficult to compare with those from the other procedures. The conclusion is that none of the procedures is overall better than the others but that there are situations where one method may be preferred.
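The conventional POT procedure compared here turns a fitted GPD into high-quantile estimates through a standard tail formula. A minimal sketch with hypothetical fitted parameters (not estimates from the paper's insurance or hurricane data):

```python
import math

def pot_quantile(p, threshold, xi, sigma, n, n_exceed):
    """High quantile x_p from a POT fit, i.e. P(X <= x_p) = p.

    Standard GPD tail formula:
      x_p = u + (sigma/xi) * ((zeta_u / (1-p))**xi - 1),  zeta_u = n_exceed / n,
    with the xi -> 0 (exponential) limit handled separately.
    """
    zeta_u = n_exceed / n
    if abs(xi) < 1e-9:
        return threshold + sigma * math.log(zeta_u / (1.0 - p))
    return threshold + (sigma / xi) * ((zeta_u / (1.0 - p)) ** xi - 1.0)

# e.g. the 99.9% quantile from a hypothetical fit: u=10, xi=0.2, sigma=2,
# with 500 of 10,000 observations exceeding the threshold
q = pot_quantile(0.999, threshold=10.0, xi=0.2, sigma=2.0, n=10_000, n_exceed=500)
```

The quantile depends on the fitted shape and scale as well as the exceedance rate, which is why standard errors of the parameter estimates (the folding procedure's advantage in this study) feed directly into quantile uncertainty.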
8

Modelování velkých škod / Large claims modeling

Zuzáková, Barbora January 2013 (has links)
Title: Large claims modeling Author: Barbora Zuzáková Department: Department of Probability and Mathematical Statistics Supervisor: RNDr. Michal Pešta, Ph.D. Abstract: This thesis discusses a statistical modeling approach based on extreme value theory to describe the behaviour of large claims in an insurance portfolio. We focus on threshold models, which analyze exceedances of a high threshold. This approach has gained in popularity in recent years, compared with the much older methods based directly on the extreme value distributions. The method is illustrated using the group medical claims database for the periods 1997, 1998, and 1999 maintained by the Society of Actuaries. We aim to demonstrate that the proposed model outperforms classical parametric distributions and thus makes it possible to estimate high quantiles or the probable maximum loss more precisely. Keywords: threshold models, generalized Pareto distribution, large claims.
9

Statistical analysis of pyrosequence data

Keating, Karen January 1900 (has links)
Doctor of Philosophy / Department of Statistics / Gary L. Gadbury / Since their commercial introduction in 2005, DNA sequencing technologies have become widely available and are now cost-effective tools for determining the genetic characteristics of organisms. While the biomedical applications of DNA sequencing are apparent, these technologies have been applied to many other research areas. One such area is community ecology, in which DNA sequence data are used to identify the presence and abundance of microscopic organisms that inhabit an environment. This is currently an active area of research, since it is generally believed that a change in the composition of microscopic species in a geographic area may signal a change in the overall health of the environment. An overview of DNA pyrosequencing, as implemented by the Roche/Life Science 454 platform, is presented and aspects of the process that can introduce variability in data are identified. Four ecological data sets that were generated by the 454 platform are used for illustration. Characteristics of these data include high dimensionality, a large proportion of zeros (usually in excess of 90%), and nonzero values that are strongly right-skewed. A nonparametric method to standardize these data is presented and effects of standardization on outliers and skewness are examined. Traditional statistical methods for analyzing macroscopic species abundance data are discussed, and the applicability of these methods to microscopic species data is examined. One objective that receives focus is the classification of microscopic species as either rare or common species. This is an important distinction since there is much evidence to suggest that the biological and environmental mechanisms that govern common species are distinctly different than the mechanisms that govern rare species. 
This indicates that the abundance patterns for common and rare species may follow different probability models, and the suitability of the Pareto distribution for rare species is examined. Techniques for classifying macroscopic species are shown to be ill-suited for microscopic species, and an alternative technique is presented. Recognizing that the structure of the data is similar to that of financial applications (such as insurance claims and the distribution of wealth), the Gini index and other statistics based on the Lorenz curve are explored as potential test statistics for distinguishing rare versus common species.
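The Gini index explored here as a test statistic has a simple form once abundances are sorted. A minimal sketch on a made-up community (not the 454 data sets analyzed in the thesis):

```python
def gini_index(abundances):
    """Gini index from a vector of species abundances (many zeros allowed).

    Computed from the Lorenz curve as
      G = 2 * sum_i i*x_(i) / (n * sum x) - (n+1)/n
    with x_(1) <= ... <= x_(n).
    """
    xs = sorted(abundances)
    n = len(xs)
    total = sum(xs)
    if total == 0:
        return 0.0
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return 2.0 * weighted / (n * total) - (n + 1) / n

# A highly uneven community: one dominant species, many rare or absent ones
g = gini_index([0, 0, 0, 0, 0, 1, 1, 2, 6, 90])
```

A perfectly even community gives G = 0, while a community dominated by one species pushes G toward 1 (here 0.866), which is what makes Lorenz-curve statistics candidates for separating rare from common species.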
10

A distribuição generalizada de Pareto e mistura de distribuições de Gumbel no estudo da vazão e da velocidade máxima do vento em Piracicaba, SP / The generalized Pareto distribution and Gumbel mixture to study flow and maximum wind speed in Piracicaba, SP

Silva, Renato Rodrigues 10 October 2008 (has links)
Extreme value theory is a branch of probability that describes the asymptotic distribution of order statistics, such as maxima or minima, of a sequence of random variables following a distribution function F that is usually unknown. It also describes the asymptotic distribution of exceedances over a threshold by one or more terms of that sequence. The standard methodologies in this context are therefore the fitting of the generalized extreme value distribution to a series of annual maxima, or the fitting of the generalized Pareto distribution to a series consisting only of observations exceeding a threshold. However, according to Coles et al. (2003), there is growing dissatisfaction with the performance of these standard models for the prediction of extreme events, possibly caused by violated assumptions, such as the independence of the observations, or by the fact that the models are not recommended in certain specific situations, for example when annual maxima arise from two or more independent populations of extreme events, the first describing less frequent events of greater magnitude and the second describing more frequent events of smaller magnitude. The two articles that compose this work therefore aim to present alternatives for extreme value analysis in situations where the standard models are not adequate.
In the first article, the generalized Pareto distribution and the exponential distribution, a particular case of the GP, were fitted, together with a declustering technique, to the mean daily flow data of the Artemis station, Piracicaba, SP, Brazil, and the estimated return levels for periods of 5, 10, 50, and 100 years were compared. We conclude that the interval estimates of the return levels obtained by fitting the exponential distribution are more precise than those obtained by fitting the generalized Pareto distribution. In the second article, a methodology is presented for fitting the Gumbel distribution and a mixture of two Gumbel distributions to monthly maximum wind speed data from Piracicaba, SP. The distribution that best fits the data was selected by means of parametric bootstrap hypothesis tests and the AIC and BIC selection criteria. We conclude that the mixture of two Gumbel distributions best fits the maximum wind speed data for the months of April and May, whereas the Gumbel distribution gives the best fit for August and September.
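The model comparison in the second article — a single Gumbel versus a two-component Gumbel mixture, judged by an information criterion — can be sketched with fixed, hypothetical parameters (toy data below; the thesis estimates the parameters and also uses bootstrap tests and BIC):

```python
import math

def gumbel_pdf(x, mu, beta):
    """Gumbel (maximum) density with location mu and scale beta."""
    z = (x - mu) / beta
    return math.exp(-z - math.exp(-z)) / beta

def mixture_loglik(data, w, mu1, beta1, mu2, beta2):
    """Log-likelihood of a two-component Gumbel mixture with weight w."""
    return sum(math.log(w * gumbel_pdf(x, mu1, beta1)
                        + (1.0 - w) * gumbel_pdf(x, mu2, beta2))
               for x in data)

def aic(loglik, n_params):
    return 2.0 * n_params - 2.0 * loglik

# Compare a single Gumbel (2 params) with a mixture (5 params) on bimodal toy data
data = [3.1, 3.4, 2.8, 7.9, 8.3, 3.0, 8.1, 2.9]
ll_mix = mixture_loglik(data, 0.5, 3.0, 0.3, 8.0, 0.3)
ll_one = mixture_loglik(data, 1.0, 4.5, 2.0, 0.0, 1.0)  # w=1 collapses to one Gumbel
aic_mix, aic_one = aic(ll_mix, 5), aic(ll_one, 2)
```

On bimodal data the mixture's likelihood gain outweighs its three extra parameters, so its AIC is lower, mirroring the conclusion for the April and May wind speeds.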
