About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
1

On some issues in the modelling of extreme observations

Wong, Siu-tung, 王兆東. January 2009 (has links)
Published or final version / Statistics and Actuarial Science / Doctoral / Doctor of Philosophy
2

Actuarial modelling of extremal events using transformed generalized extreme value distributions and generalized pareto distributions

Han, Zhongxian, January 2003 (has links)
Thesis (Ph. D.)--Ohio State University, 2003. / Title from first page of PDF file. Document formatted into pages; contains x, 81 p.; also includes graphics (some col.). Includes abstract and vita. Advisor: Bostwick Wyman, Dept. of Mathematics. Includes bibliographical references (p. 80-81).
3

Extreme value index estimation with applications to modeling extreme insurance losses and sea surface temperatures /

Henry, John B. January 1900 (has links)
Thesis (Ph. D.)--Oregon State University, 2009. / Printout. Includes bibliographical references (leaves 80-84). Also available on the World Wide Web.
4

On some issues in the modelling of extreme observations

Wong, Siu-tung. January 2009 (has links)
Thesis (Ph. D.)--University of Hong Kong, 2009. / Includes bibliographical references (leaves 60-169). Also available in print.
5

On optimal allocation problem in multi-group extreme value regression under censoring.

January 2006 (has links)
Ka Cheuk Yin Timothy. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2006. / Includes bibliographical references (leaves 52-54). / Abstracts in English and Chinese. / Contents:
Chapter 1  Introduction -- p.1
  1.1  Stress Test -- p.1
  1.2  Extreme Value Regression -- p.2
  1.3  Type II Censoring -- p.4
  1.4  Test Plan -- p.5
  1.5  The Scope of the Thesis -- p.6
Chapter 2  Extreme Value Regression Model -- p.7
  2.1  Introduction -- p.7
  2.2  Maximum Likelihood Estimation -- p.8
  2.3  Variance-Covariance Matrix -- p.9
Chapter 3  Optimality Criteria and Allocation Methods -- p.15
  3.1  Introduction -- p.15
  3.2  Optimality Criteria -- p.16
  3.3  Allocation Methods -- p.17
Chapter 4  Asymptotic Results -- p.21
  4.1  Introduction -- p.21
  4.2  Asymptotic Variance-Covariance Matrix -- p.22
  4.3  Optimality Criteria -- p.29
Chapter 5  Optimal Allocations -- p.32
  5.1  Introduction -- p.32
  5.2  Allocation for small sample size -- p.33
    5.2.1  2-stress-level case -- p.33
    5.2.2  4-stress-level case -- p.34
    5.2.3  Suggested Optimal Allocation -- p.39
    5.2.4  Comparison with the complete sample case -- p.43
  5.3  Asymptotic Allocations -- p.44
Chapter 6  Conclusions and Further Research -- p.50
Bibliography -- p.52
6

The GARCH-EVT-Copula model and simulation in scenario-based asset allocation

McEwan, Peter Gareth Fredric January 2016 (has links)
Financial market integration, in particular, portfolio allocations from advanced economies to South African markets, continues to strengthen volatility linkages and quicken volatility transmissions between participating markets. Largely as a result, South African portfolios are net recipients of returns and volatility shocks emanating from major world markets. In light of these, and other, sources of risk, this dissertation proposes a methodology to improve risk management systems in funds by building a contemporary asset allocation framework that offers practitioners an opportunity to explicitly model combinations of hypothesised global risks and the effects on their investments. The framework models portfolio return variables and their key risk driver variables separately and then joins them to model their combined dependence structure. The separate modelling of univariate and multivariate (MV) components admits the benefit of capturing the data generating processes with improved accuracy. Univariate variables were modelled using ARMA-GARCH-family structures paired with a variety of skewed and leptokurtic conditional distributions. Model residuals were fit using the Peaks-over-Threshold method from Extreme Value Theory for the tails and a non-parametric, kernel density for the interior, forming a completed semi-parametric distribution (SPD) for each variable. Asset and risk factor returns were then combined and their dependence structure jointly modelled with a MV Student t copula. Finally, the SPD margins and Student t copula were used to construct a MV meta t distribution. Monte Carlo simulations were generated from the fitted MV meta t distribution on which an out-of-sample test was conducted. The 2014-to-2015 horizon served to proxy as an out-of-sample, forward-looking scenario for a set of key risk factors against which a hypothetical, diversified portfolio was optimised. Traditional mean-variance and contemporary mean-CVaR optimisation techniques were used and their results compared. As an addendum, performance over the in-sample 2008 financial crisis was reported.
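The pipeline this abstract describes (GARCH filtering, peaks-over-threshold tails with a kernel interior, and a Student t copula) can be sketched roughly as follows. This is a minimal illustrative sketch, not code from the dissertation: the `arch` and `scipy` packages, the 10% tail fraction, the fixed copula degrees of freedom, and the reduction to two assets are all assumptions.

```python
# Rough two-asset sketch of a GARCH-EVT-copula simulation (illustrative only).
import numpy as np
from arch import arch_model
from scipy import stats

def garch_filter(returns):
    """AR(1)-GARCH(1,1) filter with Student t errors; returns standardized residuals."""
    res = arch_model(returns, mean="AR", lags=1, vol="GARCH",
                     p=1, q=1, dist="t").fit(disp="off")
    z = np.asarray(res.resid / res.conditional_volatility)
    return z[~np.isnan(z)]

def semiparametric_pit(z, tail=0.10):
    """Probability-integral transform: kernel CDF in the interior,
    generalized Pareto tails beyond the empirical tail quantiles.
    (The pieces match only approximately at the thresholds in this sketch.)"""
    lo, hi = np.quantile(z, [tail, 1.0 - tail])
    kde = stats.gaussian_kde(z)
    gpd_hi = stats.genpareto.fit(z[z > hi] - hi, floc=0.0)   # upper exceedances
    gpd_lo = stats.genpareto.fit(lo - z[z < lo], floc=0.0)   # lower exceedances, reflected

    def cdf(x):
        if x > hi:
            return 1.0 - tail * stats.genpareto.sf(x - hi, *gpd_hi)
        if x < lo:
            return tail * stats.genpareto.sf(lo - x, *gpd_lo)
        return kde.integrate_box_1d(-np.inf, x)
    return np.vectorize(cdf)

def fit_t_copula_and_simulate(r1, r2, df=5, n_sims=10_000):
    """Fit a Student t copula to two filtered return series and simulate pseudo-uniforms."""
    z1, z2 = garch_filter(r1), garch_filter(r2)
    u = np.column_stack([semiparametric_pit(z1)(z1), semiparametric_pit(z2)(z2)])
    t_scores = stats.t.ppf(np.clip(u, 1e-6, 1 - 1e-6), df)
    corr = np.corrcoef(t_scores, rowvar=False)               # copula shape matrix
    sims = stats.multivariate_t(loc=[0, 0], shape=corr, df=df).rvs(n_sims)
    return stats.t.cdf(sims, df)  # map back through the SPD margin inverses for returns
```

In a fuller treatment the simulated pseudo-uniforms would be pushed back through the inverse semi-parametric margins and the GARCH recursion to obtain scenario returns for the mean-variance or mean-CVaR optimisation step.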
7

Incorporating geologic information into hydraulic tomography: A general framework based on geostatistical approach

Zha, Yuanyuan, Yeh, Tian-Chyi J., Illman, Walter A., Onoe, Hironori, Mok, Chin Man W., Wen, Jet-Chau, Huang, Shao-Yang, Wang, Wenke 04 1900 (has links)
Hydraulic tomography (HT) has become a mature aquifer test technology over the last two decades. It collects nonredundant information of aquifer heterogeneity by sequentially stressing the aquifer at different wells and collecting aquifer responses at other wells during each stress. The collected information is then interpreted by inverse models. Among these models, the geostatistical approaches, built upon the Bayesian framework, first conceptualize hydraulic properties to be estimated as random fields, which are characterized by means and covariance functions. They then use the spatial statistics as prior information with the aquifer response data to estimate the spatial distribution of the hydraulic properties at a site. Since the spatial statistics describe the generic spatial structures of the geologic media at the site rather than site-specific ones (e.g., known spatial distributions of facies, faults, or paleochannels), the estimates are often not optimal. To improve the estimates, we introduce a general statistical framework, which allows the inclusion of site-specific spatial patterns of geologic features. Subsequently, we test this approach with synthetic numerical experiments. Results show that this approach, using conditional mean and covariance that reflect site-specific large-scale geologic features, indeed improves the HT estimates. Afterward, this approach is applied to HT surveys at a kilometer-scale fractured granite field site with a distinct fault zone. We find that by including fault information from outcrops and boreholes for HT analysis, the estimated hydraulic properties are improved. The improved estimates subsequently lead to better prediction of flow during a different pumping test at the site.
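The core geostatistical idea in this abstract, a random field with a prior mean and covariance that is updated by data, and a site-specific (e.g., fault-aware) prior mean, can be illustrated with a one-dimensional toy sketch. This is not the authors' HT inverse code; the exponential covariance, the fault location, and all numerical values are assumptions made for illustration.

```python
# Toy 1-D geostatistical conditioning of a log-conductivity field on point data.
import numpy as np

def exp_cov(x1, x2, variance=1.0, corr_len=50.0):
    """Exponential covariance between two sets of 1-d locations (metres)."""
    d = np.abs(x1[:, None] - x2[None, :])
    return variance * np.exp(-d / corr_len)

def condition_field(x_grid, prior_mean, x_obs, y_obs, noise_var=1e-4):
    """Gaussian update of the prior field at x_grid given observations (x_obs, y_obs)."""
    K_oo = exp_cov(x_obs, x_obs) + noise_var * np.eye(len(x_obs))
    K_go = exp_cov(x_grid, x_obs)
    w = np.linalg.solve(K_oo, y_obs - prior_mean(x_obs))
    post_mean = prior_mean(x_grid) + K_go @ w
    post_cov = exp_cov(x_grid, x_grid) - K_go @ np.linalg.solve(K_oo, K_go.T)
    return post_mean, post_cov

# Hypothetical site-specific prior: higher log-K inside an assumed fault zone (x > 600 m).
x = np.linspace(0.0, 1000.0, 201)
prior = lambda s: np.where(s > 600.0, -5.0, -7.0)
mean, cov = condition_field(x, prior,
                            np.array([100.0, 500.0, 800.0]),
                            np.array([-6.8, -7.2, -4.6]))
```

Replacing the uniform prior mean with one encoding mapped faults or facies is the kind of site-specific conditioning the abstract argues improves the estimates.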
8

Statistical analysis of type-II progressively hybrid censored samples and adaptive type-II progressively hybrid censored samples from extreme value distribution.

January 2009 (has links)
Mak, Man Yung. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2009. / Includes bibliographical references (leaves 115-117). / Abstracts in English and Chinese. / Contents:
Chapter 1  Introduction -- p.1
  1.1  Introduction -- p.1
  1.2  Conventional Censoring Schemes -- p.2
  1.3  Type-II Progressively Hybrid Censoring Scheme -- p.4
  1.4  Adaptive Type-II Progressively Hybrid Censoring Scheme -- p.6
  1.5  Extreme Value Distribution -- p.8
  1.6  The Scope of the Thesis -- p.11
Chapter 2  Estimation Methods -- p.12
  2.1  Introduction -- p.12
  2.2  Maximum Likelihood Estimators -- p.13
    2.2.1  Type-II Progressively Hybrid Censoring Scheme -- p.13
    2.2.2  Adaptive Type-II Progressively Hybrid Censoring Scheme -- p.15
  2.3  Approximate Maximum Likelihood Estimators -- p.18
    2.3.1  Type-II Progressively Hybrid Censoring Scheme -- p.18
    2.3.2  Adaptive Type-II Progressively Hybrid Censoring Scheme -- p.20
  2.4  Monte Carlo Simulation and Result -- p.23
    2.4.1  Numerical Comparisons -- p.33
Chapter 3  Construction of Confidence Intervals -- p.35
  3.1  Introduction -- p.35
  3.2  Asymptotic Confidence Interval -- p.36
    3.2.1  Type-II Progressively Hybrid Censoring Scheme -- p.37
    3.2.2  Adaptive Type-II Progressively Hybrid Censoring Scheme -- p.39
  3.3  Parametric Percentile Bootstrap Confidence Interval -- p.56
    3.3.1  Parametric Percentile Bootstrap Confidence Interval based on Maximum Likelihood Estimation method -- p.57
    3.3.2  Parametric Percentile Bootstrap Confidence Interval based on Approximate Maximum Likelihood Estimation method -- p.65
  3.4  Parametric Bootstrap-t Confidence Interval -- p.71
    3.4.1  Parametric Bootstrap-t Confidence Interval based on Maximum Likelihood Estimation method -- p.72
    3.4.2  Parametric Bootstrap-t Confidence Interval based on Approximate Maximum Likelihood Estimation method -- p.79
  3.5  Numerical Comparisons -- p.86
Chapter 4  Expected Total Test Time -- p.88
  4.1  Introduction -- p.88
  4.2  Type-II Progressively Hybrid Censoring Scheme -- p.89
  4.3  Adaptive Type-II Progressively Hybrid Censoring Scheme -- p.92
  4.4  Numerical Comparisons -- p.99
Chapter 5  Optimality Criteria and Censoring Schemes -- p.100
  5.1  Introduction -- p.100
  5.2  Optimality Criteria -- p.101
  5.3  Expected Fisher Information Matrix -- p.102
    5.3.1  Type-II Progressively Hybrid Censoring Scheme -- p.103
  5.4  Optimal Censoring Scheme for Progressively Hybrid Censoring -- p.106
Chapter 6  Conclusions and Further Research -- p.113
Bibliography -- p.115
9

Thresholds for peak-over-threshold theory

Amankonah, Frank O. January 2005 (has links)
Thesis (M.S.)--University of Nevada, Reno, 2005. / "August, 2005." Includes bibliographical references (leaf 43). Online version available on the World Wide Web.
10

Copula Based Hierarchical Bayesian Models

Ghosh, Souparno 2009 August 1900 (has links)
The main objective of our study is to employ copula methodology to develop Bayesian hierarchical models to study the dependencies exhibited by temporal, spatial and spatio-temporal processes. We develop hierarchical models for both discrete and continuous outcomes. In doing so we expect to address the dearth of copula-based Bayesian hierarchical models to study hydro-meteorological events and other physical processes yielding discrete responses. First, we present Bayesian methods of analysis for longitudinal binary outcomes using Generalized Linear Mixed Models (GLMM). We allow flexible marginal association among the repeated outcomes from different time-points. A unique property of this copula-based GLMM is that if the marginal link function is integrated over the distribution of the random effects, its form remains the same as that of the conditional link function. This unique property enables us to retain the physical interpretation of the fixed effects under the conditional and marginal models and yields a proper posterior distribution. We illustrate the performance of the posited model using real-life AIDS data and demonstrate its superiority over the traditional Gaussian random effects model. We develop a semiparametric extension of our GLMM and re-analyze the data from the AIDS study. Next, we propose a general class of models to handle non-Gaussian spatial data. The proposed model can deal with geostatistical data exhibiting skewness, tail-heaviness, and multimodality. We fix the distribution of the marginal processes and induce dependence via copulas. We illustrate the superior predictive performance of our approach in modeling precipitation data as compared to other kriging variants. Thereafter, we employ mixture kernels as the copula function to accommodate non-stationary data. We demonstrate the adequacy of this non-stationary model by analyzing permeability data. In both cases we perform extensive simulation studies to investigate the performance of the posited models under misspecification. Finally, we take up the important problem of modeling multivariate extreme values with copulas. We describe, in detail, how dependence can be induced in the block maxima and peaks-over-threshold approaches by an extreme value copula. We prove the ability of the posited model to handle both strong and weak extremal dependence and derive the conditions for posterior propriety. We analyze extreme precipitation events in the continental United States for the past 98 years and come up with a suite of predictive maps.
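The construction "fix the marginal distributions and induce dependence via a copula" can be sketched in a few lines. This is an illustrative simplification of the general idea, not the dissertation's models: a Gaussian copula stands in for the extreme value copulas discussed above, and the GEV margins, correlation, and parameter values are assumptions.

```python
# Sketch: dependent block maxima from a copula with fixed GEV margins (illustrative).
import numpy as np
from scipy import stats

def sample_copula_gev(n, rho=0.6, shape=0.1, loc=0.0, scale=1.0, seed=0):
    rng = np.random.default_rng(seed)
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)  # copula layer
    u = stats.norm.cdf(z)                                          # pseudo-uniform pairs
    # Fixed GEV margins; scipy's genextreme uses c = -shape in the usual convention.
    return stats.genextreme.ppf(u, c=-shape, loc=loc, scale=scale)

maxima = sample_copula_gev(5000)   # 5000 dependent pairs of simulated block maxima
```

A hierarchical Bayesian version would place priors on the margin and copula parameters and, as the abstract notes, use a genuine extreme value copula so that the tail dependence is not forced to vanish as it does under the Gaussian copula.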
