1 |
On some issues in the modelling of extreme observations / Wong, Siu-tung (王兆東). January 2009 (has links)
published_or_final_version / Statistics and Actuarial Science / Doctoral / Doctor of Philosophy
|
2 |
Actuarial modelling of extremal events using transformed generalized extreme value distributions and generalized Pareto distributions / Han, Zhongxian. January 2003 (has links)
Thesis (Ph. D.)--Ohio State University, 2003. / Title from first page of PDF file. Document formatted into pages; contains x, 81 p.; also includes graphics (some col.). Includes abstract and vita. Advisor: Bostwick Wyman, Dept. of Mathematics. Includes bibliographical references (p. 80-81).
|
3 |
Extreme value index estimation with applications to modeling extreme insurance losses and sea surface temperatures / Henry, John B. January 1900 (has links)
Thesis (Ph. D.)--Oregon State University, 2009. / Printout. Includes bibliographical references (leaves 80-84). Also available on the World Wide Web.
|
4 |
On some issues in the modelling of extreme observations / Wong, Siu-tung. January 2009 (has links)
Thesis (Ph. D.)--University of Hong Kong, 2009. / Includes bibliographical references (leaves 60-169). Also available in print.
|
5 |
On optimal allocation problem in multi-group extreme value regression under censoring. January 2006 (has links)
Ka Cheuk Yin Timothy. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2006. / Includes bibliographical references (leaves 52-54). / Abstracts in English and Chinese.
Contents:
Chapter 1: Introduction (p.1)
  1.1 Stress Test (p.1)
  1.2 Extreme Value Regression (p.2)
  1.3 Type II Censoring (p.4)
  1.4 Test Plan (p.5)
  1.5 The Scope of the Thesis (p.6)
Chapter 2: Extreme Value Regression Model (p.7)
  2.1 Introduction (p.7)
  2.2 Maximum Likelihood Estimation (p.8)
  2.3 Variance-Covariance Matrix (p.9)
Chapter 3: Optimality Criteria and Allocation Methods (p.15)
  3.1 Introduction (p.15)
  3.2 Optimality Criteria (p.16)
  3.3 Allocation Methods (p.17)
Chapter 4: Asymptotic Results (p.21)
  4.1 Introduction (p.21)
  4.2 Asymptotic Variance-Covariance Matrix (p.22)
  4.3 Optimality Criteria (p.29)
Chapter 5: Optimal Allocations (p.32)
  5.1 Introduction (p.32)
  5.2 Allocation for small sample size (p.33)
    5.2.1 2-stress-level case (p.33)
    5.2.2 4-stress-level case (p.34)
    5.2.3 Suggested Optimal Allocation (p.39)
    5.2.4 Comparison with the complete sample case (p.43)
  5.3 Asymptotic Allocations (p.44)
Chapter 6: Conclusions and Further Research (p.50)
Bibliography (p.52)
|
6 |
The GARCH-EVT-Copula model and simulation in scenario-based asset allocation / McEwan, Peter Gareth Fredric. January 2016 (has links)
Financial market integration, in particular portfolio allocations from advanced economies to South African markets, continues to strengthen volatility linkages and quicken volatility transmissions between participating markets. Largely as a result, South African portfolios are net recipients of returns and volatility shocks emanating from major world markets. In light of these and other sources of risk, this dissertation proposes a methodology to improve risk management systems in funds by building a contemporary asset allocation framework that offers practitioners an opportunity to explicitly model combinations of hypothesised global risks and the effects on their investments. The framework models portfolio return variables and their key risk driver variables separately and then joins them to model their combined dependence structure. The separate modelling of univariate and multivariate (MV) components offers the benefit of capturing the data generating processes with improved accuracy. Univariate variables were modelled using ARMA-GARCH-family structures paired with a variety of skewed and leptokurtic conditional distributions. Model residuals were fitted using the Peaks-over-Threshold method from Extreme Value Theory for the tails and a non-parametric kernel density for the interior, forming a complete semi-parametric distribution (SPD) for each variable. Asset and risk factor returns were then combined and their dependence structure jointly modelled with a MV Student t copula. Finally, the SPD margins and Student t copula were used to construct a MV meta t distribution. Monte Carlo simulations were generated from the fitted MV meta t distribution, on which an out-of-sample test was conducted. The 2014-to-2015 horizon served as a proxy for an out-of-sample, forward-looking scenario for a set of key risk factors against which a hypothetical, diversified portfolio was optimised.
Traditional mean-variance and contemporary mean-CVaR optimisation techniques were used and their results compared. As an addendum, performance over the in-sample 2008 financial crisis was reported.
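The tail-plus-interior construction that this abstract describes, GPD tails fitted by peaks-over-threshold joined to a non-parametric interior, can be sketched briefly. The code below is a minimal illustration only, not the dissertation's implementation: simulated Student t draws stand in for GARCH-standardized residuals, the threshold quantiles (95%/5%) are arbitrary choices, and an empirical CDF replaces the kernel density for the interior.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical stand-in for standardized residuals from a fitted GARCH model:
# a heavy-tailed Student t sample.
z = stats.t.rvs(df=4, size=5000, random_state=rng)

# Peaks-over-threshold: fit a generalized Pareto distribution (GPD) to the
# exceedances over a high threshold, separately for each tail.
u_hi = np.quantile(z, 0.95)
u_lo = np.quantile(z, 0.05)
xi_hi, _, beta_hi = stats.genpareto.fit(z[z > u_hi] - u_hi, floc=0)
xi_lo, _, beta_lo = stats.genpareto.fit(u_lo - z[z < u_lo], floc=0)

def semiparametric_cdf(x):
    """Semi-parametric CDF: empirical interior, GPD tails glued at u_lo, u_hi."""
    x = np.asarray(x, dtype=float)
    # Interior: empirical CDF (a kernel-smoothed estimate could be used instead).
    F = np.searchsorted(np.sort(z), x, side="right") / z.size
    # Upper tail: F(x) = 1 - p_tail * GPD_sf(x - u_hi), with p_tail = 0.05.
    F = np.where(x > u_hi,
                 1.0 - 0.05 * stats.genpareto.sf(x - u_hi, xi_hi, scale=beta_hi),
                 F)
    # Lower tail: F(x) = p_tail * GPD_sf(u_lo - x), reflected.
    F = np.where(x < u_lo,
                 0.05 * stats.genpareto.sf(u_lo - x, xi_lo, scale=beta_lo),
                 F)
    return F
```

Because both tail pieces equal the empirical probability mass at their thresholds, the resulting CDF is continuous and monotone; in the full framework each such margin would then be passed to the Student t copula fit.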
|
7 |
Statistical analysis of type-II progressively hybrid censored samples and adaptive type-II progressively hybrid censored samples from extreme value distribution. January 2009 (has links)
Mak, Man Yung. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2009. / Includes bibliographical references (leaves 115-117). / Abstracts in English and Chinese.
Contents:
Chapter 1: Introduction (p.1)
  1.1 Introduction (p.1)
  1.2 Conventional Censoring Schemes (p.2)
  1.3 Type-II Progressively Hybrid Censoring Scheme (p.4)
  1.4 Adaptive Type-II Progressively Hybrid Censoring Scheme (p.6)
  1.5 Extreme Value Distribution (p.8)
  1.6 The Scope of the Thesis (p.11)
Chapter 2: Estimation Methods (p.12)
  2.1 Introduction (p.12)
  2.2 Maximum Likelihood Estimators (p.13)
    2.2.1 Type-II Progressively Hybrid Censoring Scheme (p.13)
    2.2.2 Adaptive Type-II Progressively Hybrid Censoring Scheme (p.15)
  2.3 Approximate Maximum Likelihood Estimators (p.18)
    2.3.1 Type-II Progressively Hybrid Censoring Scheme (p.18)
    2.3.2 Adaptive Type-II Progressively Hybrid Censoring Scheme (p.20)
  2.4 Monte Carlo Simulation and Results (p.23)
    2.4.1 Numerical Comparisons (p.33)
Chapter 3: Construction of Confidence Intervals (p.35)
  3.1 Introduction (p.35)
  3.2 Asymptotic Confidence Interval (p.36)
    3.2.1 Type-II Progressively Hybrid Censoring Scheme (p.37)
    3.2.2 Adaptive Type-II Progressively Hybrid Censoring Scheme (p.39)
  3.3 Parametric Percentile Bootstrap Confidence Interval (p.56)
    3.3.1 Based on the Maximum Likelihood Estimation Method (p.57)
    3.3.2 Based on the Approximate Maximum Likelihood Estimation Method (p.65)
  3.4 Parametric Bootstrap-t Confidence Interval (p.71)
    3.4.1 Based on the Maximum Likelihood Estimation Method (p.72)
    3.4.2 Based on the Approximate Maximum Likelihood Estimation Method (p.79)
  3.5 Numerical Comparisons (p.86)
Chapter 4: Expected Total Test Time (p.88)
  4.1 Introduction (p.88)
  4.2 Type-II Progressively Hybrid Censoring Scheme (p.89)
  4.3 Adaptive Type-II Progressively Hybrid Censoring Scheme (p.92)
  4.4 Numerical Comparisons (p.99)
Chapter 5: Optimality Criteria and Censoring Schemes (p.100)
  5.1 Introduction (p.100)
  5.2 Optimality Criteria (p.101)
  5.3 Expected Fisher Information Matrix (p.102)
    5.3.1 Type-II Progressively Hybrid Censoring Scheme (p.103)
  5.4 Optimal Censoring Scheme for Progressively Hybrid Censoring (p.106)
Chapter 6: Conclusions and Further Research (p.113)
Bibliography (p.115)
|
8 |
Thresholds for peak-over-threshold theory / Amankonah, Frank O. January 2005 (has links)
Thesis (M.S.)--University of Nevada, Reno, 2005. / "August, 2005." Includes bibliographical references (leaf 43). Online version available on the World Wide Web.
|
9 |
Statistical inference of a threshold model in extreme value analysis / Lee, David (李大為). January 2012 (has links)
In many data sets, a mixture distribution formulation applies when it is known that each observation comes from one of the underlying categories. Even if there are no apparent categories, an implicit categorical structure may justify a mixture distribution. This thesis concerns the modelling of extreme values in such a setting within the peaks-over-threshold (POT) approach. Specifically, the traditional POT modelling using the generalized Pareto distribution is augmented in the sense that, in addition to threshold exceedances, data below the threshold are also modelled, by means of the mixture exponential distribution.
In the first part of this thesis, the conventional frequentist approach is applied for data modelling. In view of the mixture nature of the problem, the EM algorithm is employed for parameter estimation, and closed-form expressions for the iterates are obtained. A simulation study confirms the suitability of this method, and the increase in standard error due to the variability of the threshold is addressed. The model is applied to two real data sets, and it is demonstrated how computation time can be reduced through a multi-level modelling procedure. With the fitted density, many useful quantities can be derived, such as return periods and levels, value-at-risk, expected tail loss and bounds for ruin probabilities. A likelihood ratio test is then used to justify the model choice against the simpler model in which the thin-tailed distribution is a homogeneous exponential.
The second part of the thesis deals with a fully Bayesian approach to the same model. It starts, by way of introduction, with the application of the Bayesian idea to a special case of the model in which a closed-form posterior density is computed for the threshold parameter. This is extended to the threshold mixture model through the Metropolis-Hastings algorithm, which simulates samples from a posterior distribution known only up to a normalizing constant. The concept of depth functions is proposed for multidimensional inference, where a natural ordering does not exist. These methods are then applied to real data sets. Finally, the issue of model choice is considered through the posterior Bayes factor, a criterion that stems from the posterior density. / published_or_final_version / Statistics and Actuarial Science / Master / Master of Philosophy
|
10 |
On tail behaviour and extremal values of some non-negative time series models / Zhang, Zhiqiang (張志強). January 2002 (has links)
published_or_final_version / abstract / toc / Statistics and Actuarial Science / Doctoral / Doctor of Philosophy
|