31

Market Timing strategy through Reinforcement Learning

HE, Xuezhong January 2021
This dissertation implements an optimal trading strategy based on machine learning methods and extreme value theory (EVT) to obtain an excess return on investments in the capital market. The trading strategy outperforms the benchmark S&P 500 index with higher returns and lower volatility through effective market timing. The dissertation begins by modeling market tail risk using EVT and reinforcement learning, distinguishing the approach from the traditional value-at-risk method. I use EVT to extract the characteristics of the tail risk, which serve as inputs to the reinforcement learning algorithm. This process is shown to be effective for market timing, and the trading strategy can avoid market crashes and achieve a long-term excess return. In sum, this study makes several contributions. First, it takes a new approach to analyzing stock prices (in this dissertation, I use the S&P 500 index as the stock). I combine EVT and reinforcement learning to study price tail risk and predict stock crashes efficiently, which is a new method for tail-risk research. Thus, I can predict a stock crash, or provide the probability of one, and a trading strategy can then be built. The second contribution is a dynamic market-timing trading strategy that significantly outperforms the market index with lower volatility and a higher Sharpe ratio. Moreover, the dynamic trading process gives investors an intuitive sense of the stock market and helps in decision-making. Third, the success of the strategy shows that the combination of EVT and reinforcement learning predicts stock crashes very well, which is a substantial improvement for the study of extreme events and deserves further research. / Business Administration/Finance
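A minimal sketch of the core idea described above, combining a GPD tail fit with a market-timing rule, is given below. The rolling window, threshold quantile, and the simple shape-parameter cutoff standing in for the learned reinforcement learning policy are all illustrative assumptions, not the dissertation's actual specification.

```python
import numpy as np
from scipy.stats import genpareto

def tail_features(returns, q=0.95):
    """Fit a GPD to losses above the empirical q-quantile; return (shape, scale, exceedance rate)."""
    losses = -returns
    u = np.quantile(losses, q)
    exc = losses[losses > u] - u
    if len(exc) < 10:                      # too few tail observations for a stable fit
        return 0.0, float(losses.std()), 0.0
    xi, _, beta = genpareto.fit(exc, floc=0)
    return xi, beta, len(exc) / len(losses)

rng = np.random.default_rng(0)
rets = rng.standard_t(df=3, size=2000) * 0.01    # synthetic heavy-tailed daily returns
window, positions = 250, []
for t in range(window, len(rets), 20):           # re-estimate every 20 days to keep the sketch fast
    xi, beta, p_exc = tail_features(rets[t - window:t])
    # Stand-in for the learned policy: step out of the market when the fitted
    # tail is heavy (large shape parameter), otherwise hold the index.
    positions.append(0 if xi > 0.4 else 1)
print(f"fraction of periods invested: {np.mean(positions):.2f}")
```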
32

Flexible Extremal Dependence Models for Multivariate and Spatial Extremes

Zhang, Zhongwei 11 1900
Classical models for multivariate or spatial extremes are mainly based on the asymptotically justified max-stable or generalized Pareto processes. These models are suitable when asymptotic dependence is present. However, recent environmental data applications suggest that asymptotic independence is equally important. There is therefore a pressing need for flexible subasymptotic models. This dissertation consists of four major contributions to subasymptotic modeling of multivariate and spatial extremes. Firstly, the dissertation proposes a new spatial copula model for extremes based on the multivariate generalized hyperbolic distribution. The extremal dependence of this distribution is revisited and a corrected theoretical description is provided. Secondly, the dissertation thoroughly investigates the extremal dependence of stochastic processes driven by exponential-tailed Lévy noise. It shows that the discrete approximation models, which are linear transformations of a random vector with independent components, bridge asymptotic independence and asymptotic dependence in a novel way, whilst the exact stochastic processes exhibit only asymptotic independence. Thirdly, the dissertation explores two different notions of optimal prediction for extremes, and compares the classical linear kriging predictor and the conditional mean predictor for certain non-Gaussian models. Finally, the dissertation proposes a multivariate skew-elliptical link model for correlated, highly imbalanced (extreme) binary responses, and shows that the regression coefficients have a closed-form unified skew-elliptical posterior under an elliptical prior.
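The distinction between asymptotic dependence and asymptotic independence that motivates the dissertation is often diagnosed through the tail dependence coefficient chi(u); a small empirical sketch, illustrative only and not taken from the dissertation, is shown below on a Gaussian copula, which is asymptotically independent.

```python
import numpy as np

def empirical_chi(x, y, u=0.99):
    """Empirical tail dependence coefficient chi(u) = P(F_Y(Y) > u | F_X(X) > u)."""
    n = len(x)
    ux = np.argsort(np.argsort(x)) / (n + 1)   # pseudo-uniform margins via ranks
    uy = np.argsort(np.argsort(y)) / (n + 1)
    exceed_x = ux > u
    return np.nan if exceed_x.sum() == 0 else np.mean(uy[exceed_x] > u)

rng = np.random.default_rng(1)
z = rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], size=50_000)
# For a Gaussian dependence structure chi(u) decays to 0 as u -> 1 (asymptotic independence).
for u in (0.90, 0.95, 0.99):
    print(u, round(empirical_chi(z[:, 0], z[:, 1], u), 3))
```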
33

Modelování operačního rizika / Operational risk modelling

Mináriková, Eva January 2013
In the present thesis we will first familiarize ourselves with the term operational risk and its definition as presented in the Basel II and Solvency II directives, and then with the methods of calculating the capital requirements for operational risk set by these directives. In the second part of the thesis we will concentrate on methods of modelling operational loss data. We will introduce extreme value theory, which describes possible approaches to modelling data in which significant values occur infrequently, the typical characteristic of operational risk data. We will mainly focus on the model for threshold exceedances, which uses the generalized Pareto distribution to model the distribution of the excesses. The theoretical knowledge of this theory and the appropriate modelling will be applied to simulated loss data. Finally, we will test the ability of the presented methods to model loss data distributions.
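As a hedged illustration of the peaks-over-threshold approach described in the abstract, the sketch below fits a generalized Pareto distribution to synthetic heavy-tailed losses above a high threshold; the data, the 95% threshold choice, and the 99.9% target quantile are illustrative assumptions only.

```python
import numpy as np
from scipy.stats import genpareto, lomax

rng = np.random.default_rng(2)
losses = lomax.rvs(c=2.5, scale=1e4, size=5000, random_state=rng)   # synthetic heavy-tailed losses

u = np.quantile(losses, 0.95)              # threshold: 95% empirical quantile (ad hoc choice)
excesses = losses[losses > u] - u
xi, _, beta = genpareto.fit(excesses, floc=0)

# High quantile implied by the fitted tail, via the standard POT quantile formula.
p, n, n_u = 0.999, len(losses), len(excesses)
q_p = u + (beta / xi) * (((1 - p) / (n_u / n)) ** (-xi) - 1)
print(f"xi = {xi:.3f}, beta = {beta:.1f}, estimated 99.9% loss quantile = {q_p:,.0f}")
```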
34

  • A distribuição normal-valor extremo generalizado para a modelagem de dados limitados no intervalo unitário (0,1) / The normal-generalized extreme value distribution for the modeling of data restricted in the unit interval (0,1)

Benites, Yury Rojas 28 June 2019
In this research a new statistical model is introduced to model data restricted to the continuous interval (0,1). The proposed model is constructed through a transformation of variables, in which the transformed variable results from combining a variable with a standard normal distribution and the cumulative distribution function of the generalized extreme value distribution. The structural properties of the new model are studied. The new family is extended to regression models, in which the model is reparametrized in terms of the median of the response variable, which, together with the dispersion parameter, is related to covariates through a link function. Inferential procedures are developed from both classical and Bayesian perspectives. The classical inference is based on maximum likelihood theory, and the Bayesian inference on the Markov chain Monte Carlo method. In addition, simulation studies were performed to evaluate the performance of the classical and Bayesian estimates of the model parameters. Finally, a set of colorectal cancer data is considered to show the applicability of the model.
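A hedged sketch of one plausible reading of the construction described above, in which the GEV cumulative distribution function is applied to a standard normal variable so that the result lies in the unit interval; the exact parametrization used in the thesis is not stated in the abstract, so the GEV parameters below are purely illustrative.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(3)

# One plausible reading: Y = F_GEV(Z) with Z ~ N(0,1), mapping the real line into (0,1).
mu, sigma, xi = 0.0, 1.0, 0.2            # illustrative GEV location, scale, shape
z = rng.standard_normal(10_000)
y = genextreme.cdf(z, c=-xi, loc=mu, scale=sigma)   # note: scipy's shape c equals -xi

print(f"range of simulated values: ({y.min():.4f}, {y.max():.4f})")
```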
35

Microstructure-sensitive extreme value probabilities of fatigue in advanced engineering alloys

Przybyla, Craig Paul 07 July 2010
A novel microstructure-sensitive extreme value probabilistic framework is introduced to evaluate material performance/variability for damage evolution processes (e.g., fatigue, fracture, creep). This framework employs newly developed extreme value marked correlation functions (EVMCF) to identify the coupled microstructure attributes (e.g., phase/grain size, grain orientation, grain misorientation) that have the greatest statistical relevance to the extreme value response variables (e.g., stress, elastic/plastic strain) that describe the damage evolution processes of interest. This is an improvement over previous approaches, which relate the distributed extreme value response variables describing the damage evolution process of interest only to the extreme value distributions of a single microstructure attribute and give no consideration to how coupled microstructure attributes affect the distributions of extreme value response. The framework also utilizes computational modeling techniques to identify correlations between microstructure attributes that significantly raise or lower the magnitudes of the damage response variables of interest, through the simulation of multiple statistical volume elements (SVEs). Each SVE for a given response is constructed to be a statistical sample of the entire microstructure ensemble (i.e., the bulk material); therefore, the response of interest is not expected to be the same in each SVE. This is in contrast to computational simulation of a single representative volume element (RVE), which is often untenably large for response variables that depend on extreme value microstructure attributes. The framework is demonstrated in the context of characterizing microstructure-sensitive high cycle fatigue (HCF) variability due to the processes of fatigue crack formation (nucleation and microstructurally small crack growth) in polycrystalline metallic alloys. Specifically, the framework is exercised to estimate the local driving forces for fatigue crack formation, to validate these against limited existing experiments, and to explore how the extreme value probabilities of certain fatigue indicator parameters (FIPs) affect overall variability in fatigue life in the HCF regime. Various FIPs have been introduced and used previously as a means to quantify the potential for fatigue crack formation based on experimentally observed mechanisms. Distributions of the extreme value FIPs are calculated for multiple SVEs simulated via the finite element method with crystal plasticity constitutive relations. By using crystal plasticity relations, the FIPs can be computed from the cyclic plastic strain on the scale of the individual grains. The simulated SVEs are instantiated such that they are statistically similar to real microstructures in terms of the crystallographic microstructure attributes that are hypothesized to have the most influence on the extreme value HCF response. The polycrystalline alloys considered here include the Ni-base superalloy IN100 and the Ti alloy Ti-6Al-4V. In applying this framework to study the microstructure-dependent variability of HCF in these alloys, the extreme value distributions of the FIPs and the associated extreme value marked correlations of crystallographic microstructure attributes are characterized. This information can then be used to rank-order multiple variants of the microstructure of a specific material system for relative HCF performance, or to design new microstructures hypothesized to exhibit improved performance.
This framework makes it possible to limit the (presently) large number of experiments required to characterize scatter in HCF, and it lends quantitative support to designing improved, fatigue-resistant materials and to accelerating the insertion of modified and new materials into service.
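A hedged toy illustration of the statistical core of the framework follows: collect the extreme (maximum) fatigue indicator parameter from each simulated statistical volume element and fit an extreme value distribution to those per-SVE maxima. The lognormal grain-level FIP values are synthetic placeholders, not crystal plasticity output, and all sizes and parameters are illustrative.

```python
import numpy as np
from scipy.stats import genextreme, lognorm

rng = np.random.default_rng(4)

# Placeholder for crystal-plasticity FEM output: each SVE yields grain-level FIP
# values; only the extreme (maximum) value per SVE is retained.
n_sve, grains_per_sve = 200, 5000
fip_max = np.array([
    lognorm.rvs(s=0.5, scale=1e-3, size=grains_per_sve, random_state=rng).max()
    for _ in range(n_sve)
])

# Fit a GEV to the per-SVE extreme FIPs to characterize variability in the HCF driving force.
c, loc, scale = genextreme.fit(fip_max)
p99 = genextreme.ppf(0.99, c, loc=loc, scale=scale)
print(f"99th percentile of the extreme-value FIP across SVEs: {p99:.2e}")
```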
36

Využití teorie extrémních hodnot při řízení operačních rizik / Extreme Value Theory in Operational Risk Management

Vojtěch, Jan January 2009
Currently, financial institutions are required to analyze and quantify a new type of banking risk, known as operational risk, to which they are exposed in their everyday activities. The main objective of this work is to construct an acceptable statistical model for computing the capital requirement. Such a model must respect the specific character of losses arising from operational risk events. The fundamental task is to find a suitable distribution describing the probabilistic behavior of losses arising from this type of risk. The Pickands-Balkema-de Haan theorem from extreme value theory is used extensively: roughly speaking, the distribution of the excesses of a random variable over a given high threshold converges to the generalized Pareto distribution. The theorem is subsequently used to estimate a high percentile of a simulated distribution. The simulated distribution is a compound model for the aggregate loss random variable, constructed as a combination of a frequency distribution for the number of losses and a so-called severity distribution for the individual loss. The proposed model is then used to estimate a final quantile, which represents the sought capital requirement. This capital requirement is the amount of funds the bank is supposed to retain in order to make up for a projected shortfall; there is a given, usually quite small, probability that the capital charge will be exceeded. Although combining a frequency distribution with a severity distribution is the common way to deal with the problem, the final application is often problematic. Typically, the severity distribution is a combination of two or three distributions, for instance lognormal distributions with different location and scale parameters. Such models usually lack a theoretical background and, in particular, the distribution functions are often not connected in the proper way. In this work, we deal with both problems. In addition, maximum likelihood estimates of the lognormal distribution satisfying F_LN(u) = p, with u and p given, are derived. The results achieved can be used in the everyday practice of financial institutions for the quantification of operational risk. They can also be used for the analysis of a variety of sample data with so-called heavy tails, where standard distributions do not offer any help. As an integral part of this work, a CD with the source code of each function used in the model is included. All of these functions were created in the S-PLUS statistical programming environment. The fourth annex contains a complete description of each function, its purpose, and its general syntax for possible use in solving different kinds of problems.
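A minimal Monte Carlo sketch of the compound model described above follows: a Poisson frequency distribution, a lognormal severity body spliced with a generalized Pareto tail above a high threshold, and the capital charge read off as a high quantile of the simulated aggregate loss. All parameter values are illustrative assumptions, not estimates from the thesis.

```python
import numpy as np
from scipy.stats import poisson, lognorm, genpareto

rng = np.random.default_rng(5)

lam = 25                                   # expected number of losses per year
mu, sigma = 8.0, 1.2                       # lognormal body of the severity distribution
u = lognorm.ppf(0.95, s=sigma, scale=np.exp(mu))   # splicing threshold (95th percentile)
xi, beta = 0.4, 2.0e4                      # GPD tail above the threshold

def draw_severity(size):
    """Individual losses: lognormal below the threshold, threshold + GPD excess above it."""
    x = lognorm.rvs(s=sigma, scale=np.exp(mu), size=size, random_state=rng)
    tail = x > u
    x[tail] = u + genpareto.rvs(c=xi, scale=beta, size=tail.sum(), random_state=rng)
    return x

n_years = 20_000
counts = poisson.rvs(lam, size=n_years, random_state=rng)
severities = draw_severity(int(counts.sum()))
annual = np.array([chunk.sum() for chunk in np.split(severities, np.cumsum(counts)[:-1])])

# Capital requirement: the 99.9% quantile of the simulated aggregate annual loss.
print(f"estimated 99.9% aggregate loss quantile: {np.quantile(annual, 0.999):.3e}")
```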
37

Neparametrické metody odhadu parametrů rozdělení extrémního typu / Non-parametric estimation of parameters of extreme value distribution

Blachut, Vít January 2013
This diploma thesis is concerned with extreme value distributions. The first part formulates and proves the limit theorem for the distribution of the maximum. The basic properties of the class of extreme value distributions are then described. The key part of the thesis is devoted to non-parametric estimation of the extreme value index. The Hill and moment estimators are derived, and, based on the results of a mathematical analysis, an alternative choice of the optimal sample fraction using a bootstrap-based method is suggested. The estimators of the extreme value index are compared in simulations from suitably chosen distributions that are close to the distribution of a given rainfall data series. For this time series a suitable estimator is recommended, together with a choice of the optimal sample fraction, which is among the most difficult tasks in the area of extreme value theory.
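The Hill estimator discussed above has a simple closed form; a sketch follows. The diagnostic loop over k only mimics the usual practice of looking for a stable region of estimates and is not the thesis's bootstrap procedure for choosing the optimal sample fraction.

```python
import numpy as np

def hill_estimator(data, k):
    """Hill estimator of the extreme value index from the k largest observations."""
    x = np.sort(data)[::-1]                       # descending order statistics
    return np.mean(np.log(x[:k])) - np.log(x[k])

rng = np.random.default_rng(6)
sample = rng.pareto(a=2.0, size=2000) + 1.0       # strict Pareto(2): true index equals 0.5

# Crude diagnostic: inspect the estimate over several sample fractions k.
for k in (50, 100, 200, 400):
    print(k, round(hill_estimator(sample, k), 3))
```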
38

Metody odhadu parametrů rozdělení extrémního typu s aplikacemi / Extreme Value Distribution Parameter Estimation and its Application

Holešovský, Jan January 2016
The thesis is focused on extreme value theory and its applications. First, the extreme value distribution is introduced and its properties are discussed. On this basis, the two models most commonly used for extreme value analysis are described, i.e. the block maxima model and the Pareto-distribution threshold model. The first has the advantage of robustness; however, the threshold model is nowadays usually preferred. Although the threshold choice strongly affects the estimation quality of the model, optimal threshold selection remains one of the unsolved issues of this approach. The thesis is therefore focused on techniques for proper threshold identification, mainly on adaptive methods suitable for use in practice. For this purpose a simulation study was performed, and the acquired knowledge was applied to the analysis of precipitation records from the South Moravian region. Furthermore, the thesis also deals with extreme value estimation within a stationary series framework. Usually, an observed time series needs to be separated into approximately independent observations. Using the more advanced theory for stationary series makes it possible to avoid the entire separation procedure. In this context the commonly applied separation techniques turn out to be quite inappropriate in most cases, and the estimates based on the theory of stationary series are obtained with better precision.
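A compact sketch contrasting the two models named in the abstract, a GEV fit to block maxima and a GPD fit to threshold exceedances, on the same synthetic series; the block size, threshold, and Student-t data are illustrative choices only.

```python
import numpy as np
from scipy.stats import genextreme, genpareto

rng = np.random.default_rng(7)
x = rng.standard_t(df=4, size=365 * 30)           # synthetic "daily" series spanning 30 years

# Block maxima model: GEV fitted to annual maxima.
annual_max = x.reshape(30, 365).max(axis=1)
c_gev, loc, scale = genextreme.fit(annual_max)
print("GEV shape (scipy's c = -xi):", round(c_gev, 3))

# Threshold model: GPD fitted to exceedances of a high empirical quantile.
u = np.quantile(x, 0.98)
excesses = x[x > u] - u
xi, _, beta = genpareto.fit(excesses, floc=0)
print("GPD shape xi:", round(xi, 3))
```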
39

Managing the extremes : An application of extreme value theory to financial risk management

Strömqvist, Zakris, Petersen, Jesper January 2016
We compare the traditional GARCH models with a semiparametric approach based on extreme value theory and find that the semiparametric approach yields more accurate predictions of Value-at-Risk (VaR). Using traditional parametric approaches based on GARCH and EGARCH to model the conditional volatility, we calculate univariate one-day-ahead VaR predictions under varying distributional assumptions. The accuracy of these predictions is then compared to that of a semiparametric approach based on results from extreme value theory. For the 95% VaR, the EGARCH's ability to incorporate the asymmetric behaviour of return volatility proves most useful. For higher quantiles, however, we show that what matters most for predictive accuracy is the underlying distributional assumption for the innovations, where the normal distribution falls behind other distributions that allow for thicker tails. Both the semiparametric approach and the conditional volatility models based on the t-distribution outperform the normal, especially at higher quantiles. As for the comparison between the semiparametric approach and the conditional volatility models with t-distributed innovations, the results are mixed. However, the evidence indicates that there certainly is a place for extreme value theory in financial risk management.
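A hedged sketch of the semiparametric idea described above: filter returns with a conditional volatility model, fit a GPD to the tail of the standardized residuals, and scale the EVT quantile by the volatility forecast. Here a simple EWMA variance stands in for the GARCH/EGARCH fits used in the thesis, and all parameters and data are illustrative.

```python
import numpy as np
from scipy.stats import genpareto

def ewma_vol(returns, lam=0.94):
    """EWMA conditional volatility, a simple stand-in for a fitted GARCH model."""
    var = np.empty_like(returns)
    var[0] = returns.var()
    for t in range(1, len(returns)):
        var[t] = lam * var[t - 1] + (1 - lam) * returns[t - 1] ** 2
    return np.sqrt(var)

def evt_quantile(z, q=0.99, u_q=0.95):
    """q-quantile of standardized losses from a GPD fitted above a high threshold."""
    losses = -z
    u = np.quantile(losses, u_q)
    exc = losses[losses > u] - u
    xi, _, beta = genpareto.fit(exc, floc=0)
    return u + (beta / xi) * (((1 - q) / (len(exc) / len(losses))) ** (-xi) - 1)

rng = np.random.default_rng(8)
rets = rng.standard_t(df=5, size=3000) * 0.01        # synthetic daily returns
sigma = ewma_vol(rets)
z = rets / sigma                                     # standardized residuals
sigma_next = np.sqrt(0.94 * sigma[-1] ** 2 + 0.06 * rets[-1] ** 2)   # one-day-ahead volatility
print(f"99% one-day VaR: {sigma_next * evt_quantile(z, q=0.99):.4f}")
```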
40

Improved estimation procedures for a positive extreme value index

Berning, Thomas Louw 12 1900
Thesis (PhD (Statistics))--University of Stellenbosch, 2010. / In extreme value theory (EVT) the emphasis is on extreme (very small or very large) observations. The crucial parameter when making inferences about extreme quantiles is called the extreme value index (EVI). This thesis concentrates on the right tail of the underlying distribution (extremely large observations) only, and specifically on situations where the EVI is assumed to be positive. A positive EVI indicates that the underlying distribution of the data has a heavy right tail, as is the case with, for example, insurance claims data. There are numerous areas of application of EVT, since there is a vast number of situations in which one would be interested in predicting extreme events accurately. Accurate prediction requires accurate estimation of the EVI, which has received ample attention in the literature from a theoretical as well as a practical point of view. Countless estimators of the EVI exist in the literature, but the practitioner has little information on how these estimators compare. An extensive simulation study was designed and conducted to compare the performance of a wide range of estimators, over a wide range of sample sizes and distributions. A new procedure for the estimation of a positive EVI was developed, based on fitting the perturbed Pareto distribution (PPD) to observations above a threshold, using Bayesian methodology. Attention was also given to the development of a threshold selection technique. One of the major contributions of this thesis is a measure which quantifies the stability (or rather instability) of estimates across a range of thresholds. This measure can be used to objectively obtain the range of thresholds over which the estimates are most stable, and it is this measure which is used for threshold selection with the proposed PPD estimator. A case study of five insurance claims data sets illustrates how data sets can be analyzed in practice. It is shown to what extent discretion can and should be applied, as well as how different estimators can be used in a complementary fashion to give more insight into the nature of the data and the extreme tail of the underlying distribution. The analysis is carried out from the raw data through to the construction of tables which can be used directly to gauge the risk of the insurance portfolio over a given time frame.
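A hedged illustration of the stability idea mentioned above: compute an EVI estimate over a grid of thresholds (here the Hill estimator, not the thesis's Bayesian perturbed-Pareto estimator) and use a rolling standard deviation as a simple proxy for the stability measure, selecting the range of thresholds where the estimates vary least. The data and all settings are illustrative.

```python
import numpy as np

def hill(data, k):
    """Hill estimate of a positive extreme value index from the k largest observations."""
    x = np.sort(data)[::-1]
    return np.mean(np.log(x[:k])) - np.log(x[k])

rng = np.random.default_rng(9)
claims = (rng.pareto(a=1.5, size=5000) + 1.0) * 1e4      # synthetic heavy-tailed claim amounts

ks = np.arange(50, 1001, 10)                             # grid of sample fractions (thresholds)
estimates = np.array([hill(claims, k) for k in ks])

# Stability proxy: rolling standard deviation of estimates over neighbouring thresholds;
# the most stable region is where this is smallest.
w = 10
roll_sd = np.array([estimates[i:i + w].std() for i in range(len(ks) - w + 1)])
best = int(roll_sd.argmin())
print(f"most stable k-range: {ks[best]}..{ks[best + w - 1]}, "
      f"EVI estimate there: {estimates[best:best + w].mean():.3f}")
```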
