11

Applying RAROC, Value-at-Risk and Extreme Value Theory to Performance Measurement of Financial Holding Companies.

Chou, Cheng-Yi 07 July 2006 (has links)
none
12

Risk Measures and Dependence Modeling in Financial Risk Management

Eriksson, Kristofer January 2014 (has links)
In financial risk management it is essential to model dependence in markets and portfolios accurately and efficiently. High positive dependence between assets in a portfolio can be devastating, especially in times of crisis, since losses will then most likely occur simultaneously across all assets. The dependence is therefore directly linked to the risk of the portfolio. The risk can be estimated by several different risk measures, for example Value-at-Risk and Expected Shortfall. This paper studies different ways to measure risk and model dependence, both theoretically and empirically. The main focus is on copulas, which are a way to model and construct complex dependencies. Copulas are a useful tool since they allow the user to specify the marginal distributions separately and then link them together with the copula. However, copulas can be complex to understand, and it is not trivial to know which copula to use. An implemented copula model may give the user a "black-box" feeling and carry severe model risk if the user trusts the model too much and is unaware of what is going on. An alternative is linear correlation, which is also a way to measure dependence. This is a simpler model and is therefore believed to be easier for all users to understand. However, linear correlation is only easy to interpret in the case of elliptical distributions, and when we move away from this assumption (as is usually the case with financial data), some clear drawbacks and pitfalls appear. A third model, historical simulation, uses the historical returns of the portfolio and estimates the risk from these data without making any parametric assumptions about the dependence; the dependence is assumed to be incorporated in the historical evolution of the portfolio.
This model is very simple and very popular, but it relies more heavily than the previous two on the assumption that history will repeat itself, and it needs many more historical observations to yield good results. Here we face the risk that market dynamics have changed when looking too far back in history. In this paper some different copula models are implemented and compared to the historical simulation approach by estimating risk with Value-at-Risk and Expected Shortfall. The parameters of the copulas are also investigated under calm and stressed market periods; this information is useful when performing stress tests. The empirical study indicates that it is difficult to distinguish the parameters between the stressed and calm market periods. The overall conclusion is that the choice of model depends on our beliefs about the future distribution. If we believe the distribution is elliptical, then a correlation model is adequate; if we believe the dependence is complex, then the user should turn to a copula model; and if we can assume that history will repeat itself, then historical simulation is advantageous.
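As a concrete illustration of the risk measures and the historical-simulation approach this abstract discusses, a minimal sketch might look as follows. The returns here are synthetic (a Student-t sample standing in for real portfolio history), and the function names are for illustration only, not from the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical portfolio: heavy-tailed synthetic daily returns stand in
# for a real historical return series.
returns = rng.standard_t(df=4, size=5000) * 0.01

def var_es(returns, alpha=0.99):
    """Historical-simulation VaR and Expected Shortfall at level alpha.

    Losses are the negated returns; VaR is the alpha-quantile of the
    empirical loss distribution, and ES is the mean loss beyond VaR.
    No parametric assumption about the dependence is made.
    """
    losses = np.sort(-np.asarray(returns))
    var = np.quantile(losses, alpha)
    es = losses[losses >= var].mean()
    return var, es

var99, es99 = var_es(returns)
print(f"99% VaR: {var99:.4f}, 99% ES: {es99:.4f}")
```

By construction ES is at least as large as VaR at the same level, which is one reason ES is often preferred as a coherent tail measure.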
13

Extreme Value Theory with an Application to Bank Failures through Contagion

Nikzad, Rashid 03 October 2011 (has links)
This study attempts to quantify the shocks to a banking network and analyze the transfer of shocks through the network. We consider two sources of shocks: external shocks due to market and macroeconomic factors, which impact the entire banking system, and idiosyncratic shocks due to the failure of a single bank. The external shocks are estimated using two methods: (i) non-parametric simulation of the time series of shocks that occurred to the banking system in the past, and (ii) extreme value theory (EVT) to model the tail of the shock distribution. The external shocks considered in this study are due to exchange rate and treasury bill rate volatility, and an ARMA/GARCH model is used to extract iid residuals for this purpose. In the next step, the probability of bank failures in the system is studied using Monte Carlo simulation. We calibrate the model so that the network resembles the Canadian banking system.
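The EVT step described above — modeling the tail of (approximately iid) residuals — is commonly done with a peaks-over-threshold fit of the generalized Pareto distribution. A minimal sketch, using synthetic Student-t residuals in place of the thesis's ARMA/GARCH-filtered rate shocks, might be:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical stand-in for iid residuals extracted by an ARMA/GARCH filter.
residuals = rng.standard_t(df=3, size=10_000)

# Peaks-over-threshold: model exceedances over a high threshold with a GPD.
u = np.quantile(residuals, 0.95)              # threshold at the empirical 95th percentile
exceedances = residuals[residuals > u] - u
shape, loc, scale = stats.genpareto.fit(exceedances, floc=0.0)

# Tail quantile estimate at level p from the fitted GPD:
# P(X > x) ~ (k/n) * (1 - F_GPD(x - u)).
p = 0.999
n, k = residuals.size, exceedances.size
q = u + stats.genpareto.ppf(1 - (1 - p) * n / k, shape, scale=scale)
print(f"GPD shape: {shape:.3f}, estimated 99.9% quantile: {q:.3f}")
```

The threshold choice (here the 95th percentile) is itself a modeling decision; in practice it is checked with mean-excess or parameter-stability plots.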
14

Generalized extreme value and mixed logit models : empirical applications to vehicle accident severities /

Milton, John Calvin. January 2006 (has links)
Thesis (Ph. D.)--University of Washington, 2006. / Vita. Includes bibliographical references (leaves 87-96).
15

Fitting extreme value distributions to the Zambezi river flood water levels recorded at Katima Mulilo in Namibia

Kamwi, Innocent Silibelo January 2005 (has links)
Magister Scientiae - MSc / The aim of this research project was to estimate parameters for the distribution of annual maximum flood levels for the Zambezi River at Katima Mulilo. The parameters were estimated using the maximum likelihood method. The study explored the Zambezi's annual maximum flood heights at Katima Mulilo by fitting the Gumbel, Weibull and generalized extreme value distributions and evaluating their goodness of fit. / South Africa
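Fitting a GEV distribution to annual maxima by maximum likelihood, as this abstract describes, can be sketched in a few lines. The data below are synthetic (Gumbel-distributed "flood levels"); the real Katima Mulilo series is not reproduced here:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical annual maximum flood levels in metres; a Gumbel sample
# stands in for the observed Zambezi record.
annual_maxima = stats.gumbel_r.rvs(loc=5.0, scale=0.8, size=60, random_state=rng)

# Maximum likelihood fit of the generalized extreme value distribution.
shape, loc, scale = stats.genextreme.fit(annual_maxima)

# 100-year return level: the level exceeded on average once per 100 years.
return_level_100 = stats.genextreme.ppf(1 - 1 / 100, shape, loc=loc, scale=scale)
print(f"GEV shape: {shape:.3f}, 100-year return level: {return_level_100:.2f} m")
```

The Gumbel and Weibull fits the thesis compares are special cases of the GEV (shape parameter zero and positive in SciPy's `c` convention, respectively), so the fitted shape already hints at which family is appropriate.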
16

Hurricane Loss Modeling and Extreme Quantile Estimation

Yang, Fan 26 January 2012 (has links)
This thesis reviewed various heavy-tailed distributions and Extreme Value Theory (EVT) to estimate the catastrophic losses simulated from the Florida Public Hurricane Loss Projection Model (FPHLPM). We compared risk measures such as the Probable Maximum Loss (PML) and the Tail Value at Risk (TVaR) of the selected distributions with empirical estimates to capture the characteristics of the loss data as well as its tail. The Generalized Pareto Distribution (GPD) is the main focus for modeling the tail losses in this application. We found that the hurricane loss data generated from the FPHLPM were consistent with historical losses and not as heavy-tailed as expected: the tail of the stochastic annual maximum losses can be explained by an exponential distribution. The thesis also touched on the philosophical implications of small-probability, high-impact ("Black Swan") events and discussed the limitations of quantifying catastrophic losses for future inference using statistical methods.
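The two risk measures named in this abstract have simple empirical definitions on a set of simulated annual losses. A hedged sketch with synthetic lognormal losses (the FPHLPM output itself is not available here):

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical simulated annual hurricane losses with a heavy right tail.
annual_losses = rng.lognormal(mean=15.0, sigma=1.5, size=100_000)

def pml(losses, return_period):
    """Probable Maximum Loss: the loss exceeded on average once
    per `return_period` years, i.e. the (1 - 1/T) quantile."""
    return np.quantile(losses, 1.0 - 1.0 / return_period)

def tvar(losses, alpha):
    """Tail Value at Risk: mean loss given the loss exceeds
    the alpha-quantile."""
    q = np.quantile(losses, alpha)
    return losses[losses >= q].mean()

print(f"100-year PML: {pml(annual_losses, 100):.3e}")
print(f"TVaR(0.99):   {tvar(annual_losses, 0.99):.3e}")
```

Note that the 100-year PML equals the 99% quantile, so TVaR(0.99) necessarily lies above it; the gap between the two is one empirical indicator of how heavy the tail is.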
18

Variational Open Set Recognition

Buquicchio, Luke J. 08 May 2020 (has links)
In traditional classification problems, all classes in the test set are assumed to also occur in the training set, an assumption referred to as the closed-set assumption. However, in practice, new classes may occur in the test set, which reduces the performance of machine learning models trained under the closed-set assumption. Machine learning models should be able to accurately classify instances of classes known during training while concurrently recognizing instances of previously unseen classes (the open-set assumption). The open-set assumption is motivated by real-world applications of classifiers, wherein it is improbable that sufficient data can be collected a priori on all possible classes to train for them reliably. For example, motivated by the DARPA WASH project at WPI, a disease classifier trained on data collected prior to the outbreak of COVID-19 might erroneously diagnose patients with the flu rather than the novel coronavirus. State-of-the-art open-set methods based on Extreme Value Theory (EVT) fail to adequately model class distributions with unequal variances. We propose the Variational Open-Set Recognition (VOSR) model, which leverages all class-belongingness probabilities to reject unknown instances. To realize the VOSR model, we design a novel Multi-Modal Variational Autoencoder (MMVAE) that learns well-separated Gaussian mixture distributions with equal variances in its latent representation. During training, VOSR maps instances of known classes to high-probability regions of class-specific components. By enforcing a large distance between these latent components during training, VOSR then assumes unknown data lie in the low-probability space between components and uses a multivariate form of Extreme Value Theory to reject unknown instances. Our VOSR framework outperforms state-of-the-art open-set classification methods with a 15% F1-score increase on a variety of benchmark datasets.
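The core EVT-based rejection idea underlying open-set methods of this family can be sketched very simply: fit an extreme value distribution to the largest training distances from a class centroid and reject test points beyond a high quantile of that fit. This is a generic illustration of the technique, not the VOSR model itself; the data and names below are hypothetical:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Hypothetical latent-space distances of training points to their class centroid.
known_dist = rng.gamma(shape=2.0, scale=1.0, size=5000)

# Fit a Weibull to the largest distances (the extreme tail of the known class),
# and reject anything beyond a high quantile of that fit.
tail = np.sort(known_dist)[-250:]
c, loc, scale = stats.weibull_min.fit(tail)
threshold = stats.weibull_min.ppf(0.99, c, loc=loc, scale=scale)

def is_unknown(distance):
    """Flag a test point as belonging to an unseen class if its
    distance to every known centroid is extreme."""
    return distance > threshold

print(f"rejection threshold: {threshold:.2f}")
```

A point well inside the class (small distance) passes, while one far beyond the fitted tail is rejected as an unknown; in a multi-class setting this test would be repeated per class component.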
19

Market Timing strategy through Reinforcement Learning

HE, Xuezhong January 2021 (has links)
This dissertation implements an optimal trading strategy based on machine learning and extreme value theory (EVT) to obtain an excess return on investments in the capital market. The trading strategy outperforms the benchmark S&P 500 index with higher returns and lower volatility through effective market timing. The dissertation starts by modeling market tail risk using EVT and reinforcement learning, distinguishing it from the traditional value-at-risk method. I used EVT to extract the characteristics of the tail risk, which are inputs to the reinforcement learning method. This process proved effective for market timing, and the trading strategy could avoid market crashes and achieve a long-term excess return. In sum, this study makes several contributions. First, it takes a new approach to analyzing stock prices (here, the S&P 500 index serves as the stock): I combined EVT and reinforcement learning to study the price tail risk and predict stock crashes efficiently, which is a new method for tail-risk research. The model can thus predict a stock crash, or provide the probability of one, on which a trading strategy can be built. Second, the dissertation provides a dynamic market-timing trading strategy that can significantly outperform the market index with lower volatility and a higher Sharpe ratio; moreover, the dynamic trading process can give investors an intuitive sense of the stock market and help in decision-making. Third, the success of the strategy shows that the combination of EVT and reinforcement learning can predict stock crashes very well, which is a great improvement on extreme-event studies and deserves further study. / Business Administration/Finance
20

Flexible Extremal Dependence Models for Multivariate and Spatial Extremes

Zhang, Zhongwei 11 1900 (has links)
Classical models for multivariate or spatial extremes are mainly based on the asymptotically justified max-stable or generalized Pareto processes. These models are suitable when asymptotic dependence is present. However, recent environmental data applications suggest that asymptotic independence is equally important, so there is a pressing need for flexible subasymptotic models. This dissertation makes four major contributions to subasymptotic modeling of multivariate and spatial extremes. First, it proposes a new spatial copula model for extremes based on the multivariate generalized hyperbolic distribution; the extremal dependence of this distribution is revisited and a corrected theoretical description is provided. Second, it thoroughly investigates the extremal dependence of stochastic processes driven by exponential-tailed Lévy noise, showing that the discrete approximation models, which are linear transformations of a random vector with independent components, bridge asymptotic independence and asymptotic dependence in a novel way, whilst the exact stochastic processes exhibit only asymptotic independence. Third, it explores two different notions of optimal prediction for extremes and compares the classical linear kriging predictor and the conditional mean predictor for certain non-Gaussian models. Finally, it proposes a multivariate skew-elliptical link model for correlated, highly imbalanced (extreme) binary responses, and shows that the regression coefficients have a closed-form unified skew-elliptical posterior with an elliptical prior.
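The distinction between asymptotic dependence and asymptotic independence that this abstract turns on is usually measured with the tail-dependence coefficient chi(q) = P(U2 > q | U1 > q), which tends to zero as q tends to 1 for asymptotically independent models. A minimal empirical sketch with a synthetic Gaussian-copula sample (a standard example of asymptotic independence):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000
# Hypothetical bivariate Gaussian sample with correlation 0.7; the Gaussian
# copula is asymptotically independent, so chi(q) should decay as q -> 1.
cov = [[1.0, 0.7], [0.7, 1.0]]
x = rng.multivariate_normal([0.0, 0.0], cov, size=n)

# Transform each margin to (0, 1) via ranks (the empirical copula).
u = (np.argsort(np.argsort(x, axis=0), axis=0) + 0.5) / n

def chi(u_pair, q):
    """Empirical tail-dependence coefficient chi(q) = P(U2 > q | U1 > q)."""
    joint = np.mean((u_pair[:, 0] > q) & (u_pair[:, 1] > q))
    return joint / (1.0 - q)

for q in (0.90, 0.99, 0.999):
    print(f"chi({q}) ~= {chi(u, q):.3f}")
```

For an asymptotically dependent model (e.g. a max-stable limit) chi(q) would instead stabilize at a positive value; subasymptotic models aim to interpolate flexibly between the two regimes.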
