About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1. Efficient Risk Simulations for Linear Asset Portfolios

Sak, Halis; Hörmann, Wolfgang; Leydold, Josef. January 2008.
We consider the problem of calculating tail probabilities of the returns of linear asset portfolios. As a flexible and accurate model for the logarithmic returns we use the $t$-copula dependence structure with marginals following the generalized hyperbolic distribution. Exact calculation of the tail-loss probabilities is not possible, and even simulation leads to challenging numerical problems. Applying a new numerical inversion method for the generation of the marginals and importance sampling with a carefully selected mean shift, we develop an efficient simulation algorithm. Numerical results for a variety of realistic portfolio examples show an impressive performance gain. (author's abstract) / Series: Research Report Series / Department of Statistics and Mathematics
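The key idea in the algorithm is importance sampling with a mean shift that pushes the simulated risk factors toward the loss region. Below is a minimal sketch of that idea, using a plain multivariate normal factor model instead of the paper's t-copula with generalized hyperbolic marginals; the weights, covariance, threshold, and shift are illustrative assumptions.

```python
import numpy as np

def tail_prob_is(weights, mu, cov, threshold, shift, n=100_000, seed=0):
    """Estimate P(portfolio loss > threshold) by importance sampling with a
    mean shift applied to the driving (here Gaussian) risk factors."""
    rng = np.random.default_rng(seed)
    d = len(weights)
    L = np.linalg.cholesky(cov)
    x = rng.standard_normal((n, d)) + shift   # draw from the shifted proposal N(shift, I)
    returns = mu + x @ L.T                    # factor model: returns ~ N(mu, cov) under N(0, I)
    loss = -(returns @ weights)
    log_w = -x @ shift + 0.5 * shift @ shift  # likelihood ratio N(0, I) / N(shift, I)
    return np.mean((loss > threshold) * np.exp(log_w))

# toy usage: three assets, equal weights, 10% loss threshold
w = np.full(3, 1.0 / 3.0)
mu = np.zeros(3)
cov = 0.04 * np.full((3, 3), 0.5) + 0.02 * np.eye(3)
shift = np.full(3, -1.5)                      # push the factors toward the loss region
print(tail_prob_is(w, mu, cov, threshold=0.10, shift=shift))
```

Because the shifted proposal oversamples large losses, far fewer scenarios are needed than with crude Monte Carlo to reach a given relative error for small tail probabilities.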
2. Efficient Numerical Inversion for Financial Simulations

Derflinger, Gerhard; Hörmann, Wolfgang; Leydold, Josef; Sak, Halis. January 2009.
Generating samples from generalized hyperbolic distributions and non-central chi-square distributions by inversion has become an important task for the simulation of recent models in finance in the framework of (quasi-) Monte Carlo. However, their distribution functions are quite expensive to evaluate and thus numerical methods like root finding algorithms are extremely slow. In this paper we demonstrate how our new method based on Newton interpolation and Gauss-Lobatto quadrature can be utilized for financial applications. Its fast marginal generation times make it competitive, even for situations where the parameters are not always constant. / Series: Research Report Series / Department of Statistics and Mathematics
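The point of such a scheme is to pay the cost of evaluating the expensive distribution function only once, in a setup step, and afterwards generate every variate from a cheap approximation of the quantile function. Below is a minimal sketch, using simple linear interpolation on a fixed grid as a crude stand-in for the paper's Newton-interpolation and Gauss-Lobatto construction; the non-central chi-square parameters are illustrative.

```python
import numpy as np
from scipy import stats

def make_inverse_cdf_table(dist, n_nodes=512, eps=1e-10):
    """Tabulate the quantile function of `dist` once, so that samples can
    later be drawn by approximate inversion without repeated root finding."""
    u = np.linspace(eps, 1 - eps, n_nodes)
    q = dist.ppf(u)                 # exact quantiles, computed only at setup time
    def inverse_cdf(p):
        return np.interp(p, u, q)   # cheap evaluation at generation time
    return inverse_cdf

# toy usage with a non-central chi-square marginal (df=4, nc=2.5)
ncx2_inv = make_inverse_cdf_table(stats.ncx2(df=4, nc=2.5))
rng = np.random.default_rng(1)
u = rng.random(100_000)             # or a quasi-Monte Carlo (e.g. Sobol) sequence
samples = ncx2_inv(u)
```

Because inversion maps each uniform number monotonically to one variate, the same routine can consume quasi-Monte Carlo points without further changes, which is what makes it attractive in the (quasi-) Monte Carlo setting described above.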
3. On Clustering: Mixture Model Averaging with the Generalized Hyperbolic Distribution

Ricciuti, Sarah.
Cluster analysis is commonly described as the classification of unlabeled observations into groups such that they are more similar to one another than to observations in other groups. Model-based clustering assumes that the data arise from a statistical (mixture) model; typically many models are fit to the data, from which the 'best' model is selected by a model selection criterion (often the BIC in mixture model applications). This chosen model is then the only model used for making inferences on the data. Although this is common practice, proceeding in this way ignores a large component of model selection uncertainty, especially when the difference in the model selection criterion between two competing models is relatively small. For this reason, recent interest has been placed on selecting a subset of models close to the selected best model and using a weighted averaging approach to incorporate information from the models in this set. Model averaging is not a novel approach, yet its presence in a clustering framework is minimal. Here, we use Occam's window to select a subset of models eligible for two types of averaging techniques: averaging a posteriori probabilities, and direct averaging of model parameters. The efficacy of these model-based averaging approaches is demonstrated for a family of generalized hyperbolic mixture models using real and simulated data. / Thesis / Master of Science (MSc)
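A rough sketch of the averaging step follows, using Gaussian mixtures as a stand-in for the generalized hyperbolic mixtures and restricting the a posteriori averaging to models with the same number of components (merging components across models of different sizes, as the thesis requires, is omitted). The window width and candidate component counts are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from sklearn.mixture import GaussianMixture

def occams_window_average(X, n_components_list, window=10.0, seed=0):
    """Fit several mixture models, keep those whose BIC lies within Occam's
    window of the best one, and average their a posteriori membership
    probabilities with BIC-based weights."""
    fits = [GaussianMixture(n_components=g, n_init=5, random_state=seed).fit(X)
            for g in n_components_list]
    bic = np.array([m.bic(X) for m in fits])
    keep = bic - bic.min() <= window                  # Occam's window
    models = [m for m, k in zip(fits, keep) if k]
    w = np.exp(-0.5 * (bic[keep] - bic[keep].min()))  # BIC-type model weights
    w /= w.sum()

    best = models[int(np.argmax(w))]
    z_ref = best.predict_proba(X)
    avg = np.zeros_like(z_ref)
    for weight, m in zip(w, models):
        z = m.predict_proba(X)
        if z.shape[1] != z_ref.shape[1]:
            continue          # merging components across model sizes is omitted here
        cost = -z_ref.T @ z   # align component labels with the reference model
        _, perm = linear_sum_assignment(cost)
        avg += weight * z[:, perm]
    return avg / avg.sum(axis=1, keepdims=True)

# toy usage: two well-separated clusters, candidate models with 1 to 4 components
rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0.0, 1.0, (150, 2)), rng.normal(4.0, 1.0, (150, 2))])
print(occams_window_average(X, n_components_list=[1, 2, 3, 4])[:5].round(3))
```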
4. Cornish-Fisher Expansion and Value-at-Risk method in application to risk management of large portfolios

Sjöstrand, Maria; Aktaş, Özlem. January 2011.
One of the major problems faced by banks is how to manage the risk exposure in large portfolios. According to the Basel II regulation, banks have to measure the risk using Value-at-Risk at a confidence level of 99%. However, this regulation does not specify how Value-at-Risk is to be calculated. The easiest way is to assume that portfolio returns are normally distributed. Although this is the most common way to calculate Value-at-Risk, other methods also exist. The previous crisis showed that the standard methods are unfortunately not always enough to prevent bankruptcy. This paper compares the classical methods of estimating risk with other methods such as the Cornish-Fisher expansion (CFVaR) and the assumption of a generalized hyperbolic distribution. To carry out this study, we estimate the risk in a large portfolio consisting of ten stocks. These stocks are chosen from the NASDAQ 100 list in order to have highly liquid stocks (blue chips). The stocks are chosen from different sectors to make the portfolio well-diversified. To investigate the impact of dependence between the stocks in the portfolio, we remove the two most correlated stocks and consider the resulting eight-stock portfolio as well. In both portfolios we put equal weight on the included stocks. The results show that for a well-diversified large portfolio none of the risk measures are violated. However, for a portfolio consisting of only one highly volatile stock we show that the classical methods are violated but the modern methods mentioned above are not.
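The Cornish-Fisher expansion corrects the normal quantile for the skewness and excess kurtosis of the return distribution before it is scaled into a Value-at-Risk figure. A small sketch of this calculation is shown below; the simulated Student-t returns at the end are only illustrative.

```python
import numpy as np
from scipy import stats

def cornish_fisher_var(returns, alpha=0.01):
    """Value-at-Risk at confidence 1 - alpha using the Cornish-Fisher
    expansion, which adjusts the normal quantile for the sample
    skewness and excess kurtosis of the return series."""
    r = np.asarray(returns)
    mu, sigma = r.mean(), r.std(ddof=1)
    s = stats.skew(r)
    k = stats.kurtosis(r)              # excess kurtosis (0 for the normal)
    z = stats.norm.ppf(alpha)          # e.g. about -2.33 for alpha = 0.01
    z_cf = (z
            + (z**2 - 1) * s / 6
            + (z**3 - 3 * z) * k / 24
            - (2 * z**3 - 5 * z) * s**2 / 36)
    return -(mu + sigma * z_cf)        # reported as a positive potential loss

# toy usage on simulated heavy-tailed daily portfolio returns
rng = np.random.default_rng(2)
port_returns = 0.0005 + 0.01 * rng.standard_t(df=4, size=2500)
print(cornish_fisher_var(port_returns, alpha=0.01))
```

With zero skewness and zero excess kurtosis the correction terms vanish and the formula reduces to the classical normal Value-at-Risk.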
5. A new approach to pricing real options on swaps: a new solution technique and extension to the non-a.s. finite stopping realm

Chu, Uran. 07 June 2012.
This thesis consists of extensions of results on a perpetual American swaption problem. Companies routinely plan to swap uncertain benefits with uncertain costs in the future for their own benefit. Our work explores the choice of timing policies associated with the swap in the form of an optimal stopping problem. In this thesis, we have shown that the condition given by Hu and Oksendal (1998) to guarantee that the optimal stopping time is a.s. finite is in fact both necessary and sufficient. We have extended the solution to the problem from a region in the parameter space where optimal stopping times are a.s. finite to a region where they are non-a.s. finite, and have calculated the probability of never stopping in this latter region. We have identified the joint distribution of stopping times and stopping locations in both the a.s. and non-a.s. finite stopping cases. We have also derived an integral formula for the inner product of a generalized hyperbolic distribution with the Cauchy distribution. Finally, we have applied our results to a back-end forestry harvesting model in which stochastic costs are assumed to grow exponentially without bound over time. / Graduation date: 2013
6. Seguro contra risco de downside de uma carteira: uma proposta híbrida frequentista-Bayesiana com uso de derivativos [Insuring a portfolio against downside risk: a hybrid frequentist-Bayesian proposal using derivatives]

Pérgola, Gabriel Campos. 23 January 2013.
Portfolio insurance allows a manager to limit downside risk while allowing participation in upside markets. The purpose of this dissertation is to introduce a framework for portfolio insurance optimization from a hybrid frequentist-Bayesian approach. We obtain the joint distribution of regular returns from a frequentist statistical method, once the outliers have been identified and removed from the data sample. The joint distribution of extreme returns, in its turn, is modelled by a Bayesian network whose topology reflects the events that can significantly impact the portfolio performance. Once we link the regular and extreme distributions of returns, we simulate future scenarios for the portfolio value. The insurance subportfolio is then optimized by the Differential Evolution algorithm. We show the framework in a step-by-step example for a long portfolio including stocks participating in the Bovespa Index (Ibovespa), using market data from 2008 to 2012.
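The final step of the framework, choosing the hedge by Differential Evolution over simulated scenarios, can be sketched as below. The put strikes, premiums, floor, and the shortfall-plus-cost objective are illustrative assumptions rather than the dissertation's actual criterion, and scipy's differential_evolution stands in for the optimizer.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Simulated end-of-horizon index levels (stand-ins for the scenarios the
# dissertation generates from its frequentist-Bayesian return model).
rng = np.random.default_rng(3)
spot = 100.0
scenarios = spot * np.exp(rng.normal(-0.01, 0.25, size=5000))

strikes = np.array([80.0, 90.0, 95.0])     # candidate protective puts (assumed)
premiums = np.array([1.2, 2.8, 4.1])       # assumed option prices
floor = 85.0                               # insured portfolio floor (assumed)

def objective(qty):
    """Average shortfall below the floor of the hedged position,
    plus the premium spent on the puts."""
    payoff = np.maximum(strikes - scenarios[:, None], 0.0) @ qty
    hedged = scenarios + payoff - premiums @ qty
    shortfall = np.maximum(floor - hedged, 0.0).mean()
    return shortfall + premiums @ qty

bounds = [(0.0, 2.0)] * len(strikes)       # put quantities per unit of the index
result = differential_evolution(objective, bounds, seed=3, tol=1e-8)
print(result.x, objective(result.x))
```

Differential Evolution is a derivative-free population-based optimizer, which is convenient here because the scenario-based objective is not smooth in the option quantities.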
