About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
51

Asset Allocation Based on Shortfall Risk

Čumova, Denisa 27 July 2005 (has links) (PDF)
This thesis presents an innovative portfolio model for the large group of investors who are not satisfied with traditional mean-variance asset allocation, above all because of its rather specific definition of the risk and value decision parameters, its assumed utility function, its treatment of risk diversification, and the restrictions it imposes on the asset universe. The model's modifiable risk measure, shortfall risk, expresses variable risk preferences below the return benchmark. Upside deviations from the benchmark are neither minimized, as in the mean-variance portfolio model, nor treated as risk-neutral, as in the mean return-shortfall risk portfolio model; instead, variable degrees of chance potential (upper partial moments) give investors a broader range of utility choices and thus reflect arbitrary preferences. Eliminating the assumption of normally distributed returns allows the chance potential-shortfall risk model to allocate correctly among assets with non-normally distributed returns, e.g. financial derivatives, equities, real estate, fixed-income assets and commodities, where the mean-variance portfolio model tends toward inferior allocation decisions. The computational aspects of the optimization algorithms developed for mean-variance, mean-shortfall risk and chance potential-shortfall risk portfolio selection are described to ease their practical application. Finally, the application of the chance potential-shortfall risk model is demonstrated on an asset universe containing stocks, covered calls and protective puts.
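As an illustration of the partial-moment measures this abstract refers to, here is a minimal sketch (not the author's implementation; function names, the benchmark value and the moment orders are assumptions) of shortfall risk as a lower partial moment and chance potential as an upper partial moment:

```python
import numpy as np

def lower_partial_moment(returns, benchmark=0.0, order=2):
    # Shortfall risk: average of downside deviations below the benchmark,
    # raised to the given order (order=2 penalizes large shortfalls more).
    shortfalls = np.maximum(benchmark - returns, 0.0)
    return np.mean(shortfalls ** order)

def upper_partial_moment(returns, benchmark=0.0, order=1):
    # Chance potential: average of upside deviations above the benchmark.
    gains = np.maximum(returns - benchmark, 0.0)
    return np.mean(gains ** order)

returns = np.array([0.02, -0.01, 0.03, -0.04, 0.01])
lpm = lower_partial_moment(returns, benchmark=0.0, order=2)
upm = upper_partial_moment(returns, benchmark=0.0, order=1)
```

An optimizer in this framework would then trade off `upm` against `lpm` across portfolio weights, instead of mean return against variance.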
52

A generalized Neyman-Pearson lemma for hedge problems in incomplete markets

Rudloff, Birgit 07 October 2005 (has links) (PDF)
Some financial problems, such as minimizing the shortfall risk when hedging in incomplete markets, lead to problems belonging to test theory. This paper considers a generalization of the Neyman-Pearson lemma. Using methods of convex duality, we deduce the structure of an optimal randomized test when testing a compound hypothesis against a simple alternative, and give necessary and sufficient optimality conditions for the problem.
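For intuition, the classical simple-vs-simple Neyman-Pearson randomized test that the paper generalizes can be sketched on a finite sample space; the probability vectors and function name below are illustrative assumptions:

```python
import numpy as np

def np_randomized_test(p0, p1, alpha):
    # Classical Neyman-Pearson construction on a finite sample space:
    # p0, p1 are probability vectors under the null and the alternative.
    # Returns phi, the per-atom rejection probabilities of the most
    # powerful randomized test with exact size alpha.
    lr = p1 / p0                              # likelihood ratios
    phi = np.zeros_like(p0, dtype=float)
    size = 0.0
    for i in np.argsort(-lr):                 # atoms by decreasing LR
        if size + p0[i] <= alpha:
            phi[i] = 1.0                      # reject outright
            size += p0[i]
        else:
            phi[i] = (alpha - size) / p0[i]   # randomize at the boundary
            break
    return phi

p0 = np.array([0.5, 0.3, 0.2])
p1 = np.array([0.2, 0.3, 0.5])
phi = np_randomized_test(p0, p1, alpha=0.3)
```

The randomization at the boundary atom is what makes the size exactly `alpha`; the paper's contribution is extending this structure, via convex duality, to a compound null hypothesis.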
53

Efficient Simulations in Finance

Sak, Halis January 2008 (has links) (PDF)
Measuring the risk of a credit portfolio is a challenge for financial institutions because of the regulations introduced by the Basel Committee. In recent years many models and state-of-the-art methods utilizing Monte Carlo simulation have been proposed to solve this problem. In most of the models, factors are used to account for the correlations between obligors. We concentrate on the normal copula model, which assumes multivariate normality of the factors. Computation of value at risk (VaR) and expected shortfall (ES) for realistic credit portfolio models is subtle, since (i) there is dependency throughout the portfolio, and (ii) an efficient method is required to compute tail loss probabilities and conditional expectations at multiple points simultaneously. This is why Monte Carlo simulation must be improved by variance reduction techniques such as importance sampling (IS). Thus a new method is developed for simulating tail loss probabilities and conditional expectations for a standard credit risk portfolio. The new method integrates IS with inner replications using the geometric shortcut for dependent obligors in a normal copula framework. Numerical results show that the new method outperforms naive simulation for computing tail loss probabilities and conditional expectations at a single loss level x and VaR value. Finally, it is shown that, compared to the standard t statistic, the skewness-correction method of Peter Hall is a simple and more accurate alternative for constructing confidence intervals. (Author's abstract) / Series: Research Report Series / Department of Statistics and Mathematics
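To illustrate why importance sampling improves tail-probability estimation, here is a toy mean-shift example on a single standard normal; this is not the thesis's geometric-shortcut method for credit portfolios, and the function names and shift value are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def tail_prob_naive(x, n=100_000):
    # Naive Monte Carlo estimate of P(Z > x) for Z ~ N(0, 1):
    # almost no samples land in the tail, so the estimate is noisy.
    return np.mean(rng.standard_normal(n) > x)

def tail_prob_is(x, shift, n=100_000):
    # Importance sampling: draw from N(shift, 1) so the tail event is
    # common, then reweight by the likelihood ratio dN(0,1)/dN(shift,1).
    z = rng.normal(shift, 1.0, n)
    w = np.exp(-shift * z + 0.5 * shift**2)
    return np.mean((z > x) * w)

est = tail_prob_is(3.0, shift=3.0)   # true value 1 - Phi(3) is about 1.35e-3
```

Centering the sampling distribution near the loss level of interest is the same principle the abstract applies, in a much richer factor model, to tail loss probabilities of the credit portfolio.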
54

Utilização de cópulas com dinâmica semiparamétrica para estimação de medidas de risco de mercado / Using copulas with semiparametric dynamics for the estimation of market risk measures

Silveira Neto, Paulo Corrêa da January 2015 (has links)
Market risk management, i.e. managing the risk associated with financial losses resulting from market price fluctuations, is fundamental to financial institutions and portfolio managers. Asset allocation involves efficient risk/return decisions, often restricted by an investment policy statement. Many traditional models simplify risk estimation by imposing several assumptions, such as symmetrical distributions, purely linear correlations and normality. The dependence structure of these time series can be modeled flexibly using copulas, which split a complex multivariate time series problem into two steps: estimation of the marginal distributions and estimation of the dependence between series. The dynamic structure of these copulas allows for a dependence parameter that changes over time, and the semiparametric variant makes it possible to model any functional form the dynamic structure may take. We compare the dynamic semiparametric model suggested by Hafner and Reznikova (2010) with the dynamic but fully parametric model suggested by Patton (2006). All copulas in this work are bivariate. The data consist of four Brazilian stock market time series. For each pair, ARMA-GARCH models are used for the marginals, while the dependence between the series is estimated using the two dynamic copula models mentioned above. To compare the methodologies, we estimate two market risk measures, Value at Risk and Expected Shortfall, for the portfolios built for each pair of assets, and implement hypothesis tests to verify the quality of the risk estimates.
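The two-step idea (marginals first, dependence second) can be sketched with a static Gaussian copula; this is a deliberate simplification of the dynamic models compared in the abstract, and the function name is an assumption:

```python
import numpy as np
from scipy import stats

def gaussian_copula_corr(x, y):
    # Step 1: map each margin to pseudo-observations via ranks
    # (in the thesis this role is played by ARMA-GARCH residuals).
    u = stats.rankdata(x) / (len(x) + 1)
    v = stats.rankdata(y) / (len(y) + 1)
    # Step 2: estimate the Gaussian-copula correlation on normal scores.
    z1, z2 = stats.norm.ppf(u), stats.norm.ppf(v)
    return np.corrcoef(z1, z2)[0, 1]

rng = np.random.default_rng(0)
cov = [[1.0, 0.7], [0.7, 1.0]]
sample = rng.multivariate_normal([0.0, 0.0], cov, size=5000)
rho_hat = gaussian_copula_corr(sample[:, 0], sample[:, 1])  # near 0.7
```

The dynamic models discussed above replace the single correlation parameter with one that evolves over time, parametrically in Patton (2006) and semiparametrically in Hafner and Reznikova (2010).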
55

Stress, uncertainty and multimodality of risk measures / Stress, incertitude et multimodalité des mesures de risque

Li, Kehan 06 June 2017 (has links)
In this thesis, we focus on the stress, uncertainty and multimodality of risk measures, with special attention to two parts. The results have a direct influence on the computation of bank economic and regulatory capital. First, we provide a novel risk measure, the Spectrum Stress VaR (SSVaR), to quantify and integrate the uncertainty of the Value-at-Risk. It is an implementation model of the stressed VaR proposed in Basel III. The SSVaR is based on the confidence interval of the VaR. We investigate the asymptotic distribution of the order statistic, which is a nonparametric estimator of the VaR, in order to build the confidence interval. Two confidence intervals are derived, one from the asymptotic Gaussian result and one from the saddlepoint approach. We compare them with the bootstrap confidence interval by simulation, showing that the confidence interval built from the saddlepoint approach is robust across sample sizes, underlying distributions and confidence levels. Stress testing applications using the SSVaR are performed with historical stock index returns during a financial crisis, to identify potential violations of the VaR during turmoil periods on financial markets. Second, we investigate the impact of multimodality of distributions on VaR and ES calculations. Unimodal probability distributions have been widely used for parametric VaR computation by investors, risk managers and regulators. However, financial data may be characterized by distributions having more than one mode, and for such data we show that multimodal distributions may outperform unimodal distributions in terms of goodness-of-fit. Two classes of multimodal distributions are considered: the Cobb family and the Distortion family. We develop an adapted rejection sampling algorithm that efficiently generates random samples from the probability density function of the Cobb family. For the empirical study, two data sets are considered: a daily data set concerning operational risk, and a three-month scenario of market portfolio returns built from five-minute intraday data. With a complete spectrum of confidence levels, the VaR and the ES of both unimodal and multimodal distributions are calculated. We analyze the results to assess the practical interest of using multimodal instead of unimodal distributions.
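The order-statistic estimator of the VaR, and a distribution-free confidence interval around it, can be sketched as follows (using binomial index bounds rather than the Gaussian or saddlepoint asymptotics studied in the thesis; names are assumptions):

```python
import numpy as np
from scipy import stats

def var_order_statistic_ci(losses, p=0.99, conf=0.95):
    # Nonparametric VaR: the empirical p-quantile taken as an order
    # statistic, with a distribution-free confidence interval built
    # from the binomial distribution of the exceedance count.
    x = np.sort(losses)
    n = len(x)
    k = int(np.ceil(n * p)) - 1                      # 0-based index
    lo = int(stats.binom.ppf((1 - conf) / 2, n, p))
    hi = int(stats.binom.ppf(1 - (1 - conf) / 2, n, p))
    lo, hi = max(lo, 0), min(hi, n - 1)
    return x[k], (x[lo], x[hi])

rng = np.random.default_rng(0)
losses = rng.standard_normal(10_000)
var_hat, (ci_lo, ci_hi) = var_order_statistic_ci(losses)  # true 99% VaR is about 2.326
```

The SSVaR idea is to use the width of such an interval as a stress buffer on top of the point estimate, so that the uncertainty of the VaR itself enters the capital calculation.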
57

A importância de se levar em conta a lacuna linneana no planejamento de conservação dos anfíbios no Brasil / The importance of taking into account the linnean shortfall on amphibian conservation planning

Moreira, Mateus Atadeu 28 April 2015 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES / Science has described only a small fraction of the world's biodiversity. How much of biodiversity we know, and how gaps in that knowledge hinder our conservation strategies, is a genuine and weighty concern. Brazil has the greatest amphibian diversity in the world, with 1026 amphibian species, and the number of described species is increasing at a high rate. It is likely that many still-undescribed Brazilian amphibians are threatened. Although many new species are being described in Brazil, some protected areas are being downsized or downgraded. In this study we analyze how much the Linnean shortfall impairs our ability to prioritize areas for the conservation of a highly diverse and still poorly known group such as the Brazilian amphibians, and whether the main conservation strategy in Brazil is prepared to deal with this shortfall.
We made four spatial prioritizations of the known Brazilian amphibian fauna for four arbitrarily chosen scenarios (1980, 1990, 2000 and 2013), overlapped these prioritizations with the federal protected areas existing in each scenario, and compared the results, calculating the proportion of the highest-priority areas that changed place between scenarios and the proportion that was federally protected in each scenario. In the first change of scenario, 921 of the 4672 cells comprising the seventeen per cent of highest-priority cells in 1980 changed place by 1990 (19.71% of the cells). In the 1990-2000 change, 905 of the 4686 cells changed place (19.31%), and in the last change of scenario (2000-2013), 983 of the 4675 (21.01%) highest-priority cells changed place. The percentage of these highest-priority areas that was federally protected in each scenario and in each biome was severely low in all cases, though it is rising modestly with time. The new protected areas created between the scenarios (both strict-protection and sustainable-use areas) do not follow the new priority areas. It is crucial that Brazilian taxonomy continue to grow. Since Brazil is so important for the future of the global diversity of amphibians, systematic planning of new protected areas, using scientific models to account for the Linnean shortfall, is also crucial in order to protect such an astonishing diversity. Keywords: Linnean shortfall, Biodiversity, Conservation, Brazilian Amphibians, Spatial Conservation Prioritization
58

Dopady nových regulatorních požadavků na tržní riziko / Impacts of new regulatory requirements for market risk

Vojkůvka, Adam January 2017 (has links)
The aim of this master thesis is to analyze the impact of the new regulatory requirements for market risk on a selected portfolio under the internal-models approach. The first part deals with the definition and calculation methods of the risk measures Value at Risk and Expected Shortfall; it is also dedicated to model backtesting and the determination of the stress period. The second part describes the development of the Basel I-III regulatory requirements for market risk, with a focus on internal approaches. The third part focuses on the calculation and subsequent analysis of the current and new regulatory requirements for market risk using the historical simulation method, the variance-covariance method and Monte Carlo simulation.
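A minimal sketch of the historical simulation method for the two risk measures discussed (the function name and the toy data are assumptions):

```python
import numpy as np

def hist_var_es(returns, alpha=0.99):
    # Historical simulation: VaR is the empirical alpha-quantile of the
    # loss distribution; ES averages the losses at or beyond the VaR.
    losses = -np.asarray(returns, dtype=float)
    var = np.quantile(losses, alpha)
    es = losses[losses >= var].mean()
    return var, es

returns = -np.arange(1, 101) / 100.0   # toy sample: losses of 1%..100%
var95, es95 = hist_var_es(returns, alpha=0.95)
```

The variance-covariance and Monte Carlo methods mentioned above differ only in how the loss distribution is obtained; the VaR and ES read-offs at the end are the same.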
59

Monte Carlo Simulations of Portfolios Allocated with Structured Products : A method to see the effect on risk and return for long time horizons

Fredriksson, Malin January 2018 (has links)
Structured products are complex non-linear financial instruments, which makes it difficult to calculate their future risk and return. Two categories of structured products are Capital Protected and Participation notes, which are built from bonds and options. Because structured products are non-linear, it is difficult to assess their long-term risk today. This study, conducted at Nordea Markets, focuses on the risk of structured products and on how the risk and return of a portfolio change when structured products are included in it. Nordea's current risk advisory tool can only calculate one-year risk, which makes long-term predictions difficult. To solve this problem, we simulate portfolios and structured products over a five-year horizon with the Monte Carlo method. To investigate how the structured-product allocations behave under different conditions, we developed three test methods and a ranking program. The first test method measures how different underlying assets change the risk and return of the portfolio allocations. The second varies the drift, volatility and correlation of both the underlying asset and the portfolio to see how these parameters change the risk and return. The third simulates a crisis market with high correlations and low drift. All these tests feed into the ranking program, the most important part, where the different allocations are compared against the original portfolio to decide when an allocation performs better. The ranking is based on multiple risk measures, but this study focuses on Expected Shortfall for risk, while expected return is used for ranking return. We used five reference portfolios and six structured products with specific parameters in an example run exercising the ranking program and all three test methods.
We found that the properties of the reference portfolio and of the structured product's underlying asset are significant and affect performance the most. In the example run it was possible to find preferable cases for all structured products, but some performed better than others. The test methods revealed many aspects of portfolio allocation with structured products, such as the decrease in portfolio risk for Capital Protected notes and the increase in portfolio return for Participation notes. Our ranking program proved useful in that it simplifies the interpretation of results.
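The five-year Monte Carlo idea can be sketched for a single lognormal portfolio; this is a simplification that does not reproduce the study's structured-product payoffs or ranking logic, and all names and parameter values here are assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

def terminal_return_and_es(mu, sigma, years=5, paths=50_000, alpha=0.95):
    # Simulate terminal portfolio value under geometric Brownian motion,
    # then report the mean terminal value and the Expected Shortfall of
    # the loss relative to an initial value of 1.
    z = rng.standard_normal(paths)
    terminal = np.exp((mu - 0.5 * sigma**2) * years
                      + sigma * np.sqrt(years) * z)
    loss = 1.0 - terminal
    var = np.quantile(loss, alpha)
    es = loss[loss >= var].mean()
    return terminal.mean(), es

mean_terminal, es = terminal_return_and_es(mu=0.05, sigma=0.20)
```

A ranking program in the spirit of the study would repeat this for each candidate allocation (with the option payoffs applied to the simulated paths) and compare the resulting expected return and ES against the original portfolio.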
60

Portfolio Value at Risk and Expected Shortfall using High-frequency data / Portfólio Value at Risk a Expected Shortfall s použitím vysoko frekvenčních dat

Zváč, Marek January 2015 (has links)
The main objective of this thesis is to investigate whether multivariate models using high-frequency data provide significantly more accurate forecasts of Value at Risk and Expected Shortfall than multivariate models using only daily data. The question is very topical, since the Basel Committee announced in 2013 that it is going to change the risk measure used for the calculation of capital requirements from Value at Risk to Expected Shortfall. A further improvement in the accuracy of both risk measures can also be achieved by incorporating high-frequency data, which are increasingly available thanks to significant technological progress. We therefore employ the parsimonious Heterogeneous Autoregression, and its asymmetric version, to model the realized covariance matrix from high-frequency data. The chosen benchmark models are the well-established DCC-GARCH and EWMA. Value at Risk (VaR) and Expected Shortfall (ES) are computed through parametric and semi-parametric approaches and Monte Carlo simulation. The loss distributions are represented by the multivariate Gaussian and Student t distributions, multivariate distributions simulated by copula functions, and multivariate filtered historical simulations. The univariate loss distributions used are the Generalized Pareto Distribution from EVT, and empirical and standard parametric distributions. The main finding is that the Heterogeneous Autoregression model using high-frequency data delivers superior, or at least equal, accuracy of VaR forecasts relative to the benchmark models based on daily data. Finally, the backtesting of ES remains very challenging, and the applied Tests I and II did not provide credible validation of the forecasts.
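The Heterogeneous Autoregression mentioned above, in its univariate Corsi-style form, regresses tomorrow's realized variance on daily, weekly and monthly averages of past realized variance; a sketch (function names and the synthetic coefficients are assumptions, and the realized-covariance version used in the thesis is more involved):

```python
import numpy as np

def fit_har(rv):
    # Build the HAR design: intercept, yesterday's RV, and 5-day and
    # 22-day averages of past RV; fit by ordinary least squares.
    rv = np.asarray(rv, dtype=float)
    X, y = [], []
    for t in range(22, len(rv)):
        X.append([1.0, rv[t - 1], rv[t - 5:t].mean(), rv[t - 22:t].mean()])
        y.append(rv[t])
    beta, *_ = np.linalg.lstsq(np.array(X), np.array(y), rcond=None)
    return beta   # [const, beta_daily, beta_weekly, beta_monthly]

# Synthetic check: a series generated exactly from known HAR coefficients.
rng = np.random.default_rng(1)
rv = list(rng.uniform(0.5, 1.5, 22))
for _ in range(300):
    rv.append(0.1 + 0.4 * rv[-1] + 0.3 * np.mean(rv[-5:])
              + 0.2 * np.mean(rv[-22:]))
beta = fit_har(rv)   # recovers approximately [0.1, 0.4, 0.3, 0.2]
```

The three averaging horizons are what let this parsimonious model capture the slowly decaying autocorrelation of realized variance that the VaR forecasts rely on.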
