381

Causalidade Granger em medidas de risco / Granger Causality with Risk Measures

Patricia Nagami Murakami 02 May 2011 (has links)
This work presents a study of bivariate Granger causality in risk applied to financial time series. In the case of financial series, risk events are related to the assessment of the Value at Risk of positions in assets. For that purpose, CAViaR models, which belong to the family of quantile regression models, were used to identify these events. The main concepts involved in the modelling are presented, together with the definitions needed to understand them. By analysing Granger causality in risk between two series, we can investigate whether one of them is able to predict the occurrence of an extreme value of the other. The usual Granger causality analysis was also carried out, but only for comparison. / Quantile Regression, Value at Risk, CAViaR Model, Granger Causality, Granger Causality in Risk
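A rough sketch of the underlying idea (not the thesis's CAViaR-based test): flag Value-at-Risk violations in two simulated return series and check, via a simple lagged regression, whether risk events in one series help predict risk events in the other. The data, the 250-day window and the 5% level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical daily return series (placeholders for real financial data):
# y loads on lagged x, so risk events in x should help predict risk events in y.
n = 1500
x = rng.standard_t(df=5, size=n) * 0.01
y = 0.4 * np.roll(x, 1) + rng.standard_t(df=5, size=n) * 0.01

def var_hits(returns, alpha=0.05, window=250):
    """1 when a return breaches the rolling historical-simulation VaR."""
    hits = np.zeros(returns.size, dtype=int)
    for t in range(window, returns.size):
        var_t = np.quantile(returns[t - window:t], alpha)  # left-tail quantile
        hits[t] = int(returns[t] < var_t)
    return hits[window:]

hx, hy = var_hits(x), var_hits(y)

# Crude check: regress today's y-hits on yesterday's y-hits and x-hits.
# A clearly nonzero coefficient on lagged x-hits is (informally) suggestive of
# Granger causality in risk from x to y; the thesis uses formal tests instead.
Y = hy[1:]
X = np.column_stack([np.ones(Y.size), hy[:-1], hx[:-1]])
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
print("coefficient on lagged x-hits:", round(float(beta[2]), 4))
```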
382

Ensaios em modelagem de dependência em séries financeiras multivariadas utilizando cópulas

Tófoli, Paula Virgínia January 2013 (has links)
This work was motivated by the strong demand for more precise and realistic dependence models for applications to multivariate financial data. The recent financial crisis of 2007-2009 made clear how important precise modeling of dependence is for the accurate assessment of financial risk: misperceptions about extreme dependencies between different financial assets were an important element of the subprime crisis. The famous theorem by Sklar (1959) introduced copulas as a tool to model more intricate patterns of dependence. It states that any n-dimensional joint distribution function can be decomposed into its n marginal distributions and a copula, where the latter completely characterizes the dependence among the variables. While there is a variety of bivariate copula families that can match a wide range of complex dependencies, the set of higher-dimensional copulas was quite restricted until recently. Joe (1996) proposed a construction of multivariate distributions based on pair-copulas (bivariate copulas), called the pair-copula construction or vine copula model, that has overcome this issue. In this thesis, we develop three papers that explore copula theory in order to obtain very flexible multivariate dependence models for applications to financial data. Patton (2006) extended Sklar's theorem to the conditional case and rendered the dependence parameter of the copula time-varying. In the first paper, we introduce a new approach to modeling dependence between international financial returns over time, combining time-varying copulas and the Markov switching model. We apply these copula models and also those proposed by Patton (2006), Jondeau and Rockinger (2006) and Silva Filho et al. (2012a) to the return data of the FTSE 100, CAC 40 and DAX indexes. We compare these methodologies in terms of the resulting dynamics of dependence and the models' abilities to forecast Value-at-Risk (VaR). Interestingly, all the models identify a long period of high dependence between the returns beginning in 2007, when the subprime crisis was evolving. Surprisingly, the elliptical copulas perform best in forecasting the extreme quantiles of the portfolio returns. In the second paper, we extend our study to the case of n > 2 variables, using the vine copula model to investigate the dependence structure of the broad stock market indexes CAC 40, DAX, FTSE 100, S&P 500 and IBOVESPA, and, particularly, to check the asymmetric dependence hypothesis in this case. Based on our empirical results, however, this hypothesis cannot be verified. Perhaps asymmetric dependence with stronger lower tails occurs only temporarily, which suggests that incorporating time variation into the vine copula model can improve it as a tool to model multivariate international financial data. So, in the third paper, we introduce dynamics into the vine copula model by allowing the dependence parameters of the pair-copulas in a D-vine decomposition to be potentially time-varying, following a nonlinear restricted ARMA(1,m) process as in Patton (2006). The proposed model is evaluated in simulations and further assessed with respect to the accuracy of Value-at-Risk (VaR) forecasts in crisis periods. The Monte Carlo experiments are quite favorable to the dynamic D-vine copula in comparison with a static D-vine copula. Moreover, the dynamic D-vine copula outperforms the static D-vine copula in terms of predictive accuracy for our data sets.
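As a much simpler, static illustration of the copula idea in this abstract (not the time-varying or vine models developed in the thesis), the sketch below samples a bivariate Clayton copula, attaches illustrative Student-t marginals, and reads off a portfolio VaR; all parameter values are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
theta = 2.0            # illustrative Clayton parameter (lower-tail dependence)
n_sim = 100_000

# Marshall-Olkin sampling of a bivariate Clayton copula:
# U_i = (1 + E_i / V)^(-1/theta), with V ~ Gamma(1/theta) and E_i ~ Exp(1).
v = rng.gamma(shape=1.0 / theta, scale=1.0, size=n_sim)
e = rng.exponential(size=(n_sim, 2))
u = (1.0 + e / v[:, None]) ** (-1.0 / theta)

# Illustrative Student-t marginals for two equity-index returns.
r = stats.t.ppf(u, df=5) * 0.01
port = 0.5 * r[:, 0] + 0.5 * r[:, 1]      # equally weighted portfolio

print("simulated 1-day 99% VaR (loss):", -np.quantile(port, 0.01))
```

The same two-step recipe (sample uniforms from a copula, transform through the marginal quantile functions) is what makes copula models attractive for joint risk scenarios.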
383

  • Value at risk e expected shortfall: medidas de risco e suas propriedades: um estudo empírico para o mercado brasileiro

Moraes, Camila Corrêa 29 January 2013 (has links)
Value at Risk (VaR) and Expected Shortfall (ES) are quantitative models to measure the market risk of portfolios of financial assets. The purpose of this study is to evaluate the results of these models for a portfolio traded in the Brazilian market through four backtesting methods - the Basel Traffic Light Test, the Kupiec test, the Christoffersen test and the McNeil and Frey test - covering periods of domestic (2002) and international (2008) financial crisis. The VaR model presented here follows two approaches: Parametric Normal, where asset returns are assumed to follow a Normal distribution, and Historical Simulation, where no assumption is made about the distribution of asset returns other than that they are independent and identically distributed. The VaR results were also evaluated with the Cornish-Fisher expansion, which approximates the empirical distribution to a Normal distribution using the sample skewness and kurtosis. Another feature examined was coherence, which asks whether the risk measure satisfies four basic axioms: monotonicity, translation invariance, homogeneity and subadditivity. VaR is not considered a coherent risk measure because it does not satisfy subadditivity in all cases; ES, on the other hand, satisfies all four axioms and is therefore a coherent risk measure. The ES model was evaluated under the Parametric Normal approach. This work also verified, through the backtests, whether the coherence property improves the accuracy of the analyzed risk measures.
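A minimal sketch of the three VaR approaches and the parametric-normal ES compared in this dissertation, run on simulated returns rather than the dissertation's data; the Cornish-Fisher quantile uses the standard skewness/kurtosis expansion and all figures are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
r = rng.standard_t(df=4, size=2500) * 0.012      # illustrative daily returns
alpha = 0.01                                      # 99% confidence level

mu, sigma = r.mean(), r.std(ddof=1)
z = stats.norm.ppf(alpha)

# Parametric-normal VaR and ES (losses reported as positive numbers).
var_norm = -(mu + z * sigma)
es_norm = -(mu - sigma * stats.norm.pdf(z) / alpha)

# Historical-simulation VaR: the empirical left-tail quantile.
var_hist = -np.quantile(r, alpha)

# Cornish-Fisher VaR: adjust the normal quantile with sample skewness/kurtosis.
s = stats.skew(r)
k = stats.kurtosis(r)                             # excess kurtosis
z_cf = (z + (z**2 - 1) * s / 6
          + (z**3 - 3 * z) * k / 24
          - (2 * z**3 - 5 * z) * s**2 / 36)
var_cf = -(mu + z_cf * sigma)

print(f"normal VaR {var_norm:.4f}  hist VaR {var_hist:.4f}  "
      f"CF VaR {var_cf:.4f}  normal ES {es_norm:.4f}")
```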
384

Modelagem de perdas com ações trabalhistas em instituições financeiras

Rachman, Luciano 07 August 2013 (has links)
According to Basel, labor-claim losses in financial institutions represent a substantial amount that should be taken into account in the regulatory capital model for operational risk. This dissertation demonstrates a way to measure the risk to which financial institutions are exposed in this type of loss. Several types of distributions are analyzed according to their fit to both the frequency and the severity of losses. For the frequency values, a sample of actual data was obtained, while for the severity, values were taken from research-institute reports and served as input for calculating labor claims under the Brazilian legislation in force in the CLT (Consolidation of Labor Laws).
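The generic loss-distribution approach behind this kind of model can be sketched as follows; the Poisson frequency, lognormal severity and all parameter values are assumptions for illustration only, not the distributions selected in the dissertation.

```python
import numpy as np

rng = np.random.default_rng(3)

lam = 120                # hypothetical mean number of labor claims per year
mu, sigma = 10.0, 1.2    # hypothetical lognormal severity parameters (log scale)
n_years = 20_000

# Aggregate loss per simulated year: the sum of N individual lognormal losses,
# where N is Poisson distributed (frequency x severity compound model).
n_claims = rng.poisson(lam, size=n_years)
agg = np.array([rng.lognormal(mu, sigma, size=n).sum() for n in n_claims])

print(f"mean aggregate annual loss: {agg.mean():,.0f}")
print(f"99.9% quantile (capital-style figure): {np.quantile(agg, 0.999):,.0f}")
```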
385

Tail Empirical Processes: Limit Theorems and Bootstrap Techniques, with Applications to Risk Measures

Loukrati, Hicham 07 May 2018 (has links)
In recent years, important changes in the insurance and finance industries have drawn increasing attention to the need for a standardized framework for risk measurement. Recently, there has been growing interest among insurance practitioners in the conditional tail expectation (CTE), because it has properties considered desirable and applicable in a variety of situations. In particular, it satisfies the requirements of a "coherent" risk measure in the sense of Artzner [2]. This thesis contributes to statistical inference by developing tools, based on the convergence of functional integrals, for the estimation of the CTE, which are of considerable interest to actuarial science. We first develop a tool for estimating the conditional mean E[X|X > x], then construct estimators of the CTE, develop the asymptotic theory required for these estimators, and use that theory to build confidence intervals. For the first time, the nonparametric bootstrap approach is explored in this thesis, yielding new results applicable to the Value at Risk (VaR) and the CTE. Simulation studies illustrate the performance of the bootstrap technique.
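A naive empirical counterpart of the quantities studied here (only an illustration, not the estimators or the asymptotic theory developed in the thesis): a plug-in CTE estimate E[X|X > VaR] and a nonparametric bootstrap percentile interval, computed on simulated heavy-tailed losses.

```python
import numpy as np

rng = np.random.default_rng(4)
losses = rng.pareto(3.0, size=5000) * 10.0      # illustrative heavy-tailed losses

def cte(x, level=0.95):
    """Plug-in estimator of E[X | X > VaR_level(X)]."""
    q = np.quantile(x, level)
    return x[x > q].mean()

point = cte(losses)

# Nonparametric bootstrap: resample the losses with replacement, recompute the
# CTE each time, and take percentiles of the bootstrap distribution.
boot = np.array([cte(rng.choice(losses, size=losses.size, replace=True))
                 for _ in range(2000)])
lo, hi = np.quantile(boot, [0.025, 0.975])
print(f"CTE estimate {point:.2f}, 95% bootstrap interval ({lo:.2f}, {hi:.2f})")
```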
386

Methods of optimizing investment portfolios

Seepi, Thoriso P.J. January 2013 (has links)
>Magister Scientiae - MSc / In this thesis, we discuss methods for optimising the expected rate of return of a portfolio with minimal risk. As part of the work we look at Modern Portfolio Theory, which tries to maximise the portfolio's expected rate of return for a certain amount of risk. We also use Quadratic Programming to optimise portfolios. It is generally recognised that portfolios with a high expected return carry higher risk. Modern Portfolio Theory assists when choosing portfolios with the lowest possible risk. There is a finite number of assets in a portfolio, and we therefore want to allocate them in such a way that we are able to optimise the expected rate of return with minimal risk. We also use the Markowitz approach to allocate these assets. The Capital Asset Pricing Model is used as well, which helps us reduce our efficient portfolio to a single portfolio. Furthermore, we use the Black-Litterman model to try to optimise our portfolio with a view to understanding current market conditions, as well as considering how the market will perform in the future. An additional tool we use is Value at Risk, which enables us to manage market risk. To this end, we follow the three basic approaches from Jorion [Value at Risk. USA: McGraw-Hill, 2001]. The Value at Risk tool has become essential in calculating a portfolio's risk over the last decade. It works by monitoring algorithms in order to find the worst possible scenarios within the portfolio. We perform several numerical experiments in MATLAB and Microsoft Excel, and these are presented in the thesis with the relevant descriptions.
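A compact sketch of the Markowitz/quadratic-programming step described above, using scipy's SLSQP solver in place of the thesis's MATLAB/Excel experiments; the expected returns, covariance matrix and target return are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative expected returns and covariance matrix for four assets.
mu = np.array([0.08, 0.10, 0.12, 0.07])
cov = np.array([[0.040, 0.006, 0.012, 0.002],
                [0.006, 0.090, 0.015, 0.003],
                [0.012, 0.015, 0.120, 0.004],
                [0.002, 0.003, 0.004, 0.010]])
target = 0.09

# Markowitz problem: minimize w' C w  subject to  w'mu = target, sum(w) = 1, w >= 0.
res = minimize(
    fun=lambda w: w @ cov @ w,
    x0=np.full(4, 0.25),
    bounds=[(0.0, 1.0)] * 4,
    constraints=[{"type": "eq", "fun": lambda w: w @ mu - target},
                 {"type": "eq", "fun": lambda w: w.sum() - 1.0}],
    method="SLSQP",
)
print("optimal weights:", np.round(res.x, 3))
print("portfolio volatility:", float(np.sqrt(res.x @ cov @ res.x)))
```

Sweeping the target return and re-solving traces out the efficient frontier that the abstract refers to.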
387

Řízení kurzového rizika výrobního podniku / Hedging of currency risk of manufacturing company

Fomina, Elena January 2017 (has links)
This thesis aims to create a currency-risk hedging strategy for an exporting company. The main reason for hedging is the possible losses that can be triggered by changes in the exchange rate. When exchange rates move, an exporting company may face three different types of exposure: transaction, translation and economic exposure. This thesis concentrates on transaction exposure and builds a hedging strategy for the exporting company AAA a.s. The firm is analyzed both qualitatively and quantitatively, presented in the form of a historical overview of the company and its position within an international group. Based on this analysis, as well as on the theoretical findings, a hedging strategy for AAA a.s. is proposed. The strategy uses both external and internal means of hedging.
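A tiny numerical illustration of hedging transaction exposure with a forward contract; all figures are hypothetical and are not taken from the thesis or from AAA a.s.

```python
# An exporter expects a EUR receivable and has CZK costs. A forward contract
# fixes the CZK proceeds regardless of where the spot rate ends up.
receivable_eur = 1_000_000              # amount to be received in 3 months
forward_rate = 24.80                    # hypothetical CZK per EUR locked in today
spot_scenarios = [23.50, 24.80, 26.00]  # possible spot rates at settlement

for spot in spot_scenarios:
    unhedged = receivable_eur * spot
    hedged = receivable_eur * forward_rate
    print(f"spot {spot:.2f}: unhedged CZK {unhedged:,.0f}  hedged CZK {hedged:,.0f}")
```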
388

Využití teorie extrémních hodnot při řízení operačních rizik / Extreme Value Theory in Operational Risk Management

Vojtěch, Jan January 2009 (has links)
Currently, financial institutions are supposed to analyze and quantify a new type of banking risk, known as operational risk, to which they are exposed in their everyday activities. The main objective of this work is to construct an acceptable statistical model for computing the capital requirement. Such a model must respect the specific nature of losses arising from operational risk events. The fundamental task is to find a suitable distribution that describes the probabilistic behavior of losses arising from this type of risk. The work relies heavily on the Pickands-Balkema-de Haan theorem of extreme value theory: roughly speaking, the distribution of a random variable's exceedances over a given high threshold converges to the generalized Pareto distribution. The theorem is subsequently used to estimate a high percentile from a simulated distribution. The simulated distribution is a compound model for the aggregate loss random variable, constructed as a combination of a frequency distribution for the number of losses and a so-called severity distribution for the individual loss. The proposed model is then used to estimate a final quantile, which represents the sought amount of capital requirement. This capital requirement is the amount of funds the bank is supposed to retain in order to make up for a projected lack of funds; there is a given, commonly quite small, probability that the capital charge will be exceeded. Although combining some frequency distribution with some severity distribution is the common way to deal with the described problem, the final application is often problematic. Typically, the severity distribution is a combination of two or three, for instance, lognormal distributions with different location and scale parameters. Models like these usually lack a theoretical background and, in particular, the connecting of the distribution functions is often not conducted in the proper way. In this work, we deal with both problems. In addition, maximum likelihood estimates are derived for a lognormal distribution for which F_LN(u) = p holds, where u and p are given. The results achieved can be used in the everyday practice of financial institutions for operational risk quantification, as well as for the analysis of a variety of sample data with so-called heavy tails, where standard distributions do not offer any help. As an integral part of this work, a CD with the source code of each function used in the model is included. All of these functions were created in the S statistical programming language, in the S-PLUS software. The fourth annex contains a complete description of each function, its purpose, and the general syntax for possible use in solving different kinds of problems.
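A short peaks-over-threshold sketch of the core step described above, written in Python rather than the thesis's S-PLUS code: fit a generalized Pareto distribution to exceedances over a high threshold and estimate a far quantile. The data, the threshold choice and the 99.9% level are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
losses = rng.lognormal(mean=np.log(2000.0), sigma=1.6, size=10_000)  # toy loss data

u = np.quantile(losses, 0.95)              # illustrative threshold choice
exceed = losses[losses > u] - u

# Fit a generalized Pareto distribution to the exceedances (location fixed at 0),
# as suggested by the Pickands-Balkema-de Haan theorem.
xi, _, beta = stats.genpareto.fit(exceed, floc=0.0)

# Peaks-over-threshold estimate of a high quantile (formula assumes xi != 0).
p = 0.999
n, n_u = losses.size, exceed.size
q_pot = u + beta / xi * (((n / n_u) * (1 - p)) ** (-xi) - 1.0)

print(f"xi = {xi:.3f}, beta = {beta:.1f}")
print(f"POT estimate of the 99.9% quantile: {q_pot:,.0f}")
print(f"empirical 99.9% quantile:           {np.quantile(losses, p):,.0f}")
```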
389

[en] RISK ANALYSIS OF NON-LINEAR PORTFOLIOS: AN APPLICATION TO THE OIL AND ENERGY MARKET / [pt] ANÁLISE DE RISCO PARA CARTEIRAS NÃO LINEARES: UMA APLICAÇÃO AO MERCADO DE ENERGIA E PETRÓLEO

JOANA GOMES AZARA DE OLIVEIRA 07 April 2014 (has links)
[en] In the 1970s, the market saw a big change in the knowledge about derivatives; from this period, the research of Fischer Black, Myron Scholes and Robert Merton on option pricing stands out. Since then, much research has been done aiming to find an ideal metric for the risk assessment of non-linear portfolios, as there is still no consensus on a metric for these portfolios whose acceptance could be compared to that of the 1990s VaR for linear portfolios. This work compares the efficiency of several methodologies for risk assessment in portfolios containing WTI (West Texas Intermediate) options. The risk is calculated using different methodologies presented in academic studies, and the result of each is compared to the full-revaluation assessment using the full Monte Carlo method in order to determine their efficiency.
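To make the contrast concrete, here is a single-option sketch (Black-Scholes pricing, all inputs invented) comparing full Monte Carlo revaluation with a linear delta approximation; it is a simplified stand-in for the broader set of methodologies compared in the dissertation.

```python
import numpy as np
from scipy import stats

def bs_call(s, k, r, sigma, t):
    """Black-Scholes price of a European call."""
    d1 = (np.log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * np.sqrt(t))
    d2 = d1 - sigma * np.sqrt(t)
    return s * stats.norm.cdf(d1) - k * np.exp(-r * t) * stats.norm.cdf(d2)

rng = np.random.default_rng(6)
s0, k, r, sigma, t = 80.0, 85.0, 0.03, 0.35, 0.25   # hypothetical WTI call inputs
h = 1 / 252                                          # 1-day horizon
c0 = bs_call(s0, k, r, sigma, t)

# Full Monte Carlo: simulate the underlying, reprice the option, take the loss quantile.
z = rng.standard_normal(100_000)
s1 = s0 * np.exp((r - 0.5 * sigma**2) * h + sigma * np.sqrt(h) * z)
full_pnl = bs_call(s1, k, r, sigma, t - h) - c0
var_full = -np.quantile(full_pnl, 0.01)

# Linear (delta-only) approximation ignores the option's convexity.
eps = 1e-4
delta = (bs_call(s0 + eps, k, r, sigma, t) - bs_call(s0 - eps, k, r, sigma, t)) / (2 * eps)
lin_pnl = delta * (s1 - s0)
var_lin = -np.quantile(lin_pnl, 0.01)

print(f"full-revaluation 99% VaR {var_full:.3f}  vs  delta-only 99% VaR {var_lin:.3f}")
```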
390

Comparing Approximations for Risk Measures Related to Sums of Correlated Lognormal Random Variables

Karniychuk, Maryna 30 November 2006 (has links)
In this thesis, the performance of different approximations is compared for a standard actuarial and financial problem: the estimation of quantiles and conditional tail expectations of the final value of a series of discrete cash flows. To calculate risk measures such as quantiles and Conditional Tail Expectations, one needs the distribution function of the final wealth. The final value of a series of discrete payments in the considered model is a sum of dependent lognormal random variables; unfortunately, its distribution function cannot be determined analytically, so one usually has to resort to time-consuming Monte Carlo simulations. Because computational time remains a serious drawback of Monte Carlo simulation, several analytical techniques for approximating the distribution function of final wealth are proposed within this thesis: the widely used moment-matching approximations and innovative comonotonic approximations. Moment-matching methods approximate the unknown distribution function by a given one in such a way that some characteristics (here the first two moments) coincide. The ideas of two well-known approximations are described briefly, and analytical formulas for valuing quantiles and Conditional Tail Expectations are derived for both. Recently, a large group of researchers from the Catholic University of Leuven in Belgium derived comonotonic upper and lower bounds for sums of dependent lognormal random variables; these are bounds in terms of "convex order". To provide the theoretical background for the comonotonic approximations, several fundamental ordering concepts, such as stochastic dominance, stop-loss order and convex order, and some important relations between them are introduced. The last two concepts are closely related: both stochastic orders express which of two random variables is the "less dangerous/more attractive" one. The central idea of the comonotonic upper bound approximation is to replace the original sum representing final wealth by a new sum whose components have the same marginal distributions as those of the original sum but a "more dangerous/less attractive" dependence structure. The upper bound, or, mathematically speaking, the convex-largest sum, is obtained when the components of the sum are the components of a comonotonic random vector. Therefore, the fundamental concepts of comonotonicity theory that are important for the derivation of convex bounds are introduced, and the most widespread examples of comonotonicity arising in a financial context are described. In addition to the upper bound, a lower bound can be derived as well, which provides a measure of the reliability of the upper bound. The lower bound approach is based on the technique of conditioning: it is obtained by applying Jensen's inequality for conditional expectations to the original sum of dependent random variables. Two slightly different versions of the conditioning random variable are considered in this thesis; they give rise to two approaches referred to as the comonotonic lower bound and the comonotonic "maximal variance" lower bound approaches. Special attention is given to the class of distortion risk measures. It is shown that the quantile risk measure, as well as the Conditional Tail Expectation (under some additional conditions), belongs to this class. It is proved that both risk measures under consideration are additive for a sum of comonotonic random variables, i.e. the quantile and the Conditional Tail Expectation of the comonotonic upper and lower bounds can easily be obtained by summing the corresponding risk measures of the marginals involved. A special subclass of distortion risk measures, referred to as the class of concave distortion risk measures, is also considered. It is shown that the quantile risk measure is not a concave distortion risk measure, while the Conditional Tail Expectation (under some additional conditions) is. A theoretical justification is given for the fact that the "concave" Conditional Tail Expectation preserves the convex order relation between random variables; this property does not necessarily hold for the quantile risk measure, as it is not a concave distortion risk measure. Finally, the accuracy and efficiency of the two moment-matching, comonotonic upper bound, comonotonic lower bound and "maximal variance" lower bound approximations are examined for a wide range of parameters by comparison with results obtained by Monte Carlo simulation. The numerical results show that, generally, in the present setting the lower bound approach outperforms the other methods. Moreover, the preservation of the convex order relation between the convex bounds for the final wealth by the Conditional Tail Expectation is demonstrated numerically, and it is shown numerically that this property does not necessarily hold for the quantile.
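A small numerical illustration of the comonotonic upper bound discussed above (not the thesis's full comparison): for a sum of correlated lognormals, the Conditional Tail Expectation of the comonotonic bound is the sum of the marginal CTEs and dominates the Monte Carlo CTE of the true sum, consistent with the convex-order argument in the abstract. All parameters are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
p = 0.995

# Hypothetical sum of 5 dependent lognormals (e.g. discounted cash flows).
mu = np.linspace(0.0, 0.4, 5)
sigma = np.full(5, 0.3)
rho = 0.5
corr = np.full((5, 5), rho) + (1 - rho) * np.eye(5)

# Monte Carlo CTE of the true correlated sum.
z = rng.multivariate_normal(np.zeros(5), corr, size=200_000)
s = np.exp(mu + sigma * z).sum(axis=1)
q_mc = np.quantile(s, p)
cte_mc = s[s > q_mc].mean()

# Comonotonic upper bound: the CTE is additive for comonotonic sums, and the
# CTE preserves convex order, so the bound's CTE dominates the true CTE.
# For a lognormal, CTE_p = exp(mu + sigma^2/2) * Phi(sigma - z_p) / (1 - p).
z_p = stats.norm.ppf(p)
cte_upper = np.sum(np.exp(mu + 0.5 * sigma**2) * stats.norm.cdf(sigma - z_p) / (1 - p))

print(f"Monte Carlo CTE {cte_mc:.3f}  <=  comonotonic upper-bound CTE {cte_upper:.3f}")
```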
