  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

A Risk-Oriented Clustering Approach for Asset Categorization and Risk Measurement

Liu, Lu 18 July 2019 (has links)
When faced with market risk in investments and portfolios, practitioners typically compute a risk measure, which maps each random payoff to a real number. There are many ways to quantify potential risk, and the most important input is the distribution of future performance. Because future distributions are unknown, they are usually estimated from historical Profit and Loss (P&L) distributions. However, past data may not be representative of the future, so risk measures derived from a single historical distribution can be subject to error. One natural remedy is to identify and group similar assets whose P&L distributions can serve as alternative scenarios. In practice, one of the most common and intuitive groupings is the industry sector. It is widely agreed that companies in the same sector share the same, or related, business types and operating characteristics. In risk management, however, sector-based categorization does not necessarily group assets by their risk profiles, and we show that risk measures within the same sector exhibit large variation. Although improved risk measures that account for distribution ambiguity have been discussed at length, we instead develop a more risk-oriented categorization by proposing a new clustering approach. Our method better informs us of the potential risk, including the extreme worst-case scenario, within each category.
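The abstract does not spell out the clustering algorithm itself; purely as a hypothetical illustration of the general idea of grouping assets by risk features of their historical P&L rather than by sector, the sketch below runs k-means on per-asset VaR and Expected Shortfall estimates. The data, feature choice, and parameters are assumptions for illustration, not the author's method.

```python
# Hypothetical sketch: cluster assets by risk features of their historical P&L
# (per-asset 95% VaR and Expected Shortfall) instead of by industry sector.
# Illustration of the general idea only, not the thesis's algorithm.
import numpy as np
from sklearn.cluster import KMeans

def var_es(returns, alpha=0.95):
    """Empirical VaR and Expected Shortfall of the loss (-returns) at level alpha."""
    losses = -np.asarray(returns)
    var = np.quantile(losses, alpha)
    es = losses[losses >= var].mean()
    return var, es

rng = np.random.default_rng(0)
# Simulated daily returns for 20 assets over 1000 days (placeholder data).
returns = rng.standard_t(df=4, size=(1000, 20)) * 0.01

features = np.array([var_es(returns[:, j]) for j in range(returns.shape[1])])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
print(dict(zip(range(returns.shape[1]), labels)))  # asset index -> risk cluster
```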
2

Coherent Distortion Risk Measures in Portfolio Selection

Feng, Ming Bin January 2011 (has links)
The theme of this thesis is solving optimal portfolio selection problems using linear programming. There are two key contributions. The first is to generalize the well-known linear optimization framework for Conditional Value-at-Risk (CVaR)-based portfolio selection problems (see Rockafellar and Uryasev (2000, 2002)) to a much broader class of risk-measure portfolio selection problems. In particular, the class of risk measures under consideration, the Coherent Distortion Risk Measure (CDRM), is the intersection of two well-known classes of risk measures in the literature: the Coherent Risk Measure (CRM) and the Distortion Risk Measure (DRM). In addition to CVaR, risk measures belonging to CDRM include the Wang Transform (WT) measure, the Proportional Hazard (PH) transform measure, and the lookback (LB) distortion measure. Our generalization implies that portfolio selection problems can be solved very efficiently with the linear programming approach over a much wider class of risk measures. The second contribution is to establish the equivalences among four formulations of CDRM optimization problems: return maximization subject to a CDRM constraint, CDRM minimization subject to a return constraint, return-CDRM utility maximization, and CDRM-based Sharpe ratio maximization. The equivalences are established in the sense that the four formulations trace out the same efficient frontier as their parameters are varied. We point out that the first three formulations have already been investigated in Krokhmal et al. (2002) under milder assumptions on the risk measure (a convex functional of the portfolio weights); here we apply their results to CDRM and establish the fourth equivalence. For each formulation, we explore the relationship between its given parameter and the implied parameters of the other three formulations. These equivalences and relationships can help verify consistencies (or inconsistencies) in risk management with different objectives and constraints, and can also help uncover the information implied by a decision-making process or by a given investment market. We conclude the thesis with two case studies that illustrate the methodology and implementation of our linear optimization approach, verify the equivalences among the four problem formulations, and investigate the properties of different members of CDRM. In addition, we examine the efficiency (or inefficiency) of the so-called 1/n portfolio strategy in terms of the trade-off between portfolio return and portfolio CDRM. The properties of the optimal portfolios and their returns under different CDRM minimization problems are compared through numerical results.
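For readers unfamiliar with the linear optimization framework the abstract generalizes, the standard Rockafellar-Uryasev scenario formulation of CVaR can be summarized as follows. This is textbook background rather than an excerpt from the thesis; the budget constraint is shown only as a representative linear constraint.

```latex
% Discrete Rockafellar--Uryasev representation of CVaR over N equally likely
% return scenarios r_i, with portfolio weights w (loss in scenario i: -r_i^T w):
\mathrm{CVaR}_\alpha(w) \;=\; \min_{t\in\mathbb{R}}\;
  t \;+\; \frac{1}{(1-\alpha)N}\sum_{i=1}^{N}\bigl[-r_i^{\top}w - t\bigr]_{+}

% Introducing auxiliary variables z_i >= 0 linearizes the positive parts, so
% minimizing CVaR over w becomes a linear program:
\min_{w,\,t,\,z}\; t + \frac{1}{(1-\alpha)N}\sum_{i=1}^{N} z_i
\quad\text{s.t.}\quad
  z_i \ge -r_i^{\top}w - t,\qquad z_i \ge 0,\qquad \mathbf{1}^{\top}w = 1.
```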
3

Terrorism and market risk assessment

Lacroix, Jean January 2015 (has links)
Master's thesis, Charles University in Prague, Faculty of Social Sciences, Institute of Economic Studies. Advisor: Mgr. Magdalena Patakova. 77 pages. Specialization: Economics (CFS). Abstract: Terrorist attacks are among the best examples of a fast-evolving institutional framework. In such a context, investors are hit by a large amount of information within a short period of time. This disturbs trading behavior and, consequently, the distribution of returns in the period following the attack (the information was not anticipated and directly affects investment choices). The present thesis focuses on the risk aspect of such disturbances. If terrorist attacks reshape the distribution of returns, they may modify risk measures, both multivariate and univariate. Because of the particular nature of the change in distribution, the effect will not be uniform across financial risk indicators. First, a distinction exists between univariate...
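The abstract is truncated before the methodology is described, so the following is only a generic, hypothetical illustration of the kind of comparison it refers to: historical VaR and Expected Shortfall estimated on return windows before and after an event date. The data, window lengths, and confidence level are placeholders, not the thesis's design.

```python
# Hypothetical illustration only: compare univariate risk measures estimated
# on return windows before and after an event date. Not the thesis's method.
import numpy as np

def hist_var_es(returns, alpha=0.99):
    """Empirical VaR and Expected Shortfall of losses (-returns) at level alpha."""
    losses = np.sort(-np.asarray(returns))
    k = int(np.ceil((1 - alpha) * len(losses)))
    return losses[-k], losses[-k:].mean()

rng = np.random.default_rng(1)
pre_event = rng.normal(0.0, 0.010, size=250)      # placeholder "calm" window
post_event = rng.normal(-0.001, 0.025, size=60)   # placeholder "stressed" window

for name, window in [("pre-event", pre_event), ("post-event", post_event)]:
    var, es = hist_var_es(window)
    print(f"{name}: VaR99 = {var:.4f}, ES99 = {es:.4f}")
```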
4

Asymptotics for Risk Measures of Extreme Risks

Yang, Fan 01 July 2013 (has links)
This thesis focuses on measuring extreme risks in insurance business. We mainly use extreme value theory to develop asymptotics for risk measures, and we also study the characterization of upper comonotonicity for multiple extreme risks. Firstly, we derive asymptotics for the Haezendonck-Goovaerts (HG) risk measure of extreme risks at high confidence levels, which serve as an alternative to statistical simulation. We split the study into two steps. In the first step, we concentrate on the HG risk measure with a power Young function, which yields a certain explicitness, and derive asymptotics separately for a risk variable whose distribution function belongs to each of the three max-domains of attraction. In the second step, we extend the asymptotic study to the HG risk measure with a general Young function, using different approaches and overcoming a number of technical difficulties. The risk variable is assumed to follow a distribution function in the max-domain of attraction of the generalized extreme value distribution, and we give a unified proof covering all three max-domains of attraction. Secondly, we study the first- and second-order asymptotics for the tail distortion risk measure of extreme risks. As in the first part, we develop the first-order asymptotics for the tail distortion risk measure of a risk variable whose distribution function belongs to the max-domain of attraction of the generalized extreme value distribution. To improve the accuracy of the first-order asymptotics, we further develop the second-order asymptotics. Numerical examples demonstrate the accuracy of both asymptotics and the substantial improvement offered by the second-order results. Lastly, we characterize upper comonotonicity via the tail convex order. For any given marginal distributions, a maximal random vector with respect to the tail convex order is proved to be upper comonotonic under suitable conditions. As an application, we compute the HG risk measure of the sum of upper comonotonic random variables with exponential marginal distributions. The methodology developed in this thesis is expected to work equally well for generalized quantiles (such as expectiles, Lp-quantiles, ML-quantiles and Orlicz quantiles), for quantile-based risk measures and other risk measures that focus on the tail area, and for capital allocation problems.
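The Haezendonck-Goovaerts risk measure with a power Young function, studied in the first part, can be evaluated numerically on a sample. The sketch below assumes one common formulation, H_q[X] = inf_x { x + h(x) } with h(x) solving E[ phi((X - x)_+ / h) ] = 1 - q, and is an illustrative implementation rather than code from the thesis.

```python
# Sketch (assumed formulation, not code from the thesis): Haezendonck-Goovaerts
# risk measure with power Young function phi(t) = t^k, evaluated on a sample.
#   H_q[X] = inf_x { x + h(x) },  h(x) solving  E[ phi((X - x)_+ / h) ] = 1 - q.
# For phi(t) = t^k this gives h(x) = ( E[(X - x)_+^k] / (1 - q) )^(1/k),
# and k = 1 recovers CVaR/TVaR at level q.
import numpy as np
from scipy.optimize import minimize_scalar

def hg_risk_measure(sample, q=0.95, k=2.0):
    x_arr = np.asarray(sample, dtype=float)

    def objective(x):
        excess = np.maximum(x_arr - x, 0.0)
        h = (np.mean(excess ** k) / (1.0 - q)) ** (1.0 / k)
        return x + h

    res = minimize_scalar(objective, bounds=(x_arr.min(), x_arr.max()),
                          method="bounded")
    return res.fun

rng = np.random.default_rng(2)
losses = rng.pareto(3.0, size=100_000)  # heavy-tailed placeholder losses
print("HG risk measure (q=0.95, k=2):", hg_risk_measure(losses, q=0.95, k=2.0))
print("HG with k=1 (should match empirical CVaR):", hg_risk_measure(losses, k=1.0))
```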
5

Comparative Study Of Risk Measures

Eksi, Zehra 01 August 2005 (has links) (PDF)
There is little doubt that, over the past decade, risk measurement has become one of the most important topics in finance. Such a development is natural, since in the last ten years huge volumes of financial transactions ended in severe losses caused by violent convulsions in financial markets. Value-at-Risk, the most widely used risk measure, fails to quantify the risk of a position accurately in many situations. For this reason a number of consistent risk measures have been introduced in the literature. The main aim of this study is to present and compare coherent, convex, conditional convex and some other risk measures in both theoretical and practical settings.
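A standard illustration of how Value-at-Risk can fail to quantify risk accurately, and why coherent alternatives were introduced, is its lack of subadditivity. The sketch below reproduces the textbook two-loan example (a classroom example, not material from the thesis).

```python
# Textbook illustration (not from the thesis): Value-at-Risk is not subadditive,
# while Expected Shortfall is. Two independent loans each default with
# probability 4%, losing 100; otherwise the loss is 0.
import numpy as np

def var(losses, alpha=0.95):
    return np.quantile(losses, alpha)

def es(losses, alpha=0.95):
    # Mean of the worst (1 - alpha) fraction of losses.
    worst = np.sort(losses)[-int(np.ceil((1 - alpha) * len(losses))):]
    return worst.mean()

rng = np.random.default_rng(3)
n = 1_000_000
loss_a = 100.0 * (rng.random(n) < 0.04)
loss_b = 100.0 * (rng.random(n) < 0.04)

print("VaR95(A) + VaR95(B) =", var(loss_a) + var(loss_b))  # 0: each loan alone looks riskless
print("VaR95(A + B)        =", var(loss_a + loss_b))       # 100: subadditivity violated
print("ES95(A) + ES95(B)   =", es(loss_a) + es(loss_b))
print("ES95(A + B)         =", es(loss_a + loss_b))        # ES rewards diversification
```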
6

Um estudo sobre funções de dependência e medidas de risco / A study on dependence functions and risk measures.

Gonçalves, Marcelo 28 November 2008 (has links)
We begin by studying bounds for a special class of quantile risk measures known as distorted risk measures. The basic assumption is that the risk manager does not know the complete dependence structure (that is, the joint distribution) embedded in the portfolio of risks, which makes exact computation of the risk measure impossible. This is a common scenario in practice. We present two approaches for computing bounds on distorted risk measures in this situation, highlighting the pros and cons of each. Risk modeling in the absence of complete knowledge of the joint distribution often relies on the copula approach. Copulas, however, have been criticized in recent publications, mainly because they ignore the marginal behavior and compress the data into the unit square. To address these problems we present an alternative and complement to the copula approach: the Sibuya dependence function.
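As background for the distorted risk measures discussed here, a distortion risk measure applies a distortion function g to the loss distribution. The sketch below assumes the standard empirical Choquet-integral form and the Wang transform distortion; the distortion choice, parameter, and data are illustrative, not taken from the thesis.

```python
# Sketch under stated assumptions (not code from the thesis): a distortion risk
# measure rho_g[X] computed on an empirical loss sample as a Choquet integral,
#   rho_g[X] ~= sum_i X_(i) * [ g((n-i+1)/n) - g((n-i)/n) ],  X_(1) <= ... <= X_(n),
# using the Wang transform distortion g(u) = Phi(Phi^{-1}(u) + lam).
import numpy as np
from scipy.stats import norm

def wang_distortion(u, lam=0.5):
    return norm.cdf(norm.ppf(u) + lam)

def distortion_risk_measure(losses, g):
    x = np.sort(np.asarray(losses, dtype=float))  # ascending order statistics
    n = len(x)
    survival = np.arange(n, 0, -1) / n            # (n-i+1)/n for i = 1..n
    weights = g(survival) - g(survival - 1.0 / n) # puts more weight on large losses
    return float(np.sum(weights * x))

rng = np.random.default_rng(4)
losses = rng.lognormal(mean=0.0, sigma=1.0, size=200_000)
print("Wang-transform risk measure:", distortion_risk_measure(losses, wang_distortion))
print("lam = 0 check (reduces to the expected loss):",
      distortion_risk_measure(losses, lambda u: wang_distortion(u, lam=0.0)),
      "~", losses.mean())
```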
7

Risk Measures and Dependence Modeling in Financial Risk Management

Eriksson, Kristofer January 2014 (has links)
In financial risk management it is essential to model dependence in markets and portfolios accurately and efficiently. High positive dependence between the assets in a portfolio can be devastating, especially in times of crisis, since losses are then likely to occur in all assets at the same time. Dependence is therefore directly linked to the risk of the portfolio. The risk can be estimated by several different risk measures, for example Value-at-Risk and Expected Shortfall. This paper studies different ways to measure risk and model dependence, both theoretically and empirically. The main focus is on copulas, which provide a way to model and construct complex dependencies. Copulas are a useful tool since they allow the user to specify the marginal distributions separately and then link them together with the copula. However, copulas can be quite complex to understand, and it is not trivial to know which copula to use. An implemented copula model may give the user a "black-box" feeling and introduce severe model risk if the user trusts the model too much and is unaware of what is going on. An alternative is linear correlation, which is also a way to measure dependence. It is a simpler model and is therefore believed to be easier for all users to understand. However, linear correlation is only easy to interpret in the case of elliptical distributions, and when we move away from this assumption (as is usually the case with financial data), clear drawbacks and pitfalls appear. A third model, historical simulation, uses the historical returns of the portfolio and estimates the risk from these data without making any parametric assumptions about the dependence; the dependence is assumed to be incorporated in the historical evolution of the portfolio. This model is very simple and very popular, but it relies more heavily than the other two on the assumption that history will repeat itself and needs many more historical observations to yield good results, with the risk that market dynamics have changed when we look too far back in history. In this paper several copula models are implemented and compared with the historical simulation approach by estimating risk with Value-at-Risk and Expected Shortfall. The copula parameters are also investigated in calm and stressed market periods, which is useful information when performing stress tests. The empirical study indicates that it is difficult to distinguish the parameters between stressed and calm market periods. The overall conclusion is that the choice of model depends on our beliefs about the future distribution: if we believe the distribution is elliptical, a correlation model is good; if it is believed to have a complex dependence structure, the user should turn to a copula model; and if we can assume that history will repeat itself, historical simulation is advantageous.
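To make the copula-versus-historical-simulation comparison concrete, here is a minimal sketch under assumed data and model choices (not the paper's own code): Student-t marginals linked by a Gaussian copula, simulated to estimate portfolio VaR and Expected Shortfall, alongside plain historical simulation on the same data.

```python
# Minimal sketch under assumed data and model choices (not the paper's code):
# portfolio VaR/ES from (i) a Gaussian copula with Student-t marginals and
# (ii) plain historical simulation.
import numpy as np
from scipy.stats import t as student_t, norm

def var_es(losses, alpha=0.99):
    losses = np.sort(losses)
    k = int(np.ceil((1 - alpha) * len(losses)))
    return losses[-k], losses[-k:].mean()   # (VaR, ES)

rng = np.random.default_rng(5)
# Placeholder "historical" daily returns for two assets (1000 days).
hist = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=1000) * 0.01
weights = np.array([0.5, 0.5])

# --- Gaussian copula with fitted Student-t marginals ---
margins = [student_t.fit(hist[:, j]) for j in range(2)]               # (df, loc, scale)
z = norm.ppf([student_t.cdf(hist[:, j], *margins[j]) for j in range(2)]).T
rho = np.corrcoef(z, rowvar=False)                                     # copula correlation
z_sim = rng.multivariate_normal([0, 0], rho, size=100_000)
u_sim = norm.cdf(z_sim)
r_sim = np.column_stack([student_t.ppf(u_sim[:, j], *margins[j]) for j in range(2)])
copula_losses = -(r_sim @ weights)

# --- Historical simulation ---
hist_losses = -(hist @ weights)

for name, losses in [("Gaussian copula", copula_losses), ("Historical sim", hist_losses)]:
    v, e = var_es(losses)
    print(f"{name}: VaR99 = {v:.4f}, ES99 = {e:.4f}")
```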
8

Automated Market Making: Theory and Practice

Othman, Abraham M 15 May 2012 (has links)
Market makers are unique entities in a market ecosystem. Unlike other participants that have exposure (either speculative or endogenous) to potential future states of the world, market making agents either endeavor to secure a risk-free profit or to facilitate trade that would otherwise not occur. In this thesis we present a principled theoretical framework for market making along with applications of that framework to different contexts. We begin by presenting a synthesis of two concepts—automated market making from the artificial intelligence literature and risk measures from the finance literature—that were developed independently. This synthesis implies that the market making agents we develop in this thesis also correspond to better ways of measuring the riskiness of a portfolio—an important application in quantitative finance. We then present the results of the Gates Hillman Prediction Market (GHPM), a fielded large-scale test of automated market making that successfully predicted the opening date of the new computer science buildings at CMU. Ranging over 365 possible opening days, the market’s large event partition required new advances like a novel span-based elicitation interface. The GHPM uncovered some practical flaws of automated market makers; we investigate how to rectify these failures by describing several classes of market makers that are better at facilitating trade in Internet prediction markets. We then shift our focus to notions of profit, and how a market maker can trade to maximize its own account. We explore applying our work to one of the largest and most heavily-traded markets in the world by recasting market making as an algorithmic options trading strategy. Finally, we investigate optimal market makers for fielding wagers when good priors are known, as in sports betting or insurance.
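As concrete background for the automated-market-making framework discussed here, the sketch below implements Hanson's logarithmic market scoring rule (LMSR), the classic cost-function market maker this line of work builds on; its cost function is closely related to the entropic risk measure, one instance of the synthesis between market making and risk measures described in the abstract. This is standard background, not the specific mechanisms developed in the thesis.

```python
# Standard background sketch: Hanson's LMSR cost-function market maker.
# Not the thesis's own mechanisms; parameters are illustrative.
import math

class LMSRMarketMaker:
    """Logarithmic market scoring rule over n mutually exclusive outcomes."""

    def __init__(self, n_outcomes, b=100.0):
        self.b = b                       # liquidity parameter
        self.q = [0.0] * n_outcomes      # outstanding shares per outcome

    def cost(self, q):
        # C(q) = b * log( sum_i exp(q_i / b) )
        return self.b * math.log(sum(math.exp(qi / self.b) for qi in q))

    def prices(self):
        # p_i = exp(q_i / b) / sum_j exp(q_j / b); prices sum to 1.
        expq = [math.exp(qi / self.b) for qi in self.q]
        total = sum(expq)
        return [e / total for e in expq]

    def buy(self, outcome, shares):
        """Charge the trader the cost difference for buying `shares` of `outcome`."""
        new_q = list(self.q)
        new_q[outcome] += shares
        payment = self.cost(new_q) - self.cost(self.q)
        self.q = new_q
        return payment

mm = LMSRMarketMaker(n_outcomes=3, b=100.0)
print("initial prices:", mm.prices())                     # uniform: [1/3, 1/3, 1/3]
print("pay to buy 50 shares of outcome 0:", round(mm.buy(0, 50), 4))
print("prices after trade:", mm.prices())                 # price of outcome 0 rises
```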
