81 |
Dynamic optimal portfolios benchmarking the stock market. Gabih, Abdelali; Richter, Matthias; Wunderlich, Ralf. 06 October 2005
The paper investigates dynamic optimal portfolio strategies of utility-maximizing portfolio managers in the presence of risk constraints. In particular, we consider
the risk that the terminal wealth of the portfolio falls short of a certain benchmark level which is proportional to the stock price. This risk is measured by the
Expected Utility Loss. We generalize the findings of our previous papers to this case.
Using the Black-Scholes model of a complete financial market and applying martingale methods, analytic expressions for the optimal terminal wealth and the optimal
portfolio strategies are given. Numerical examples illustrate the analytic results.
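In generic notation, the type of problem studied here can be sketched as follows (a schematic formulation assuming a utility function U, state-price density ξ_T, initial capital x_0, benchmark B_T proportional to the stock price, and loss tolerance ε; an illustration, not necessarily the authors' exact setup):

```latex
% Utility maximization under an Expected Utility Loss constraint (schematic):
\max_{X_T}\ \mathbb{E}\big[U(X_T)\big]
\quad \text{subject to} \quad
\mathbb{E}\big[\xi_T X_T\big] \le x_0 ,
\qquad
\mathbb{E}\big[\big(U(B_T)-U(X_T)\big)^{+}\big] \le \varepsilon .
```

In the complete Black-Scholes market the first constraint is the usual budget constraint of the martingale method; the second caps the expected utility shortfall relative to the benchmark.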
|
82 |
American options in incomplete markets: upper and lower Snell envelopes and robust partial hedging. Aguilar, Erick Trevino. 25 July 2008
This thesis studies American options in an incomplete financial market and in continuous time. It is composed of two parts. In the first part we study a stochastic optimization problem in which a robust convex loss functional is minimized over a space of stochastic integrals. This problem arises when the seller of an American option aims to control the shortfall risk by using a partial hedge. We quantify the shortfall risk through a robust loss functional motivated by an extension of classical expected utility theory due to Gilboa and Schmeidler. In a general semimartingale model we prove the existence of an optimal strategy. Under additional compactness assumptions we show how the robust problem can be reduced to a non-robust optimization problem with respect to a worst-case probability measure. In the second part, we study the notions of the upper and the lower Snell envelope associated to an American option. We construct the envelopes for stable families of equivalent probability measures, the family of local martingale measures being an important special case. We then formulate two robust optimal stopping problems. The stopping problem related to the upper Snell envelope is motivated by the problem of monitoring the risk associated with the buyer's choice of an exercise time, where the risk is specified by a coherent risk measure. The stopping problem related to the lower Snell envelope is motivated by a robust extension of classical expected utility theory due to Gilboa and Schmeidler. Using martingale methods we show how to construct optimal solutions in continuous time and for a finite horizon.
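In standard textbook notation (not necessarily the notation of the thesis), the two envelopes over a stable family Q of equivalent measures can be written as:

```latex
% Upper and lower Snell envelopes of a payoff process H;
% tau ranges over stopping times with t <= tau <= T:
U_t^{\uparrow} = \operatorname*{ess\,sup}_{Q\in\mathcal{Q}}\ \operatorname*{ess\,sup}_{t\le\tau\le T}\ \mathbb{E}_Q\!\left[H_\tau \mid \mathcal{F}_t\right] ,
\qquad
U_t^{\downarrow} = \operatorname*{ess\,inf}_{Q\in\mathcal{Q}}\ \operatorname*{ess\,sup}_{t\le\tau\le T}\ \mathbb{E}_Q\!\left[H_\tau \mid \mathcal{F}_t\right] .
```

The upper envelope corresponds to a worst-case valuation over the family, the lower envelope to a robust, Gilboa-Schmeidler-style utility evaluation.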
|
83 |
A model of compelled nonuse of information. Houston, Ronald David. 05 February 2010
The philosophical and empirical study reported here developed from the
observation that information science has had no comprehensive understanding
of nonuse of information. Without such an understanding, information workers
may use the words "nonuse of information" while referring to very different
phenomena. This lack of understanding makes the job of the information
professional difficult. To address this, the model presented here reduces hundreds
of theories of information behavior to a conceptually manageable taxonomy of six
conditions that lead to nonuse of information. The six conditions are: 1)
intrinsic somatic conditions, 2) socio-environmental barriers, 3) authoritarian
controls, 4) threshold knowledge shortfall, 5) attention shortfall, and 6) information filtering. This dissertation explains and provides examples of each
condition.
The study of a novel area that had no prior theory or model required a
novel methodology. Thus, for this study, I adopted the pragmatism formulated by
Charles Sanders Peirce, a method of evaluating concepts by their practical
consequences. This pragmatism applied in two ways to the study of nonuse of
information. First, because nonuse of information is a behavior, pragmatism
helped me to limit the psychological implications of the study to behavior, rather
than to expand the discussion to psychodynamics or cognition, for example. I
justified this limiting on the basis that behavior reflects the use or nonuse of
information, and behavior is more observable than other aspects of psychology,
such as cognition. Second, Peirce's concept of pragmatism supported another of
his contributions to philosophical inquiry, retroduction, sometimes referred to as
abduction. To study nonuse of information through retroduction, I created a five-step
"definition heuristic," based on the writings of Spradley and McCurdy. I then
created a nine-step "retroduction heuristic" based on the system of logic
identified and termed "retroductive" or "abductive" by Peirce. I used this heuristic
to identify examples of nonuse of information and applied the examples to a
second corpus of research reports that contained examples of compelled nonuse of information. The taxonomy of this study resulted from this second application
and represents a descriptive model of compelled nonuse of information.
|
84 |
Value at risk and expected shortfall for weakly dependent random variables: nonparametric estimation and limit theorems. Kabui, Ali. 19 September 2012
Quantifying and measuring risk in a partially or totally uncertain environment is probably one of the major challenges of applied research in financial mathematics. It concerns economics and finance, but also other fields such as health, via insurance for example. One of the fundamental difficulties of this risk-management process is to model the underlying assets and then to approximate the risk from observations or simulations. Since randomness and uncertainty play a fundamental role in the evolution of the assets, the recourse to stochastic processes and statistical methods becomes crucial. In practice, the parametric approach is widely used. It consists in choosing a model from a parametric family, quantifying the risk as a function of the parameters, and estimating the risk by replacing the parameters with their estimates. This approach carries a major risk: that of misspecifying the model, and hence of under- or over-estimating the risk. Starting from this observation, and with a view to minimizing model risk, we chose to address the question of risk quantification with a nonparametric approach that applies to models as general as possible. We concentrate on two risk measures widely used in practice, which are sometimes imposed by national or international regulations. The first is the Value at Risk (VaR), which quantifies the maximum level of loss at a high confidence level (95% or 99%). The second measure is the Expected Shortfall (ES), which gives information about the average loss beyond the VaR.
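As a concrete illustration of the two measures, here is a minimal nonparametric (historical) estimator in Python; the function name and the simulated data are illustrative, not taken from the thesis:

```python
import numpy as np

def var_es_historical(returns, alpha=0.99):
    """Historical VaR and ES at confidence level alpha.

    Losses are negative returns; VaR is the empirical alpha-quantile of
    the loss distribution, ES the average loss beyond that quantile.
    """
    losses = -np.asarray(returns)
    var = np.quantile(losses, alpha)
    es = losses[losses >= var].mean()
    return var, es

# Illustration on simulated fat-tailed daily returns:
rng = np.random.default_rng(0)
r = 0.01 * rng.standard_t(df=4, size=10_000)
var99, es99 = var_es_historical(r)
print(f"99% VaR: {var99:.4f}, 99% ES: {es99:.4f}")
```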
|
85 |
Value at risk and expected shortfall: traditional measures and extreme value theory enhancements with a South African market application. Dicks, Anelda. 2013
Thesis (MComm)--Stellenbosch University, 2013. Accurate estimation of Value at Risk (VaR) and Expected Shortfall (ES) is critical in the management of extreme market risks. These risks occur with small probability, but their financial impact can be large.
Traditional models to estimate VaR and ES are investigated. Following usual practice, 99% 10-day VaR and ES measures are calculated. A comprehensive theoretical background is first provided, and the models are then applied to the Africa Financials Index from 29/01/1996 to 30/04/2013. The models considered include independent, identically distributed (i.i.d.) models and Generalized Autoregressive Conditional Heteroscedasticity (GARCH) stochastic volatility models. Extreme Value Theory (EVT) models that focus specifically on extreme market returns are also investigated; for this, the Peaks Over Threshold (POT) approach to EVT is followed. For the calculation of VaR, various methods of scaling from one day to ten days are considered and their performance evaluated.
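A minimal sketch of the POT step in Python (assuming scipy is available; the GPD-based VaR/ES formulas follow the standard McNeil-style presentation and are not necessarily the exact implementation of the thesis):

```python
import numpy as np
from scipy.stats import genpareto

def pot_var_es(returns, threshold_q=0.95, alpha=0.99):
    """Peaks-Over-Threshold VaR/ES estimates at level alpha.

    Fits a Generalized Pareto Distribution (GPD) to losses exceeding the
    empirical threshold_q-quantile. The formulas assume a shape parameter
    0 < xi < 1; the xi = 0 (exponential) limit needs a separate branch.
    """
    losses = -np.asarray(returns)
    u = np.quantile(losses, threshold_q)
    excesses = losses[losses > u] - u
    xi, _, beta = genpareto.fit(excesses, floc=0)  # location fixed at 0
    n, n_u = len(losses), len(excesses)
    var = u + (beta / xi) * ((n / n_u * (1 - alpha)) ** (-xi) - 1)
    es = var / (1 - xi) + (beta - xi * u) / (1 - xi)
    return var, es
```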
The GARCH models fail to converge during periods of extreme returns. During these periods, EVT forecast results may be used. As a novel approach, this study considers the augmentation of the GARCH models with EVT forecasts. The two-step procedure of pre-filtering with a GARCH model and then applying EVT, as suggested by McNeil (1999), is also investigated.
This study identifies some of the practical issues in model fitting. It is shown that no single forecasting model is universally optimal; the choice depends on the nature of the data. For this data series, the best approach was to augment the GARCH stochastic volatility models with EVT forecasts during the periods where the former do not converge. Model performance is judged by comparing the actual number of VaR and ES violations to the expected number, taken as the number of return observations over the entire sample period multiplied by 0.01 for the 99% VaR and ES calculations.
|
86 |
Estimation of risk over a monthly horizon based on a two-year time series. Myšičková, Ivana. January 2014
The thesis describes commonly used measures of risk, such as volatility, Value at Risk (VaR) and Expected Shortfall (ES), and develops models for measuring market risk. It considers risk over daily and monthly horizons and shows the shortcomings of the square-root-of-time approach for converting VaR and ES between horizons. Parametric models, geometric Brownian motion (GBM) and the GARCH process, and non-parametric models, historical simulation (HS) and some of its possible improvements, are presented. The application of these models is demonstrated using real data. The accuracy of the VaR models is assessed through backtesting and the results are discussed. Part of this thesis is also a simulation study, which reveals the precision of the VaR and ES estimates.
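The square-root-of-time shortcoming mentioned above stems from the fact that the rule is exact only for i.i.d. zero-mean normal returns:

```latex
% Square-root-of-time rule (valid for i.i.d. zero-mean normal returns):
\mathrm{VaR}_\alpha(h) = \sqrt{h}\;\mathrm{VaR}_\alpha(1),
\qquad
\mathrm{ES}_\alpha(h) = \sqrt{h}\;\mathrm{ES}_\alpha(1).
```

Under volatility clustering (e.g. a GARCH process) or fat tails, the h-day quantile generally deviates from this scaling, which is why converting a daily VaR to a monthly one this way can materially misstate the risk.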
|
87 |
The use of coherent risk measures in operational risk modeling. Lebovič, Michal. January 2012
The debate on quantitative operational risk modeling only started at the beginning of the last decade, and best practices are still far from established. Estimation of capital requirements for operational risk under the Advanced Measurement Approaches of Basel II depends critically on the choice of risk measure, which quantifies the risk exposure based on the underlying simulated distribution of losses. Despite its well-known caveats, Value-at-Risk remains the predominant risk measure used in the context of operational risk management. We describe several serious drawbacks of Value-at-Risk and explain why it can lead to misleading conclusions. As a remedy we suggest the use of coherent risk measures, namely the statistic known as Expected Shortfall, as a suitable alternative or complement for quantification of operational risk exposure. We demonstrate that the application of Expected Shortfall in operational loss modeling is feasible and produces reasonable and consistent results. We also consider a variety of statistical techniques for modeling the underlying loss distribution and find the extreme value theory framework the most suitable for this purpose. Using stress tests we further compare the robustness and consistency of selected models and their implied risk capital estimates...
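For reference, Expected Shortfall can be written as an average of tail quantiles (a standard definition in generic notation):

```latex
% ES as the average of VaR over the tail; L denotes the loss variable:
\mathrm{ES}_\alpha(L) = \frac{1}{1-\alpha}\int_\alpha^{1}\mathrm{VaR}_u(L)\,du .
```

Averaging over the entire tail is what makes ES subadditive, ES(L_1 + L_2) <= ES(L_1) + ES(L_2), the coherence axiom that VaR can violate for the heavy-tailed, skewed loss distributions typical of operational risk.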
|
88 |
Controlling Downside Risk in Asset Allocation. Chien, Chia-Chih (簡佳至). Unknown date
The distributions of many asset returns tend to be fat-tailed. This paper therefore adds a minimum-return (shortfall) constraint to classical mean-variance analysis. When the distribution is known, we find the optimal asset allocation under the Student-t and the normal distribution. When the distribution is unknown, we use the classical bootstrap, the moving block bootstrap, and the stationary bootstrap to simulate the distribution of asset returns and obtain the optimal asset allocation.
We also examine the use of asset allocation in risk management. When the distribution is known and the parameters are estimated correctly, the threshold in the shortfall constraint is the value-at-risk of the asset allocation; with incorrect parameter estimates, the resulting allocation is strongly affected and the risk management is unsound. When the distribution is unknown and simulation is used to generate it, the threshold can likewise be regarded as the value-at-risk.
The empirical study covers both domestic and global asset allocation. The results do not single out any distribution or simulation method as performing best; the appropriate choice depends on the properties of the data. The contribution of this paper is to suggest fat-tailed distributions and simulation methods that match the fat-tailed behavior of asset returns, so that asset allocation under a minimum-return constraint becomes more accurate.
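As an illustration of one of the resampling schemes mentioned above, here is a minimal moving block bootstrap in Python (the block length and names are illustrative assumptions, not the thesis' implementation):

```python
import numpy as np

def moving_block_bootstrap(returns, block_len=20, n_boot=1000, rng=None):
    """Resample overlapping blocks of consecutive returns and concatenate
    them into series of the original length, preserving the short-range
    dependence that an i.i.d. (classical) bootstrap would destroy."""
    rng = rng or np.random.default_rng()
    r = np.asarray(returns)
    n = len(r)
    blocks = np.lib.stride_tricks.sliding_window_view(r, block_len)
    n_blocks = int(np.ceil(n / block_len))
    samples = np.empty((n_boot, n_blocks * block_len))
    for b in range(n_boot):
        idx = rng.integers(0, len(blocks), size=n_blocks)
        samples[b] = blocks[idx].ravel()
    return samples[:, :n]
```

The simulated return distributions can then be used both to optimize the allocation under the shortfall constraint and to read off the threshold as the portfolio's value-at-risk.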
|
89 |
Méthodes analytiques pour le Risque des Portefeuilles Financiers (Analytical methods for the risk of financial portfolios). SADEFO KAMDEM, Jules. 15 December 2004
In this thesis, we propose analytical and numerical methods for estimating the VaR and the Expected Shortfall of linear and quadratic portfolios when the vector of risk factors follows a convex mixture of elliptical distributions. We also introduce, for the first time, the notion of a "quadratic portfolio" of basic assets (i.e., stocks).
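For the linear case, the following Python sketch shows the analytic VaR/ES formulas under the simplest elliptical law, the multivariate normal; the thesis treats general convex mixtures of elliptical distributions, so this special case is for illustration only:

```python
import numpy as np
from scipy.stats import norm

def linear_portfolio_var_es(weights, mu, cov, alpha=0.99):
    """Analytic VaR/ES of a linear portfolio with jointly normal risk
    factors (mean vector mu, covariance matrix cov)."""
    w, mu, cov = map(np.asarray, (weights, mu, cov))
    mean_p = w @ mu                 # portfolio mean return
    sigma_p = np.sqrt(w @ cov @ w)  # portfolio volatility
    z = norm.ppf(alpha)
    var = -mean_p + z * sigma_p
    es = -mean_p + sigma_p * norm.pdf(z) / (1 - alpha)
    return var, es
```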
|
90 |
How useful are intraday data in Risk Management? An application of high frequency stock returns of three Nordic Banks to the VaR and ES calculation. Somnicki, Emil; Ostrowski, Krzysztof. January 2010
The work is focused on the Value at Risk and Expected Shortfall calculation. We assume the returns to be based on two pillars: white noise and stochastic volatility. We assume that the white noise follows the NIG distribution and that the volatility is modeled using nGARCH, NIG-GARCH, tGARCH, and a non-parametric method. We apply the models to the stocks of three banks in the Nordic market. We consider daily and intraday returns at frequencies of 5, 10, 20 and 30 minutes, and calculate the one-step-ahead VaR and ES for both the daily and the intraday data. We use the Kupiec test and the Markov test to assess the correctness of the models. We also provide a new concept for improving the daily VaR calculation by using high frequency returns. The results show that intraday data can be used for the one-step-ahead VaR and ES calculation. A comparison of the VaR for the end of the following trading day calculated from daily returns with the one computed from high frequency returns shows that using intraday data can improve the VaR outcomes.
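A minimal Python sketch of the Kupiec proportion-of-failures (POF) test used for the backtesting (a generic implementation; the names are illustrative, not taken from the thesis):

```python
import numpy as np
from scipy.stats import chi2

def kupiec_pof_test(n_violations, n_obs, alpha=0.99):
    """Likelihood-ratio test of H0: the VaR violation rate equals the
    nominal rate 1 - alpha. Returns the LR statistic and its p-value
    (asymptotically chi-square with 1 degree of freedom)."""
    p, x = 1 - alpha, n_violations
    if x == 0:
        lr = -2 * n_obs * np.log(1 - p)
    elif x == n_obs:
        lr = -2 * n_obs * np.log(p)
    else:
        phat = x / n_obs
        lr = -2 * ((n_obs - x) * np.log((1 - p) / (1 - phat))
                   + x * np.log(p / phat))
    return lr, chi2.sf(lr, df=1)
```

For example, 250 one-day 99% VaR forecasts imply about 2.5 expected violations; kupiec_pof_test(8, 250) gives a p-value below 0.01, flagging the model.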
|