61 |
Metody stochastického programováni pro investiční rozhodování / Stochastic Programming Methods for Investment Decisions. Kubelka, Lukáš January 2014 (has links)
This thesis deals with methods of stochastic programming and their application in financial investment. The theoretical part is devoted to the basic terms of mathematical optimization, stochastic programming and decision making under uncertainty. Further, the basic principles of modern portfolio theory are introduced; a substantial part is devoted to risk measurement techniques in the context of investment, mainly the Value at Risk and Expected Shortfall methods. The practical part is aimed at the creation of optimization models with an emphasis on minimizing investment risk. The models work with real data and are solved in the optimization software GAMS.
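The thesis builds its optimization models in GAMS; as a language-neutral illustration of the kind of risk-minimizing stochastic program involved, the sketch below minimizes portfolio Expected Shortfall (CVaR) over simulated return scenarios using the Rockafellar-Uryasev linear-programming reformulation. All data, parameter values and constraints here are hypothetical stand-ins, not the models of the thesis.

```python
# Illustrative sketch (not the thesis's GAMS model): minimizing portfolio
# Expected Shortfall (CVaR) over return scenarios via the Rockafellar-Uryasev
# linear program, solved with scipy. Scenario data are simulated.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
S, n, alpha = 1000, 4, 0.95           # scenarios, assets, CVaR level
R = rng.normal(0.0005, 0.01, (S, n))  # hypothetical scenario returns

# Decision vector x = [w_1..w_n, zeta, u_1..u_S]
c = np.concatenate([np.zeros(n), [1.0], np.ones(S) / ((1 - alpha) * S)])

# u_s >= -R_s.w - zeta   <=>   -R_s.w - zeta - u_s <= 0
A_ub = np.hstack([-R, -np.ones((S, 1)), -np.eye(S)])
b_ub = np.zeros(S)

# Fully invested, long-only portfolio
A_eq = np.concatenate([np.ones(n), [0.0], np.zeros(S)]).reshape(1, -1)
b_eq = [1.0]
bounds = [(0, None)] * n + [(None, None)] + [(0, None)] * S

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
w, cvar = res.x[:n], res.fun
print("weights:", np.round(w, 3), "minimal 95% CVaR:", round(cvar, 5))
```

The same scenario-based formulation carries over directly to an algebraic modelling language such as GAMS.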
|
62 |
Regularly Varying Time Series with Long Memory: Probabilistic Properties and Estimation. Bilayi-Biakana, Clémonell Lord Baronat 17 January 2020 (has links)
We consider tail empirical processes for long memory stochastic volatility models with heavy tails and leverage. We show a dichotomous behaviour for the tail empirical process with fixed levels, according to the interplay between the long memory parameter and the tail index; leverage does not play a role. On the other hand, the tail empirical process with random levels is not affected by either long memory or leverage. The tail empirical process with random levels is used to construct a family of estimators of the tail index, including the famous Hill estimator and harmonic moment estimators. The limiting behaviour of these estimators is not affected by either long memory or leverage. Furthermore, we consider estimators of risk measures such as Value-at-Risk and Expected Shortfall. In these cases, the limiting behaviour is affected by long memory, but it is not affected by leverage. The theoretical results are illustrated by simulation studies.
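Among the tail-index estimators mentioned, the Hill estimator has the simplest form; the sketch below shows it on simulated i.i.d. Pareto data, which only illustrates the computation and not the long memory stochastic volatility setting studied in the thesis.

```python
# Illustrative sketch: the Hill estimator of the tail index, one member of the
# family of estimators discussed in the abstract. Data are simulated i.i.d.
# Pareto observations, not the long-memory setting of the thesis.
import numpy as np

def hill_estimator(x, k):
    """Hill estimate of the tail index alpha based on the k largest observations."""
    order = np.sort(x)[::-1]                 # descending order statistics
    logs = np.log(order[:k]) - np.log(order[k])
    return 1.0 / logs.mean()                 # alpha_hat = 1 / mean log excess

rng = np.random.default_rng(1)
alpha_true = 3.0
x = rng.pareto(alpha_true, size=5000) + 1.0  # Pareto tail with index alpha_true
print(hill_estimator(x, k=200))              # should be close to 3
```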
|
63 |
Optimal portfolios with bounded shortfall risks. Gabih, Abdelali, Wunderlich, Ralf 26 August 2004 (has links)
This paper considers dynamic optimal portfolio strategies of utility maximizing investors in the presence of risk constraints. In particular, we investigate the optimization problem with an additional constraint modeling bounded shortfall risk, measured by Value at Risk or Expected Loss. Using the Black-Scholes model of a complete financial market and applying martingale methods, we give analytic expressions for the optimal terminal wealth and the optimal portfolio strategies, and present some numerical results.
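A schematic statement of this type of shortfall-constrained utility maximization, in notation of our own rather than the paper's, may help fix ideas:

```latex
% Schematic formulation (notation is illustrative, not taken from the paper):
% maximize expected utility of terminal wealth subject to a bounded shortfall
% constraint relative to a benchmark level q.
\begin{align*}
  \max_{X_T}\; & \mathbb{E}\left[U(X_T)\right] \\
  \text{s.t.}\; & \mathbb{E}^{\mathbb{P}^*}\!\left[X_T / B_T\right] \le x_0
      && \text{(budget constraint under the pricing measure)} \\
  & \mathbb{P}\left(X_T < q\right) \le \varepsilon
      && \text{(Value at Risk type constraint), or} \\
  & \mathbb{E}\left[(q - X_T)^+\right] \le \varepsilon
      && \text{(Expected Loss type constraint).}
\end{align*}
```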
|
64 |
Four Essays on Risk Assessment with Financial Econometrics Models. Castillo, Brenda 25 July 2022 (has links)
This thesis includes four essays on risk assessment with financial econometrics models. The first chapter provides Monte Carlo evidence on the efficiency gains obtained in GARCH-based estimations of VaR and ES by incorporating dependence information through copulas and subsequently using full maximum likelihood (FML) estimates. First, individual returns series are considered; in this case, the efficiency gain stems from exploiting the relationship with another returns series using a copula model. Second, portfolio returns series obtained as a linear combination of returns series related through a copula model are considered; in this case, the efficiency gain stems from using FML estimates instead of two-stage maximum likelihood estimates. Our results show that, in these situations, using copula models and FML leads to a substantial reduction in the mean squared error of the VaR and ES estimates (around 50% when there is a medium degree of dependence between returns) and a notable improvement in the performance of backtesting procedures. Chapter 2 then analyzes the impact of the COVID-19 pandemic on the conditional variance of stock returns. In this work, we look at this effect from a global perspective, employing series of major stock market and sector indices. We use Hansen's skewed-t distribution with an EGARCH model extended to control for sudden changes in volatility, and we examine the COVID-19 effect on the VaR. Our results show a significant sudden upward shift in the variance of the return distribution after the announcement of the pandemic, which must be modelled properly to obtain reliable measures for financial risk management. In chapter 3, we assess VaR and ES estimates assuming different models for standardised returns, such as the Cornish-Fisher and Gram-Charlier polynomial expansions, and well-known parametric densities such as the normal, the skewed Student-t family of Zhu and Galbraith (2010), and the Johnson distribution. This chapter aims to check whether models based on polynomial expansions outperform the parametric ones. We carry out the model performance comparison in two stages: first, a backtesting analysis for VaR and ES, and second, the loss function approach. The backtesting results in our empirical exercise suggest that all distributions but the normal perform quite well in VaR and ES estimation. Regarding the loss function analysis, we conclude that the Cornish-Fisher expansion usually outperforms the others in VaR estimation, while the Johnson distribution provides the best ES estimates in most cases, although the differences among all distributions (excluding the normal) are not great. Finally, chapter 4 assesses whether accounting for asymmetry and tail dependence in return distributions may help to identify more profitable investment strategies in asset portfolios. Three copula models are used to parameterize the multivariate distribution of returns: Gaussian, C-Vine and R-Vine copulas. Using data on equities and ETFs from the US market, we find evidence that, for portfolios of 48 constituents or fewer, the R-Vine copula is able to produce more profitable portfolios than both the C-Vine and Gaussian copulas. However, for portfolios of 100 assets, the performance of the R- and C-Vine copulas is quite similar, with both outperforming the Gaussian copula.
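As an illustration of the kind of backtesting procedure the thesis refers to, the sketch below implements Kupiec's unconditional-coverage (proportion-of-failures) test for VaR violations; the inputs are hypothetical and the thesis's own backtests may differ.

```python
# Illustrative sketch: Kupiec's unconditional-coverage test, a standard
# ingredient of VaR backtesting. Inputs are hypothetical counts.
import numpy as np
from scipy.stats import chi2

def kupiec_pof(violations, n_obs, var_level=0.99):
    """LR statistic and p-value for H0: violation rate equals 1 - var_level."""
    p = 1.0 - var_level
    x = violations
    phat = x / n_obs
    loglik_h0 = (n_obs - x) * np.log(1 - p) + x * np.log(p)
    # Guard against log(0) when there are no (or only) violations
    loglik_h1 = ((n_obs - x) * np.log(1 - phat) + x * np.log(phat)
                 if 0 < x < n_obs else 0.0)
    lr = -2.0 * (loglik_h0 - loglik_h1)
    return lr, 1.0 - chi2.cdf(lr, df=1)

# 18 violations in 1000 days at the 99% level (10 expected)
print(kupiec_pof(violations=18, n_obs=1000, var_level=0.99))
```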
|
65 |
Non-parametric backtesting of expected shortfall / Icke-parametrisk backtesting av expected shortfall. Edberg, Patrik, Käck, Benjamin January 2017 (has links)
Since the Basel Committee on Banking Supervision first suggested a transition to Expected Shortfall as the primary risk measure for financial institutions, the question of how to backtest it has been widely discussed. Still, there is a lack of studies that compare the different proposed backtesting methods. This thesis uses simulations and empirical data to evaluate the performance of non-parametric backtests under different circumstances. An important takeaway from the thesis is that the different backtests all involve some kind of trade-off between measuring the number of Value at Risk exceedances and their magnitudes. The main result of this thesis is a ranked list of the non-parametric backtests. This list can be used to choose a backtesting method by cross-referencing what is possible to implement given the estimation method that the financial institution uses.
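One concrete example of the number-versus-magnitude trade-off discussed above is a direct ES test in the spirit of Acerbi and Szekely (2014); the sketch below is an illustrative implementation with simulated inputs and is not necessarily one of the backtests ranked in the thesis.

```python
# Illustrative sketch: a simple non-parametric ES backtest statistic that
# jointly uses the number and the magnitude of VaR exceedances. Inputs are
# simulated; the thesis compares several such tests, not necessarily this one.
import numpy as np

def es_backtest_statistic(returns, var_forecasts, es_forecasts, p=0.025):
    """Z ~ 0 under a correct model; markedly negative Z suggests ES is understated.

    returns       : realized returns r_t
    var_forecasts : positive VaR_t forecasts at level p (loss quantiles)
    es_forecasts  : positive ES_t forecasts at level p
    """
    r = np.asarray(returns)
    exceed = r < -np.asarray(var_forecasts)           # VaR exceedance indicator
    contrib = np.where(exceed, r / np.asarray(es_forecasts), 0.0)
    return contrib.sum() / (len(r) * p) + 1.0

rng = np.random.default_rng(2)
T = 250
r = rng.standard_t(df=5, size=T) * 0.01               # hypothetical P&L series
var_f = np.full(T, 0.022)                             # hypothetical static forecasts
es_f = np.full(T, 0.030)
print(es_backtest_statistic(r, var_f, es_f))
```

In practice, the significance of such a statistic is assessed by simulating its distribution under the null model.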
|
66 |
Value at risk et expected shortfall pour des données faiblement dépendantes : estimations non-paramétriques et théorèmes de convergences. Kabui, Ali 19 September 2012 (has links) (PDF)
Quantifying and measuring risk in a partially or totally uncertain environment is probably one of the major challenges of applied research in financial mathematics. It concerns economics and finance, but also other fields such as health care through insurance, for example. One of the fundamental difficulties of this risk management process is to model the underlying assets and then to approximate the risk from observations or simulations. Since randomness and uncertainty play a fundamental role in the evolution of assets in this field, the use of stochastic processes and statistical methods becomes crucial. In practice, the parametric approach is widely used. It consists of choosing a model within a parametric family, quantifying the risk as a function of the parameters, and estimating the risk by replacing the parameters with their estimates. This approach carries a major danger: that of misspecifying the model, and thus of underestimating or overestimating the risk. Starting from this observation, and with a view to minimizing model risk, we have chosen to address the question of risk quantification with a non-parametric approach that applies to models as general as possible. We focus on two risk measures that are widely used in practice and are sometimes imposed by national or international regulations: Value at Risk (VaR), which quantifies the maximum level of loss at a high confidence level (95% or 99%), and Expected Shortfall (ES), which gives the average loss beyond the VaR.
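For reference, the standard definitions consistent with the description above, together with their simplest non-parametric estimators, can be written as follows (notation ours):

```latex
% Standard definitions (notation ours). For a loss variable L and confidence
% level \alpha (e.g. 95% or 99%):
\begin{align*}
  \mathrm{VaR}_{\alpha}(L) &= \inf\{\, x \in \mathbb{R} : \mathbb{P}(L > x) \le 1-\alpha \,\}, \\
  \mathrm{ES}_{\alpha}(L)  &= \frac{1}{1-\alpha}\int_{\alpha}^{1} \mathrm{VaR}_{u}(L)\, \mathrm{d}u
      \;=\; \mathbb{E}\left[\, L \mid L \ge \mathrm{VaR}_{\alpha}(L) \,\right]
      \quad \text{(continuous case).}
\end{align*}
% Their simplest non-parametric estimators from losses L_1,...,L_n use the
% order statistics L_{(1)} <= ... <= L_{(n)}:
\begin{align*}
  \widehat{\mathrm{VaR}}_{\alpha} = L_{(\lceil n\alpha \rceil)}, \qquad
  \widehat{\mathrm{ES}}_{\alpha} = \frac{1}{n - \lceil n\alpha \rceil + 1}
      \sum_{i=\lceil n\alpha \rceil}^{n} L_{(i)}.
\end{align*}
```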
|
67 |
Credit Default Swaps as Hedging Instruments Against Banks' Stock Price Fluctuations Before and During Financial Crisis / Kredito rizikos apsikeitimo sandoriai – finansinė priemonė apsidrausti nuo bankų akcijų kainų svyravimų per ir prieš kriziniu laikotarpiu. Volosenkina, Viktorija 23 June 2010 (has links)
In this paper, the dependence between credit default swap (CDS) values and stock price movements of the largest European banking groups is examined, and the effectiveness of using CDS contracts as a tool to hedge exposure to the price movements of the underlying stock during the pre-crisis and crisis periods is assessed. Effectiveness is evaluated by comparing estimated Value-at-Risk (VaR) and Expected Shortfall (ES) risk measures of portfolios consisting of stocks and CDS with those of portfolios consisting of stocks only. CDS are valued using a mark-to-market approach. Marginal distributions of CDS value changes and stock returns are estimated using kernel density estimates from historical time series of daily stock returns and CDS value changes. Dependence between the marginal distributions is estimated using Gaussian, Gumbel and Student's t copulas. Random portfolio values are simulated by Monte Carlo simulation from the estimated copula parameters and marginal distributions for daily, quarterly and yearly time horizons. VaR and ES at the 90%, 95% and 99% confidence levels are estimated from the simulated portfolio return distribution. The results show that there is a significant negative dependence between CDS values and stock prices during the financial crisis, while the dependence is weak in the pre-crisis period. The main finding of the paper is that CDS added to a portfolio of stocks significantly reduce the VaR and ES of the portfolio during the period of financial crisis, while they... [to full text]
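The simulation pipeline described above (a copula for the dependence, marginal distributions for each instrument, Monte Carlo for the portfolio) can be sketched as follows; the sketch uses a Gaussian copula with empirical-quantile marginals and simulated data, so it is only a simplified stand-in for the kernel-density and Student's t/Gumbel variants used in the paper.

```python
# Illustrative sketch: Monte Carlo VaR and ES for a two-instrument portfolio
# whose dependence is modelled with a Gaussian copula and whose marginals are
# taken from historical data via empirical quantiles. All inputs are simulated.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

# Hypothetical historical daily changes: stock returns and CDS value changes
hist_stock = rng.normal(0.0, 0.02, 1000)
hist_cds = rng.normal(0.0, 0.01, 1000)

def gaussian_copula_portfolio_var_es(rho, weights, n_sim=100_000, level=0.99):
    """Simulate portfolio P&L under a Gaussian copula and return (VaR, ES) as losses."""
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal(np.zeros(2), cov, size=n_sim)
    u = norm.cdf(z)                                           # copula sample on [0,1]^2
    sims = np.column_stack([np.quantile(hist_stock, u[:, 0]), # empirical inverse CDFs
                            np.quantile(hist_cds, u[:, 1])])
    pnl = sims @ np.asarray(weights)
    var = -np.quantile(pnl, 1 - level)
    es = -pnl[pnl <= -var].mean()
    return var, es

print(gaussian_copula_portfolio_var_es(rho=-0.6, weights=[0.7, 0.3]))
```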
|
68 |
American options in incomplete markets. Aguilar, Erick Trevino 25 July 2008 (has links)
This thesis studies American options in an incomplete financial market and in continuous time. It is composed of two parts. In the first part we study a stochastic optimization problem in which a robust convex loss functional is minimized in a space of stochastic integrals. This problem arises when the seller of an American option aims to control the shortfall risk by using a partial hedge. We quantify the shortfall risk through a robust loss functional motivated by an extension of classical expected utility theory due to Gilboa and Schmeidler. In a general semimartingale model we prove the existence of an optimal strategy. Under additional compactness assumptions we show how the robust problem can be reduced to a non-robust optimization problem with respect to a worst-case probability measure. In the second part, we study the notions of the upper and the lower Snell envelope associated to an American option. We construct the envelopes for stable families of equivalent probability measures, the family of local martingale measures being an important special case. We then formulate two robust optimal stopping problems. The stopping problem related to the upper Snell envelope is motivated by the problem of monitoring the risk associated to the buyer's choice of an exercise time, where the risk is specified by a coherent risk measure. The stopping problem related to the lower Snell envelope is motivated by a robust extension of classical expected utility theory due to Gilboa and Schmeidler. Using martingale methods we show how to construct optimal solutions in continuous time and for a finite horizon.
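In the notation commonly used for such problems (which may differ from the thesis's own), the two envelopes can be written schematically as follows:

```latex
% Schematic definitions of the upper and lower Snell envelopes of an American
% claim H over a stable family Q of equivalent probability measures
% (notation is illustrative and may differ from the thesis).
\begin{align*}
  U^{\uparrow}_t   &= \operatorname*{ess\,sup}_{\tau \ge t}\;
                      \operatorname*{ess\,sup}_{Q \in \mathcal{Q}}\;
                      \mathbb{E}_{Q}\!\left[ H_{\tau} \mid \mathcal{F}_t \right], \\
  U^{\downarrow}_t &= \operatorname*{ess\,sup}_{\tau \ge t}\;
                      \operatorname*{ess\,inf}_{Q \in \mathcal{Q}}\;
                      \mathbb{E}_{Q}\!\left[ H_{\tau} \mid \mathcal{F}_t \right],
\end{align*}
% where \tau ranges over stopping times with values in [t, T]. The upper
% envelope corresponds to the coherent-risk (worst-case) monitoring problem,
% the lower one to the robust (Gilboa-Schmeidler) utility problem.
```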
|
69 |
Four essays in financial econometrics / Quatre Essais sur l'Econométrie Financière. Banulescu, Denisa-Georgiana 05 November 2014 (has links)
This thesis focuses on financial risk measures and volatility modeling. The broad goal of this dissertation is: (i) to propose new techniques to measure both systemic risk and high-frequency risk, and (ii) to apply and improve advanced econometric tools to model and forecast time-varying volatility. This work is organized in four chapters (research articles). The first part addresses issues related to econometric modeling and forecasting procedures for both systemic risk and high-frequency risk measures. More precisely, Chapter 2 proposes a new systemic risk measure used to identify systemically important financial institutions (SIFIs). Based on a component approach, this original measure allows the risk of the aggregate financial system to be decomposed while accounting for firm characteristics. Chapter 3 studies the importance and certifies the validity of intraday High Frequency Risk (HFR) measures for market risk in the special context of irregularly spaced high-frequency data. The second part of this thesis tackles the need to improve the estimation and prediction of volatility by directly including high-frequency data or realized measures of volatility. In Chapter 4 we therefore examine whether high-frequency data improve the accuracy of volatility forecasts and, if so, whether there exists an optimal sampling frequency in terms of prediction. Chapter 5 studies financial volatility during the global financial crisis. To this aim, we use the largest volatility shocks, as provided by the robust version of the Realized GARCH model, to identify and analyze the events having induced these shocks during the crisis.
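A component approach of the kind described for Chapter 2 can be illustrated by an Euler-type decomposition of the aggregate Expected Shortfall; the formulas below are a generic sketch in our own notation and may differ from the chapter's exact definition.

```latex
% Illustrative Euler-type component decomposition of aggregate Expected
% Shortfall (notation ours; the chapter's exact measure may differ).
\begin{align*}
  \mathrm{ES}_{m,t}(\alpha) &= -\,\mathbb{E}_{t-1}\!\left[ r_{m,t} \mid r_{m,t} < -\mathrm{VaR}_{m,t}(\alpha) \right],
      \qquad r_{m,t} = \sum_{i=1}^{n} w_{i,t}\, r_{i,t}, \\
  \mathrm{CES}_{i,t}(\alpha) &= w_{i,t}\,
      \frac{\partial\, \mathrm{ES}_{m,t}(\alpha)}{\partial w_{i,t}}
    = -\,w_{i,t}\, \mathbb{E}_{t-1}\!\left[ r_{i,t} \mid r_{m,t} < -\mathrm{VaR}_{m,t}(\alpha) \right],
      \qquad \sum_{i=1}^{n} \mathrm{CES}_{i,t}(\alpha) = \mathrm{ES}_{m,t}(\alpha),
\end{align*}
% so each institution's share CES_i / ES_m can be used to rank SIFIs.
```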
|
70 |
Cornish-Fisher Expansion and Value-at-Risk method in application to risk management of large portfolios. Sjöstrand, Maria, Aktaş, Özlem January 2011 (has links)
One of the major problems faced by banks is how to manage the risk exposure in large portfolios. According to the Basel II regulation, banks have to measure the risk using Value-at-Risk with a confidence level of 99%. However, this regulation does not specify how to calculate Value-at-Risk. The easiest way to calculate Value-at-Risk is to assume that portfolio returns are normally distributed. Although this is the most common way to calculate Value-at-Risk, other methods also exist. The previous crisis showed that the standard methods are unfortunately not always enough to prevent bankruptcy. This thesis is devoted to comparing the classical methods of estimating risk with other methods, such as the Cornish-Fisher expansion (CFVaR) and the assumption of a generalized hyperbolic distribution. To carry out this study, we estimate the risk in a large portfolio consisting of ten stocks. These stocks are chosen from the NASDAQ 100 list in order to have highly liquid stocks (blue chips). The stocks are chosen from different sectors to make the portfolio well diversified. To investigate the impact of dependence between the stocks in the portfolio, we remove the two most correlated stocks and consider the resulting eight-stock portfolio as well. In both portfolios we put equal weight on the included stocks. The results show that for a well-diversified large portfolio none of the risk measures is violated. However, for a portfolio consisting of only one highly volatile stock, we show that the classical methods are violated but not the modern methods mentioned above.
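The Cornish-Fisher idea compared in the thesis adjusts the normal quantile for skewness and excess kurtosis; the sketch below applies the standard fourth-order correction to simulated heavy-tailed returns rather than to the NASDAQ portfolio studied in the thesis.

```python
# Illustrative sketch: Cornish-Fisher adjusted Value-at-Risk (the CFVaR idea).
# Data are simulated heavy-tailed returns, not the thesis's stock portfolio.
import numpy as np
from scipy.stats import norm, skew, kurtosis

def cornish_fisher_var(returns, level=0.99):
    """VaR (as a positive loss) at the given confidence level via Cornish-Fisher."""
    r = np.asarray(returns)
    mu, sigma = r.mean(), r.std(ddof=1)
    s = skew(r)
    k = kurtosis(r)                      # excess kurtosis by default
    z = norm.ppf(1 - level)              # e.g. -2.326 for the 99% level
    z_cf = (z
            + (z**2 - 1) * s / 6
            + (z**3 - 3 * z) * k / 24
            - (2 * z**3 - 5 * z) * s**2 / 36)
    return -(mu + sigma * z_cf)

rng = np.random.default_rng(4)
r = rng.standard_t(df=4, size=2000) * 0.015          # heavy-tailed hypothetical returns
print("Normal VaR:", -(r.mean() + r.std(ddof=1) * norm.ppf(0.01)))
print("Cornish-Fisher VaR:", cornish_fisher_var(r, level=0.99))
```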
|