121

Distributional Dynamics of Fama-French Factors in European Markets / Tidsvarierande fördelningar för Fama-French-faktorer på europeiska marknader

Löfgren, Wilmer January 2020 (has links)
The three-factor model of Fama and French has proved to be a seminal contribution to asset pricing theory, and was recently extended to include two more factors, yielding the Fama-French five-factor model. Other proposed augmentations of the three-factor model include the introduction of a momentum factor by Carhart. The extensive use of such factors in asset pricing theory and investing motivates the study of the distributional properties of the returns of these factors. However, previous studies have focused on subsets of these six factors on the U.S. market. In this thesis, the distributional properties of daily log-returns of the five Fama-French factors and the Carhart momentum factor are examined in European data from 2009 to 2019. The univariate distributional dynamics of the factor log-returns are modelled as ARMA-NGARCH processes with skewed t-distributed driving noise sequences. Gaussian and t copulas are then used to model the joint distributions of these factor log-returns. The models developed are applied to estimate the one-day-ahead Value-at-Risk (VaR) in testing data. The VaR estimates are backtested to check for correct unconditional coverage and exponentially distributed durations between exceedances. The results suggest that the ARMA-NGARCH processes are a valid approximation of the factor log-returns and lead to good estimates of the VaR. The results of the multivariate analysis suggest that constant Gaussian and t copulas might be insufficient to model the dependence structure of the factors, and that more flexible copula models with dynamic correlations between factor log-returns may be needed.
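
A concrete illustration of the backtesting step mentioned above, checking for correct unconditional coverage, is Kupiec's proportion-of-failures likelihood-ratio test. The sketch below runs the test on a synthetic sequence of VaR exceedance indicators; the sample length, nominal level, and hit series are placeholders, not results from the thesis.

```python
import numpy as np
from scipy.stats import chi2

def kupiec_pof_test(exceedances, p):
    """Kupiec proportion-of-failures LR test for unconditional VaR coverage.

    exceedances : boolean array, True where the loss exceeded the VaR estimate
    p           : nominal exceedance probability (e.g. 0.01 for 99% VaR)
    Returns the LR statistic and its p-value under a chi-squared(1) null.
    """
    T = len(exceedances)
    x = int(np.sum(exceedances))
    phat = x / T
    # Log-likelihoods under the nominal and observed exceedance rates.
    ll_null = (T - x) * np.log(1 - p) + x * np.log(p)
    ll_alt = (T - x) * np.log(1 - phat) + x * np.log(phat) if 0 < x < T else 0.0
    lr = -2.0 * (ll_null - ll_alt)
    return lr, chi2.sf(lr, df=1)

# Toy example: 500 backtest days, 99% VaR, 8 observed exceedances.
rng = np.random.default_rng(0)
hits = np.zeros(500, dtype=bool)
hits[rng.choice(500, size=8, replace=False)] = True
lr, pval = kupiec_pof_test(hits, p=0.01)
print(f"LR = {lr:.3f}, p-value = {pval:.3f}")  # a small p-value signals incorrect coverage
```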
122

Stochastic Modelling of Cash Flows in Private Equity / Stokastisk modellering av kassaflöden i private equity

Ungsgård, Oscar January 2020 (has links)
Private equity investments are investments in financial assets that are not publicly traded. As such, these assets are very difficult to value and also make it difficult to quantify risk. In a typical private equity investment, the investor commits a prespecified amount of capital to a fund; this capital is called upon as needed by the fund, and capital is eventually returned to the investor as the fund starts to turn a profit. In this way a private equity investment boils down to two cash flows: contributions to the fund and distributions from the fund to the investor. These cash flows are usually made within a prespecified time frame but at unspecified intervals and in unspecified amounts. For an investor in a fund, carrying too few liquid assets when contributions are called upon causes trouble, but carrying significantly more than needed is also undesirable, as it represents forgone revenue from holding less capital in more profitable investments. The goal of this thesis was to find a way to reliably model these cash flows and to represent the results in a way that is meaningful to the investor, by constructing Value-at-Risk-like measures of the liquid capital to carry at a given time in case contributions are called upon. It was found that the distributions could be modelled very well with the chosen stochastic processes, both in predicting the average path of the cash flows and in modelling their variability. In contrast, the contributions could not be modelled well. The reason was an observed lag in the pace of contributions at the start of the fund's lifetime; this lag was not taken into account when constructing the stochastic model, which therefore produced simulated cash flows not in line with those used in the calibration.
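
The structure of the modelling problem, simulating the two cash flow streams and reading off a quantile of the capital that may be called, can be sketched as follows. The drawdown and distribution rates, the lognormal noise, and the quarterly grid are illustrative assumptions in the spirit of simple deterministic-rate models with random scaling; they are not the stochastic processes calibrated in the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_fund(commitment=100.0, quarters=40, n_paths=10_000,
                  draw_rate=0.2, dist_rate=0.15, growth=0.02, sigma=0.25):
    """Simulate contribution and distribution cash flows for one fund.

    Each quarter a random fraction of the remaining commitment is called,
    and a fraction of current NAV is distributed back.  All rates and the
    lognormal noise are illustrative placeholders.
    """
    undrawn = np.full(n_paths, commitment)
    nav = np.zeros(n_paths)
    contributions = np.zeros((quarters, n_paths))
    distributions = np.zeros((quarters, n_paths))
    for t in range(quarters):
        call = draw_rate * undrawn * rng.lognormal(mean=0.0, sigma=sigma, size=n_paths)
        call = np.minimum(call, undrawn)           # cannot call more than the commitment
        undrawn -= call
        nav = nav * (1 + growth) * rng.lognormal(0.0, sigma, n_paths) + call
        payout = dist_rate * (t / quarters) * nav  # distributions ramp up over the fund's life
        nav -= payout
        contributions[t], distributions[t] = call, payout
    return contributions, distributions

contrib, dist = simulate_fund()
# A VaR-like liquidity measure: the 95th percentile of next quarter's capital calls.
print("95% quantile of quarter-1 contributions:", np.quantile(contrib[0], 0.95))
```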
123

Modelling Credit Spread Risk in the Banking Book (CSRBB) / Modellering av kreditspreadrisken i bankboken (CSRBB)

Pahne, Elsa, Åkerlund, Louise January 2023 (has links)
Risk measurement tools and strategies have until recently been calibrated for a low-for-long interest rate environment. In the current higher interest rate environment, however, banking supervisory entities have intensified their regulatory pressure on institutions to enhance their assessment and monitoring of interest rate risk and credit spread risk. The European Banking Authority (EBA) has released updated guidelines on the assessment and monitoring of Credit Spread Risk in the Banking Book (CSRBB), which will replace the current guidelines by 31 December 2023. The new guidelines identify the CSRBB as a separate risk category apart from Interest Rate Risk in the Banking Book (IRRBB) and specify the inclusion of liabilities in the risk calculations. This paper proposes a CSRBB model that conforms to the updated EBA guidelines. The model uses a historical simulation Value-at-Risk (HSVaR) and Expected Shortfall (ES) approach and includes a 90-day holding period, as suggested by Finansinspektionen (FI). To assess the effectiveness of the model, it is compared with FI's standardised model and subjected to backtesting. Additionally, the paper suggests modifications to the model to obtain more conservative results.
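
The core calculation, a historical-simulation VaR and Expected Shortfall of credit spread widenings over a 90-day holding period, can be sketched as below. The synthetic spread-change series and the use of overlapping 90-day sums are illustrative assumptions; the thesis's instrument mapping and any EBA-specific adjustments are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic daily credit spread changes (in basis points) standing in for history.
daily_changes = rng.standard_t(df=4, size=2500) * 2.0

def hs_var_es(changes, holding_days=90, alpha=0.99):
    """Historical-simulation VaR and ES of spread widening over a holding period.

    Overlapping rolling sums approximate 90-day changes; positive widenings are
    treated as the loss direction for a long credit position.
    """
    losses = np.convolve(changes, np.ones(holding_days), mode="valid")  # overlapping sums
    var = np.quantile(losses, alpha)
    es = losses[losses >= var].mean()
    return var, es

var99, es99 = hs_var_es(daily_changes)
print(f"90-day 99% HS-VaR: {var99:.1f} bp, ES: {es99:.1f} bp")
```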
124

Essays on Computational Problems in Insurance

Ha, Hongjun 31 July 2016 (has links)
This dissertation consists of two chapters. The first chapter establishes an algorithm for calculating capital requirements. The calculation of capital requirements for financial institutions usually entails a reevaluation of the company's assets and liabilities at some future point in time for a (large) number of stochastic forecasts of economic and firm-specific variables. The complexity of this nested valuation problem leads many companies to struggle with the implementation. The current chapter proposes and analyzes a novel approach to this computational problem based on least-squares regression and Monte Carlo simulations. Our approach is motivated by a well-known method for pricing non-European derivatives. We study convergence of the algorithm and analyze the resulting estimate for practically important risk measures. Moreover, we address the problem of how to choose the regressors, and show that an optimal choice is given by the left singular functions of the corresponding valuation operator. Our numerical examples demonstrate that the algorithm can produce accurate results at relatively low computational costs, particularly when relying on the optimal basis functions. The second chapter discusses another application of regression-based methods, in the context of pricing variable annuities. Advanced life insurance products with exercise-dependent financial guarantees present challenging problems in view of pricing and risk management. In particular, due to the complexity of the guarantees and since practical valuation frameworks include a variety of stochastic risk factors, conventional methods that are based on the discretization of the underlying (Markov) state space may not be feasible. As a practical alternative, this chapter explores the applicability of Least-Squares Monte Carlo (LSM) methods familiar from American option pricing in this context. Unlike previous literature we consider optionality beyond surrendering the contract, where we focus on popular withdrawal benefits - so-called GMWBs - within Variable Annuities. We introduce different LSM variants, particularly the regression-now and regression-later approaches, and explore their viability and potential pitfalls. We commence our numerical analysis in a basic Black-Scholes framework, where we compare the LSM results to those from a discretization approach. We then extend the model to include various relevant risk factors and compare the results to those from the basic framework.
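
The least-squares Monte Carlo idea in the first chapter, replacing a nested simulation with a regression of noisy inner valuations on outer-scenario state variables, can be illustrated with a toy one-factor example. The single risk factor, the put-like liability payoff, and the polynomial basis are illustrative stand-ins, not the dissertation's valuation operator or its optimal singular-function basis.

```python
import numpy as np

rng = np.random.default_rng(3)

# Outer scenarios: value of a single risk factor at the risk horizon (one year).
n_outer = 5000
S1 = 100.0 * np.exp(rng.normal(-0.02, 0.2, n_outer))

# One (noisy) inner sample per outer scenario of the time-1 liability value:
# a discounted put-like payoff serves as a simple stand-in.
Z = rng.normal(size=n_outer)
payoff = np.maximum(95.0 - S1 * np.exp(-0.02 + 0.2 * Z), 0.0)
inner_sample = np.exp(-0.01) * payoff

# Regress the noisy inner samples on polynomial basis functions of the state S1.
degree = 4
coeffs = np.polyfit(S1, inner_sample, deg=degree)
value_at_horizon = np.polyval(coeffs, S1)   # fitted conditional expectation per scenario

# A capital-requirement-style figure: 99.5% quantile of the horizon value distribution.
print("99.5% quantile of regressed horizon values:", np.quantile(value_at_horizon, 0.995))
```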
125

Managing the extremes : An application of extreme value theory to financial risk management

Strömqvist, Zakris, Petersen, Jesper January 2016 (has links)
We compare the traditional GARCH models with a semiparametric approach based on extreme value theory and find that the semiparametric approach yields more accurate predictions of Value-at-Risk (VaR). Using traditional parametric approaches based on GARCH and EGARCH to model the conditional volatility, we calculate univariate one-day ahead predictions of Value-at-Risk (VaR) under varying distributional assumptions. The accuracy of these predictions is then compared to that of a semiparametric approach, based on results from extreme value theory. For the 95% VaR, the EGARCH’s ability to incorporate the asymmetric behaviour of return volatility proves most useful. For higher quantiles, however, we show that what matters most for predictive accuracy is the underlying distributional assumption of the innovations, where the normal distribution falls behind other distributions which allow for thicker tails. Both the semiparametric approach and the conditional volatility models based on the t-distribution outperform the normal, especially at higher quantiles. As for the comparison between the semiparametric approach and the conditional volatility models with t-distributed innovations, the results are mixed. However, the evidence indicates that there certainly is a place for extreme value theory in financial risk management.
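
A minimal sketch of a semiparametric VaR of this kind, filtering returns with a volatility model and fitting a generalized Pareto tail to the standardized residuals, is shown below. An EWMA filter stands in for the fitted GARCH/EGARCH models and the returns are simulated, so the sketch only illustrates the structure of the approach.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(4)
returns = rng.standard_t(df=5, size=3000) * 0.01      # synthetic daily returns

# EWMA volatility filter as a simple stand-in for a fitted (E)GARCH model.
lam, sigma2 = 0.94, np.empty(len(returns))
sigma2[0] = returns.var()
for t in range(1, len(returns)):
    sigma2[t] = lam * sigma2[t - 1] + (1 - lam) * returns[t - 1] ** 2
z = returns / np.sqrt(sigma2)                          # standardized residuals

# Fit a GPD to standardized losses beyond a high empirical threshold.
losses = -z
u = np.quantile(losses, 0.95)
exceed = losses[losses > u] - u
xi, _, beta = genpareto.fit(exceed, floc=0.0)

def evt_quantile(q, losses, u, xi, beta):
    """Peaks-over-threshold quantile estimator for the standardized loss distribution."""
    n, n_u = len(losses), np.sum(losses > u)
    return u + beta / xi * ((n / n_u * (1 - q)) ** (-xi) - 1.0)

z_q = evt_quantile(0.99, losses, u, xi, beta)
sigma_next = np.sqrt(lam * sigma2[-1] + (1 - lam) * returns[-1] ** 2)
print("one-day 99% EVT-based VaR (return units):", sigma_next * z_q)
```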
126

Coherent Beta Risk Measures for Capital Requirements

Wirch, Julia Lynn January 1999 (has links)
This thesis compares insurance premium principles with current financial risk paradigms and uses distorted probabilities, a recent development in premium principle literature, to synthesize the current models for financial risk measures in banking and insurance. This work attempts to broaden the definition of value-at-risk beyond the percentile measures. Examples are used to show how the percentile measure fails to give consistent results, and how it can be manipulated. A new class of consistent risk measures is investigated.
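
The distorted-probability construction behind such measures, a Choquet integral of the losses against a concave distortion g applied to the survival function, can be computed directly on an empirical sample. In the sketch below the distortion is a beta cumulative distribution function with placeholder parameters; the percentile measure (VaR), which the thesis shows can give inconsistent results, corresponds to a step-function distortion.

```python
import numpy as np
from scipy.stats import beta as beta_dist

def distortion_risk_measure(losses, g):
    """Empirical Choquet integral: rho(X) = sum_i x_(i) * [g(S_(i-1)) - g(S_(i))].

    losses : sample of losses (larger = worse)
    g      : distortion function on [0, 1], increasing with g(0) = 0 and g(1) = 1
    """
    x = np.sort(losses)
    n = len(x)
    surv = 1.0 - np.arange(n + 1) / n           # empirical survival levels 1, ..., 0
    weights = g(surv[:-1]) - g(surv[1:])        # mass assigned to each order statistic
    return np.sum(x * weights)

rng = np.random.default_rng(5)
losses = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)

# A concave beta-cdf distortion (illustrative parameters) versus the plain expectation
# and a 95% percentile (VaR).
g_beta = lambda u: beta_dist.cdf(u, a=0.5, b=1.0)
print("expected loss         :", losses.mean())
print("beta-distorted measure:", distortion_risk_measure(losses, g_beta))
print("95% VaR (percentile)  :", np.quantile(losses, 0.95))
```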
127

Supply chain network design under uncertainty and risk

Hollmann, Dominik January 2011 (has links)
We consider the research problem of quantitative support for decision making in supply chain network design (SCND). We first identify the requirements for a comprehensive SCND as (i) a methodology to select uncertainties, (ii) a stochastic optimisation model, and (iii) an appropriate solution algorithm. We propose a process to select a manageable number of uncertainties to be included in a stochastic program for SCND. We develop a comprehensive two-stage stochastic program for SCND that includes uncertainty in demand, currency exchange rates, labour costs, productivity, supplier costs, and transport costs. Also, we consider conditional value at risk (CV@R) to explore the trade-off between risk and return. We use a scenario generator based on moment matching to represent the multivariate uncertainty. The resulting stochastic integer program is computationally challenging and we propose a novel iterative solution algorithm called adaptive scenario refinement (ASR) to process the problem. We describe the rationale underlying ASR, validate it for a set of benchmark problems, and discuss the benefits of the algorithm applied to our SCND problem. Finally, we demonstrate the benefits of the proposed model in a case study and show that multiple sources of uncertainty and risk are important to consider in the SCND. Whereas in the literature most research is on demand uncertainty, our study suggests that exchange rate uncertainty is more important for the choice of optimal supply chain strategies in international production networks. The SCND model and the use of the coherent downside risk measure in the stochastic program are innovative and novel; these and the ASR solution algorithm taken together make contributions to knowledge.
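
The CV@R term used to explore the risk-return trade-off admits the Rockafellar-Uryasev representation CVaR_alpha(C) = min_eta { eta + E[(C - eta)+] / (1 - alpha) }, which is also what makes it linearisable inside a two-stage stochastic program. The sketch below checks this representation against a direct tail average on synthetic scenario costs; the cost distribution is a placeholder, not output of the SCND model.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(6)
scenario_costs = rng.gamma(shape=3.0, scale=10.0, size=20_000)  # synthetic total costs
alpha = 0.95

# Direct estimate: mean cost in the worst (1 - alpha) tail.
var_alpha = np.quantile(scenario_costs, alpha)
cvar_direct = scenario_costs[scenario_costs >= var_alpha].mean()

# Rockafellar-Uryasev representation: CVaR = min_eta eta + E[(C - eta)+] / (1 - alpha).
ru_objective = lambda eta: eta + np.mean(np.maximum(scenario_costs - eta, 0.0)) / (1 - alpha)
res = minimize_scalar(ru_objective, bounds=(0.0, scenario_costs.max()), method="bounded")

print(f"VaR_95 = {var_alpha:.2f}")
print(f"CVaR_95 (tail mean)           = {cvar_direct:.2f}")
print(f"CVaR_95 (Rockafellar-Uryasev) = {res.fun:.2f}")
```

In the two-stage program the same minimisation is linearised by introducing one auxiliary variable per scenario, so the risk term fits into the mixed-integer formulation without changing the solver requirements.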
128

Topics in financial market risk modelling

Ma, Zishun January 2012 (has links)
The growth of the financial risk management industry has been motivated by the increased volatility of financial markets combined with the rapid innovation of derivatives. Since the 1970s, several financial crises have occurred globally with devastating consequences for financial and non-financial institutions and for the real economy. The most recent US subprime crisis led to enormous losses for financial and non-financial institutions and to a recession in many countries including the US and UK. A common lesson from these crises is that advanced financial risk management systems are required. Financial risk management is a continuous process of identifying, modeling, forecasting and monitoring risk exposures arising from financial investments. The Value at Risk (VaR) methodology has served as one of the most important tools used in this process. This quantitative tool, first introduced by JPMorgan in its RiskMetrics system in 1995, has undergone a considerable revolution and development during the last 15 years. It has now become one of the most prominent tools employed by financial institutions, regulators, asset managers and non-financial corporations for risk measurement. My PhD research undertakes a comprehensive and practical study of market risk modeling in modern finance using the VaR methodology. Two newly developed risk models are proposed in this research, which are derived by integrating volatility modeling and the quantile regression technique. Compared to the existing risk models, these two new models place more emphasis on dynamic risk adjustment. The empirical results on both real and simulated data show that under certain circumstances, the risk prediction generated from these models is more accurate and efficient in capturing time-varying risk evolution than traditional risk measures. Academically, the aim of this research is to make some improvements and extensions of the existing market risk modeling techniques. In practice, the purpose of this research is to support risk managers in developing a dynamic market risk measurement system, which will function well for different market states and asset categories. The system can be used by financial institutions and non-financial institutions for either passive risk measurement or active risk control.
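
The combination of volatility modelling and quantile regression can be illustrated by estimating a simple conditional-quantile specification, q_t = b0 + b1*|r_{t-1}|, through minimization of the pinball (check) loss. Both the specification and the simulated return series below are illustrative; the two risk models proposed in the thesis are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)

# Simulated returns with clustered volatility so yesterday's magnitude is informative.
n = 3000
sigma = np.empty(n)
r = np.empty(n)
sigma[0], r[0] = 0.01, 0.0
for t in range(1, n):
    sigma[t] = np.sqrt(1e-6 + 0.9 * sigma[t - 1] ** 2 + 0.09 * r[t - 1] ** 2)
    r[t] = sigma[t] * rng.standard_normal()

q = 0.05                       # 5% return quantile, i.e. the 95% VaR on the loss side
x = np.abs(r[:-1])             # regressor: yesterday's absolute return
y = r[1:]                      # response: today's return

def pinball_loss(params):
    """Average check loss of the linear conditional-quantile model."""
    b0, b1 = params
    u = y - (b0 + b1 * x)
    return np.mean(np.maximum(q * u, (q - 1) * u))

fit = minimize(pinball_loss, x0=np.array([-0.01, -1.0]), method="Nelder-Mead")
b0, b1 = fit.x
var_next = -(b0 + b1 * np.abs(r[-1]))   # report VaR as a positive loss number
print(f"fitted quantile model: q_t = {b0:.4f} + {b1:.3f} * |r_(t-1)|")
print(f"next-day 95% VaR estimate: {var_next:.4f}")
```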
129

Caveat Emptor: Does Bitcoin Improve Portfolio Diversification?

Gasser, Stephan, Eisl, Alexander, Weinmayer, Karl January 2014 (has links)
Bitcoin is an unregulated digital currency originally introduced in 2008 without legal tender status. Based on a decentralized peer-to-peer network to confirm transactions and generate a limited amount of new bitcoins, it functions without the backing of a central bank or any other monitoring authority. In recent years, Bitcoin has seen increasing media coverage and trading volume, as well as major capital gains and losses in a high-volatility environment. Interestingly, an analysis of Bitcoin returns shows remarkably low correlations with traditional investment assets such as other currencies, stocks, bonds or commodities such as gold or oil. In this paper, we shed light on the impact an investment in Bitcoin can have on an already well-diversified investment portfolio. Due to the non-normal nature of Bitcoin returns, we do not apply the classic mean-variance approach but instead adopt a Conditional Value-at-Risk framework that does not require asset returns to be normally distributed. Our results indicate that Bitcoin should be included in optimal portfolios. Even though an investment in Bitcoin increases the CVaR of a portfolio, this additional risk is more than offset by high returns, leading to better return-risk ratios.
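
The Conditional Value-at-Risk optimization can be cast as a linear program via the Rockafellar-Uryasev linearization. The sketch below minimizes portfolio CVaR subject to a target expected return on synthetic scenario returns for five assets, with the last asset playing the role of a volatile, weakly correlated addition such as Bitcoin; the scenarios, return target, and long-only constraint are placeholders rather than the paper's data or settings.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(8)
n_assets, n_scen, alpha = 5, 2000, 0.95

# Synthetic daily return scenarios; the last column is a high-volatility asset.
mu_true = np.array([0.0002, 0.0003, 0.0002, 0.0001, 0.002])
vol = np.array([0.01, 0.012, 0.009, 0.006, 0.05])
R = mu_true + vol * rng.standard_normal((n_scen, n_assets))
mu_hat = R.mean(axis=0)
target = 0.0004                                    # required expected portfolio return

# Decision vector x = [w_1..w_n, eta, u_1..u_S]; minimize eta + sum(u)/((1-alpha)S).
c = np.concatenate([np.zeros(n_assets), [1.0],
                    np.full(n_scen, 1.0 / ((1 - alpha) * n_scen))])

# u_s >= -R_s.w - eta   becomes   -R_s.w - eta - u_s <= 0
A_ub = np.hstack([-R, -np.ones((n_scen, 1)), -np.eye(n_scen)])
b_ub = np.zeros(n_scen)
# Expected return constraint: -mu.w <= -target
A_ub = np.vstack([A_ub, np.concatenate([-mu_hat, [0.0], np.zeros(n_scen)])])
b_ub = np.append(b_ub, -target)
# Budget constraint: sum of weights = 1
A_eq = np.concatenate([np.ones(n_assets), [0.0], np.zeros(n_scen)]).reshape(1, -1)
b_eq = np.array([1.0])
bounds = [(0.0, 1.0)] * n_assets + [(None, None)] + [(0.0, None)] * n_scen

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
weights = res.x[:n_assets]
print("optimal weights:", np.round(weights, 3))
print("minimized 95% CVaR of daily loss:", round(res.fun, 5))
```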
130

Bivariate extreme value analysis of commodity prices

Joyce, Matthew 21 April 2017 (has links)
The crude oil, natural gas, and electricity markets are among the most widely traded and talked about commodity markets across the world. Over the past two decades each commodity has seen price volatility due to political, economic, social, and technological reasons. With that comes a significant amount of risk that both corporations and governments must account for to ensure expected cash flows and to minimize losses. This thesis analyzes the portfolio risk of the major US commodity hubs for crude oil, natural gas and electricity by applying Extreme Value Theory to historical daily price returns between 2003 and 2013. The risk measures used are Value-at-Risk and Expected Shortfall, estimated by fitting the Generalized Pareto Distribution to the data using the peaks-over-threshold method. We consider both the univariate and bivariate cases in order to determine the effects that price shocks within and across commodities will have in a mixed portfolio. The results show that electricity is the most volatile, and therefore most risky, commodity of the three markets considered for both positive and negative returns. In addition, we find that the univariate and bivariate results are statistically indistinguishable, leading to the conclusion that for the three markets analyzed during this period, price shocks in one commodity do not directly impact the volatility of another commodity's price.
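
The peaks-over-threshold machinery behind the reported risk measures can be sketched for a single univariate series: fit a Generalized Pareto Distribution to losses above a high threshold and plug the estimates into the standard VaR and Expected Shortfall formulas. The simulated returns and the choice of the 95th percentile as threshold are illustrative assumptions, and the bivariate analysis is not reproduced.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(9)
losses = -rng.standard_t(df=3, size=2600) * 0.02   # synthetic daily returns, loss side

# Peaks over threshold: fit a GPD to exceedances above a high empirical quantile.
u = np.quantile(losses, 0.95)
exceedances = losses[losses > u] - u
xi, _, beta = genpareto.fit(exceedances, floc=0.0)
n, n_u = len(losses), len(exceedances)

def pot_var_es(q):
    """GPD-based VaR and ES at confidence level q (formulas valid for xi < 1)."""
    var = u + beta / xi * ((n / n_u * (1 - q)) ** (-xi) - 1.0)
    es = var / (1 - xi) + (beta - xi * u) / (1 - xi)
    return var, es

for q in (0.99, 0.999):
    var_q, es_q = pot_var_es(q)
    print(f"{q:.1%}: VaR = {var_q:.4f}, ES = {es_q:.4f}")
```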
