  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
121

Optimalizace parametrů zajištění v pojišťovnictví / Optimization of reinsurance parameters in insurance

Dlouhá, Veronika January 2017
This thesis is dedicated to finding optimal reinsurance parameters, with a focus on quota-share and stop-loss reinsurance. The optimization is based on minimizing the Value at Risk and Conditional Value at Risk of the insurer's total costs for the received risk. The thesis also presents the compound random variable and shows various methods of obtaining its probability distribution, for example approximation by lognormal or gamma mixture distributions, or the Panjer recursion for continuous severity together with a numerical method for its solution. The thesis concludes with the calculation of optimal reinsurance parameters for a compound random variable based on real data, using various methods to determine the probability distribution and premiums.
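The Panjer recursion mentioned in the abstract has a particularly simple form for discrete severities. Below is a minimal sketch for a compound Poisson total; the helper name and parameters are illustrative, and the thesis itself also treats continuous severity via a numerical method:

```python
import math

def panjer_compound_poisson(lam, severity, s_max):
    """Pmf of the aggregate loss S = X1 + ... + XN, N ~ Poisson(lam),
    with iid severities on {1, 2, ...} given by the dict `severity`.
    Panjer's recursion for the Poisson case:
      g_0 = exp(-lam),  g_s = (lam / s) * sum_j j * f_j * g_{s-j}."""
    g = [math.exp(-lam)]
    for s in range(1, s_max + 1):
        g.append(lam / s * sum(j * severity.get(j, 0.0) * g[s - j]
                               for j in range(1, s + 1)))
    return g

# Illustrative: Poisson(2) claim count, severities 1 or 2 with prob 0.6 / 0.4
pmf = panjer_compound_poisson(2.0, {1: 0.6, 2: 0.4}, 60)
```

The recursion reproduces the exact compound distribution on a grid; a quick sanity check is that the resulting mean equals λE[X] = 2 · 1.4 = 2.8.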
122

[en] SIMULATION AND STOCHASTIC OPTIMIZATION FOR ENERGY CONTRACTING OF LARGE CONSUMERS / [pt] SIMULAÇÃO E OTIMIZAÇÃO ESTOCÁSTICA PARA CONTRATAÇÃO DE ENERGIA ELÉTRICA DE GRANDES CONSUMIDORES

EIDY MARIANNE MATIAS BITTENCOURT 09 November 2016
[en] The energy contracting of large consumers in Brazil is done according to voltage level and considers two environments: the Regulated Environment and the Free Environment. Large consumers are those with an installed load of 3 MW or more, supplied at any voltage level, and their energy can be contracted in either of these two environments. A major challenge for these consumers is to determine the best contracting alternative. To address this problem, one must take into account that energy consumption and required power demand are unknown at the time of contracting, so they must be estimated. This dissertation proposes to tackle the problem with a methodology based on simulating future scenarios of maximum power demand and total consumed energy, and on stochastic optimization over these simulated scenarios to define the best contract. Given the stochastic nature of the problem, CVaR (Conditional Value at Risk) was used as the risk measure in the optimization problem. To illustrate, contracting results were obtained for a real large consumer, considering the Green Tariff group A4 in the Regulated Environment and a quantity contract in the Free Environment.
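CVaR-based scenario optimization of this kind typically rests on the Rockafellar-Uryasev representation, which makes CVaR a convex function of the decision variables. A minimal empirical sketch over equally likely scenarios (the function name and scenario costs are illustrative, not from the dissertation):

```python
def cvar_rockafellar_uryasev(losses, alpha):
    """Empirical CVaR_alpha via  CVaR = min_t  t + E[(L - t)+] / (1 - alpha).
    For an empirical distribution the minimum is attained at the VaR,
    so scanning t over the scenario losses themselves suffices."""
    n = len(losses)

    def objective(t):
        return t + sum(max(l - t, 0.0) for l in losses) / ((1 - alpha) * n)

    return min(objective(t) for t in losses)

# 100 equally likely cost scenarios: CVaR_0.95 is the mean of the 5 worst
cvar_95 = cvar_rockafellar_uryasev(list(range(1, 101)), 0.95)  # ~98.0
```

In a full stochastic program the auxiliary variable t and the positive parts become decision variables and linear constraints, so CVaR can be minimized with an LP solver.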
123

Distributional Dynamics of Fama-French Factors in European Markets / Tidsvarierande fördelningar för Fama-French-faktorer på europeiska marknader

Löfgren, Wilmer January 2020
The three-factor model of Fama and French has proved a seminal contribution to asset pricing theory, and was recently extended with two more factors, yielding the Fama-French five-factor model. Other proposed augmentations of the three-factor model include the momentum factor introduced by Carhart. The extensive use of such factors in asset pricing theory and investing motivates the study of the distributional properties of their returns. Previous studies, however, have focused on subsets of these six factors in the U.S. market. In this thesis, the distributional properties of the daily log-returns of the five Fama-French factors and the Carhart momentum factor are examined in European data from 2009 to 2019. The univariate distributional dynamics of the factor log-returns are modelled as ARMA-NGARCH processes with skewed t-distributed driving noise sequences. Gaussian and t copulas are then used to model the joint distributions of these factor log-returns. The models developed are applied to estimate the one-day-ahead Value-at-Risk (VaR) on testing data. The VaR estimates are backtested for correct unconditional coverage and for exponentially distributed durations between exceedances. The results suggest that the ARMA-NGARCH processes are a valid approximation of the factor log-returns and lead to good estimates of the VaR. The results of the multivariate analysis suggest that constant Gaussian and t copulas might be insufficient to model the dependence structure of the factors, and that more flexible copula models with dynamic correlations between factor log-returns might be needed.
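The backtest for correct unconditional coverage mentioned above is commonly Kupiec's proportion-of-failures likelihood-ratio test; the companion test for exponentially distributed durations between exceedances is a separate construction not shown here. A sketch, with a hypothetical helper name:

```python
import math

def kupiec_pof(n, x, p):
    """Kupiec proportion-of-failures LR statistic for x VaR exceedances
    in n days at target exceedance probability p.  Under correct
    coverage LR ~ chi2(1): reject at the 5% level if LR > 3.841."""
    if x == 0:
        return -2.0 * n * math.log(1.0 - p)  # limit of the LR as phat -> 0
    phat = x / n
    log_l0 = (n - x) * math.log(1.0 - p) + x * math.log(p)
    log_l1 = (n - x) * math.log(1.0 - phat) + x * math.log(phat)
    return -2.0 * (log_l0 - log_l1)

# 3 exceedances of a 99% VaR in 250 days (2.5 expected): no rejection
lr = kupiec_pof(250, 3, 0.01)  # ~0.095, well below 3.841
```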
124

Stochastic Modelling of Cash Flows in Private Equity / Stokastisk modellering av kassaflöden i private equity

Ungsgård, Oscar January 2020
A private equity investment is an investment in a financial asset that is not publicly traded. Such assets are very difficult to value, and quantifying their risk is equally difficult. In a typical private equity investment the investor commits a prespecified amount of capital to a fund; this capital is called upon as needed by the fund, and capital is eventually returned to the investor as the fund starts to turn a profit. A private equity investment can thus be boiled down to two cash flows: contributions to the fund and distributions from the fund to the investor. These cash flows usually occur within a prespecified time frame but at unspecified intervals and in unspecified amounts. An investor who carries too few liquid assets when contributions are called upon runs into trouble, but carrying significantly more than needed is also undesirable, as it represents a loss of potential revenue from holding less capital in more profitable investments. The goal of this thesis was to find a way to reliably model these cash flows and to represent the results in a way that is meaningful to the investor, by constructing Value-at-Risk-like measures of the liquid capital to carry at a given time in case contributions are called upon. It was found that the distributions could be modelled very well with the chosen stochastic processes, both in predicting the average path of the cash flows and in modelling their variability. The contributions, in contrast, could not be modelled well. The reason was an observed lag in the speed of contributions at the start of a fund's lifetime; this lag was not taken into account in the stochastic model, which therefore produced simulated cash flows not in line with those used in the calibration.
125

Modelling Credit Spread Risk in the Banking Book (CSRBB) / Modellering av kreditspreadrisken i bankboken (CSRBB)

Pahne, Elsa, Åkerlund, Louise January 2023
Risk measurement tools and strategies have until recently been calibrated for a low-for-long interest rate environment. In the current higher interest rate environment, however, banking supervisory entities have intensified their regulatory pressure on institutions to enhance their assessment and monitoring of interest rate risk and credit spread risk. The European Banking Authority (EBA) has released updated guidelines on the assessment and monitoring of Credit Spread Risk in the Banking Book (CSRBB), which will replace the current guidelines by 31 December 2023. The new guidelines identify CSRBB as a risk category separate from Interest Rate Risk in the Banking Book (IRRBB) and specify the inclusion of liabilities in the risk calculations. This paper proposes a CSRBB model that conforms to the updated EBA guidelines. The model uses a historical simulation Value at Risk (HSVaR) and Expected Shortfall (ES) approach with a 90-day holding period, as suggested by Finansinspektionen (FI). To assess its effectiveness, the model is compared with a standardised model of FI and subjected to backtesting. Additionally, the paper suggests modifications to the model to obtain more conservative results.
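The HSVaR and ES figures at the core of such a model reduce to order statistics of the historical loss sample; the 90-day holding period would enter through how the loss series is built, e.g. from overlapping 90-day credit-spread changes. A minimal sketch with an illustrative function name and data:

```python
def hs_var_es(losses, alpha):
    """Historical-simulation VaR and ES at level alpha: VaR is the
    empirical alpha-quantile of the losses, ES the average of the
    losses at or beyond it."""
    s = sorted(losses)
    k = min(round(alpha * len(s)), len(s) - 1)  # index of the alpha-quantile
    tail = s[k:]
    return s[k], sum(tail) / len(tail)

# 20 historical losses: the 90% VaR is the 19th smallest observation,
# and the ES averages the losses from the VaR outward
var90, es90 = hs_var_es(list(range(1, 21)), 0.90)  # (19, 19.5)
```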
126

Essays on Computational Problems in Insurance

Ha, Hongjun 31 July 2016
This dissertation consists of two chapters. The first chapter establishes an algorithm for calculating capital requirements. The calculation of capital requirements for financial institutions usually entails a reevaluation of the company's assets and liabilities at some future point in time for a (large) number of stochastic forecasts of economic and firm-specific variables. The complexity of this nested valuation problem leads many companies to struggle with the implementation. The current chapter proposes and analyzes a novel approach to this computational problem based on least-squares regression and Monte Carlo simulations. Our approach is motivated by a well-known method for pricing non-European derivatives. We study convergence of the algorithm and analyze the resulting estimate for practically important risk measures. Moreover, we address the problem of how to choose the regressors, and show that an optimal choice is given by the left singular functions of the corresponding valuation operator. Our numerical examples demonstrate that the algorithm can produce accurate results at relatively low computational costs, particularly when relying on the optimal basis functions. The second chapter discusses another application of regression-based methods, in the context of pricing variable annuities. Advanced life insurance products with exercise-dependent financial guarantees present challenging problems in view of pricing and risk management. In particular, due to the complexity of the guarantees and since practical valuation frameworks include a variety of stochastic risk factors, conventional methods that are based on the discretization of the underlying (Markov) state space may not be feasible. As a practical alternative, this chapter explores the applicability of Least-Squares Monte Carlo (LSM) methods familiar from American option pricing in this context. 
Unlike the previous literature, we consider optionality beyond surrendering the contract, where we focus on popular withdrawal benefits - so-called GMWBs - within Variable Annuities. We introduce different LSM variants, particularly the regression-now and regression-later approaches, and explore their viability and potential pitfalls. We commence our numerical analysis in a basic Black-Scholes framework, where we compare the LSM results to those from a discretization approach. We then extend the model to include various relevant risk factors and compare the results to those from the basic framework.
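The regression-now LSM variant follows the Longstaff-Schwartz pattern from American option pricing: simulate paths, then work backwards, regressing discounted continuation cashflows on the current state. A sketch for a plain Bermudan put under Black-Scholes rather than the GMWB contract itself; all names and parameters are illustrative:

```python
import numpy as np

def lsm_put_prices(s0=100.0, k=100.0, r=0.06, sigma=0.2, t=1.0,
                   steps=50, n_paths=10000, seed=0):
    """Bermudan put via Longstaff-Schwartz regression-now LSM; the
    European price from the same paths is returned as a sanity check."""
    rng = np.random.default_rng(seed)
    dt = t / steps
    z = rng.standard_normal((n_paths, steps))
    s = s0 * np.exp(np.cumsum((r - 0.5 * sigma ** 2) * dt
                              + sigma * np.sqrt(dt) * z, axis=1))
    cf = np.maximum(k - s[:, -1], 0.0)            # cashflow if held to maturity
    for j in range(steps - 2, -1, -1):            # backward induction
        cf *= np.exp(-r * dt)                     # discount cashflows to time j
        itm = k - s[:, j] > 0.0                   # regress on in-the-money paths
        if itm.sum() < 10:
            continue
        coef = np.polyfit(s[itm, j], cf[itm], 2)  # continuation-value regression
        continuation = np.polyval(coef, s[itm, j])
        exercise = k - s[itm, j]
        ex_rows = np.where(itm)[0][exercise > continuation]
        cf[ex_rows] = k - s[ex_rows, j]           # exercise where optimal
    american = float(np.exp(-r * dt) * cf.mean())
    european = float(np.exp(-r * t) * np.maximum(k - s[:, -1], 0.0).mean())
    return american, european
```

A quadratic polynomial in the asset price is used as the regression basis here purely for simplicity; the early-exercise value shows up as the gap between the two returned prices.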
127

Managing the extremes : An application of extreme value theory to financial risk management

Strömqvist, Zakris, Petersen, Jesper January 2016
We compare the traditional GARCH models with a semiparametric approach based on extreme value theory and find that the semiparametric approach yields more accurate predictions of Value-at-Risk (VaR). Using traditional parametric approaches based on GARCH and EGARCH to model the conditional volatility, we calculate univariate one-day-ahead predictions of VaR under varying distributional assumptions. The accuracy of these predictions is then compared to that of a semiparametric approach based on results from extreme value theory. For the 95% VaR, the EGARCH's ability to incorporate the asymmetric behaviour of return volatility proves most useful. For higher quantiles, however, we show that what matters most for predictive accuracy is the underlying distributional assumption of the innovations, where the normal distribution falls behind other distributions that allow for thicker tails. Both the semiparametric approach and the conditional volatility models based on the t-distribution outperform the normal, especially at higher quantiles. As for the comparison between the semiparametric approach and the conditional volatility models with t-distributed innovations, the results are mixed. However, the evidence indicates that there certainly is a place for extreme value theory in financial risk management.
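Semiparametric EVT approaches of this kind typically use the peaks-over-threshold method: fit a generalised Pareto distribution (GPD) to the losses exceeding a high threshold and read tail quantiles off the fitted tail. A sketch of the resulting VaR and ES formulas, with assumed (not estimated) GPD parameters:

```python
def gpd_var_es(u, xi, beta, n, n_u, q):
    """Peaks-over-threshold tail measures: u is the threshold, (xi, beta)
    the GPD shape/scale fitted to the n_u exceedances among n losses,
    q the confidence level.  Assumes xi != 0 and, for the ES, xi < 1."""
    var = u + beta / xi * (((1.0 - q) * n / n_u) ** (-xi) - 1.0)
    es = (var + beta - xi * u) / (1.0 - xi)
    return var, es

# Illustrative fitted values: threshold 2.0, xi = 0.2, beta = 0.5,
# 50 exceedances among 1000 observations, 99% confidence level
var99, es99 = gpd_var_es(2.0, 0.2, 0.5, 1000, 50, 0.99)
```

In a conditional (GARCH-EVT) setting these tail quantiles would be applied to the standardised residuals and scaled by the forecast volatility.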
128

Coherent Beta Risk Measures for Capital Requirements

Wirch, Julia Lynn January 1999
This thesis compares insurance premium principles with current financial risk paradigms and uses distorted probabilities, a recent development in premium principle literature, to synthesize the current models for financial risk measures in banking and insurance. This work attempts to broaden the definition of value-at-risk beyond the percentile measures. Examples are used to show how the percentile measure fails to give consistent results, and how it can be manipulated. A new class of consistent risk measures is investigated.
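A distorted-probability risk measure replaces the survival function S(x) by g(S(x)) for a concave distortion g before integrating, which loads tail-heavy risks in a way a bare percentile cannot. A minimal sketch for a discrete loss, using the proportional-hazards distortion g(s) = s^r as the example; the loss distribution and r below are illustrative:

```python
def distorted_premium(loss_pmf, g):
    """Distortion risk measure rho(X) = integral_0^inf g(S(x)) dx for a
    nonnegative discrete loss with pmf `loss_pmf` (value -> probability).
    g must be increasing with g(0) = 0 and g(1) = 1; g(s) = s gives E[X]."""
    prem, prev, surv = 0.0, 0.0, 1.0
    for v in sorted(loss_pmf):
        prem += (v - prev) * g(surv)  # S is constant on [prev, v)
        surv -= loss_pmf[v]
        prev = v
    return prem

pmf = {0: 0.9, 10: 0.09, 100: 0.01}                 # illustrative loss
expected = distorted_premium(pmf, lambda s: s)      # plain mean, 1.9
loaded = distorted_premium(pmf, lambda s: s ** 0.5) # PH-distorted, ~12.16
```

The concave distortion inflates the small tail probabilities, so the loaded premium greatly exceeds the expectation even though a 95th-percentile measure would barely register the 100-unit loss.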
129

Supply chain network design under uncertainty and risk

Hollmann, Dominik January 2011
We consider the research problem of quantitative support for decision making in supply chain network design (SCND). We first identify the requirements for a comprehensive SCND as (i) a methodology to select uncertainties, (ii) a stochastic optimisation model, and (iii) an appropriate solution algorithm. We propose a process to select a manageable number of uncertainties to be included in a stochastic program for SCND. We develop a comprehensive two-stage stochastic program for SCND that includes uncertainty in demand, currency exchange rates, labour costs, productivity, supplier costs, and transport costs. Also, we consider conditional value at risk (CV@R) to explore the trade-off between risk and return. We use a scenario generator based on moment matching to represent the multivariate uncertainty. The resulting stochastic integer program is computationally challenging and we propose a novel iterative solution algorithm called adaptive scenario refinement (ASR) to process the problem. We describe the rationale underlying ASR, validate it for a set of benchmark problems, and discuss the benefits of the algorithm applied to our SCND problem. Finally, we demonstrate the benefits of the proposed model in a case study and show that multiple sources of uncertainty and risk are important to consider in the SCND. Whereas in the literature most research is on demand uncertainty, our study suggests that exchange rate uncertainty is more important for the choice of optimal supply chain strategies in international production networks. The SCND model and the use of the coherent downside risk measure in the stochastic program are innovative and novel; these and the ASR solution algorithm taken together make contributions to knowledge.
130

Topics in financial market risk modelling

Ma, Zishun January 2012
The growth of the financial risk management industry has been motivated by the increased volatility of financial markets combined with the rapid innovation of derivatives. Since the 1970s, several financial crises have occurred globally with devastating consequences for financial and non-financial institutions and for the real economy. The most recent US subprime crisis led to enormous losses for financial and non-financial institutions and to a recession in many countries including the US and UK. A common lesson from these crises is that advanced financial risk management systems are required. Financial risk management is a continuous process of identifying, modeling, forecasting and monitoring risk exposures arising from financial investments. The Value at Risk (VaR) methodology has served as one of the most important tools used in this process. This quantitative tool, first introduced by JPMorgan in its RiskMetrics system in 1995, has undergone a considerable revolution and development during the last 15 years. It has now become one of the most prominent tools employed by financial institutions, regulators, asset managers and non-financial corporations for risk measurement. My PhD research undertakes a comprehensive and practical study of market risk modeling in modern finance using the VaR methodology. Two newly developed risk models are proposed in this research, derived by integrating volatility modeling with the quantile regression technique. Compared to existing risk models, these two new models place more emphasis on dynamic risk adjustment. The empirical results on both real and simulated data show that, under certain circumstances, the risk predictions generated by these models are more accurate and efficient in capturing time-varying risk evolution than traditional risk measures. Academically, the aim of this research is to improve and extend existing market risk modeling techniques. 
In practice, the purpose of this research is to support risk managers in developing a dynamic market risk measurement system that functions well across different market states and asset categories. The system can be used by financial and non-financial institutions for either passive risk measurement or active risk control.
