411

[en] HEDGING RENEWABLE ENERGY SALES IN THE BRAZILIAN CONTRACT MARKET VIA ROBUST OPTIMIZATION / [pt] MODELO DE CONTRATAÇÃO PARA FONTES RENOVÁVEIS COM ROBUSTEZ AO PREÇO DE CURTO-PRAZO

BRUNO FANZERES DOS SANTOS 26 March 2018
[pt] O preço da energia no mercado de curto-prazo é caracterizado pela sua alta volatilidade e dificuldade de previsão, representando um alto risco para agentes produtores de energia, especialmente para geradores por fontes renováveis. A abordagem típica empregada por tais empresas para obter a estratégia de contratação ótima de médio e longo prazos é simular um conjunto de caminhos para os fatores de incerteza a fim de caracterizar a distribuição de probabilidade da receita futura e, então, otimizar o portfólio da empresa, maximizando o seu equivalente certo. Contudo, na prática, a modelagem e simulação do preço de curto prazo da energia é um grande desafio para os agentes do setor elétrico devido a sua alta dependência a parâmetros que são difíceis de prever no médio e longo prazo, como o crescimento do PIB, variação da demanda, entrada de novos agentes no mercado, alterações regulatórias, entre outros. Neste sentido, nesta dissertação, utilizamos otimização robusta para tratar a incerteza presente na distribuição do preço de curto-prazo da energia, enquanto a produção de energia renovável é tratada com cenários simulados exógenos, como é comum em programação estocástica. Mostramos, também, que esta abordagem pode ser interpretada a partir de dois pontos de vista: teste de estresse e aversão à ambiguidade. Com relação ao último, apresentamos um link entre otimização robusta e teoria de ambiguidade. Além disso, incluímos no modelo de formação de portfólio ótimo a possibilidade de considerar um contrato de opção térmica de compra para o hedge do portfólio do agente contra a irregularidade do preço de curto-prazo. Por fim, é apresentado um estudo de caso com dados realistas do sistema elétrico brasileiro para ilustrar a aplicabilidade da metodologia proposta. / [en] The energy spot price is characterized by high volatility and is difficult to predict, representing a major risk for energy companies, especially those that rely on renewable generation.
The typical approach employed by such companies to address their mid- and long-term optimal contracting strategy is to simulate a large set of paths for the uncertainty factors to characterize the probability distribution of the future income and then optimize the company's portfolio to maximize its certainty equivalent. In practice, however, spot-price modeling and simulation is a major challenge for agents due to its high dependence on parameters that are difficult to predict, e.g., GDP growth, demand variation, entrance of new market players, and regulatory changes, to name a few. In this dissertation, we therefore make use of robust optimization to treat the uncertainty in the spot-price distribution, while renewable production remains accounted for by exogenously simulated scenarios, as is customary in stochastic programming. We show that this approach can be interpreted from two different points of view: stress testing and aversion to ambiguity. Regarding the latter, we provide a link between robust optimization and ambiguity theory, which was an open gap in decision theory. Moreover, we include in the optimal portfolio model the possibility of considering an energy call option contract to hedge the agent's portfolio against price spikes. A case study with realistic data from the Brazilian system is shown to illustrate the applicability of the proposed methodology.
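The robust contracting trade-off described in the abstract can be illustrated with a toy model (not the dissertation's actual formulation): a renewable seller earns the contract price on the contracted quantity and settles the imbalance between generation and contract at the spot price, which is only known to lie in an interval. Because revenue is linear in the spot price, the worst case sits at an interval endpoint. All names and numbers below are hypothetical.

```python
def worst_case_revenue(q, price_contract, gen_scenarios, spot_lo, spot_hi):
    """Worst expected revenue over the spot-price interval; revenue is linear in
    the spot price, so the worst case is attained at one of the endpoints."""
    worst = float("inf")
    for spot in (spot_lo, spot_hi):
        avg = sum(price_contract * q + spot * (g - q) for g in gen_scenarios) / len(gen_scenarios)
        worst = min(worst, avg)
    return worst

def robust_contract(price_contract, gen_scenarios, spot_lo, spot_hi, step=0.05):
    """Grid search for the contracted quantity maximizing worst-case revenue."""
    best_q, best_val = 0.0, float("-inf")
    q = 0.0
    while q <= max(gen_scenarios):
        val = worst_case_revenue(q, price_contract, gen_scenarios, spot_lo, spot_hi)
        if val > best_val:
            best_q, best_val = q, val
        q += step
    return best_q, best_val
```

In this toy setting, with average generation 1.0 per unit, a contract price of 100 and spot prices anywhere in [20, 300], the worst-case-optimal choice is to contract the expected generation, balancing exposure to high and low spot prices.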
412

Allocation dynamique de portefeuille avec profil de gain asymétrique : risk management, incitations financières et benchmarking / Dynamic asset allocation with asymmetric payoffs : risk management, financial incentives, and benchmarking

Tergny, Guillaume 31 May 2011
Les gérants de portefeuille pour compte de tiers sont souvent jugés par leur performance relative à celle d'un portefeuille benchmark. A ce titre, ils sont amenés très fréquemment à utiliser des modèles internes de "risk management" pour contrôler le risque de sous-performer le benchmark. Par ailleurs, ils sont de plus en plus nombreux à adopter une politique de rémunération incitative, en percevant une commission de sur-performance par rapport au benchmark. En effet, cette composante variable de leur rémunération leur permet d'augmenter leur revenu en cas de sur-performance sans contrepartie en cas de sous-performance. Or de telles pratiques ont fait récemment l'objet de nombreuses polémiques : la période récente de crise financière mondiale a fait apparaître certaines carences de plusieurs acteurs financiers en terme de contrôle de risque ainsi que des niveaux de prise de risque et de rémunération jugés excessifs. Cependant, l'étude des implications de ces pratiques reste un thème encore relativement peu exploré dans le cadre de la théorie classique des choix dynamiques de portefeuille en temps continu. Cette thèse analyse, dans ce cadre théorique, les implications de ces pratiques de "benchmarking" sur le comportement d'investissement de l'asset manager. La première partie étudie les propriétés de la stratégie dynamique optimale pour l'asset manager concerné par l'écart entre la rentabilité de son portefeuille et celle d'un benchmark fixe ou stochastique (sur ou sous-performance). Nous considérons plusieurs types d'asset managers, caractérisés par différentes fonctions d'utilité et qui sont soumis à différentes contraintes de risque de sous-performance. Nous montrons en particulier quel est le lien entre les problèmes d'investissement avec prise en compte de l'aversion à la sous-performance et avec contrainte explicite de "risk management". 
Dans la seconde partie, on s'intéresse à l'asset manager bénéficiant d'une rémunération incitative (frais de gestion variables, bonus de sur-performance ou commission sur encours additionnelle). On étudie, selon la forme de ses incitations financières et son degré d'aversion à la sous-performance, comment sa stratégie d'investissement s'écarte de celle de l'investisseur (ou celle de l'asset manager sans rémunération incitative). Nous montrons que le changement de comportement de l'asset manager peut se traduire soit par une réduction du risque pris par rapport à la stratégie sans incitation financière soit au contraire par une augmentation de celui-ci. Finalement, nous montrons en quoi la présence de contraintes de risque de sous-performance, imposées au gérant ou traduisant son aversion à la sous-performance, peut être bénéfique à l'investisseur donnant mandat de gestion financière. / It is common practice to judge third-party asset managers by looking at their financial performance relative to a benchmark portfolio. For this reason, they often choose to rely on internal risk-management models to control the downside risk of their portfolio relative to the benchmark. Moreover, an increasing number are adopting an incentive-based scheme, charging an over-performance commission relative to the benchmark. Indeed, including this variable component in their global remuneration allows them to increase their revenue in case of over-performance without any penalty in the event of underperforming the benchmark. However, such practices have recently been at the center of considerable controversy: the recent global financial crisis has uncovered some shortcomings of several financial players in terms of internal risk control, as well as risk-taking and compensation levels judged excessive. Nevertheless, analyzing the impact of these practices remains a relatively new issue in continuous-time dynamic asset allocation theory.
This thesis analyses, in this theoretical framework, the implications of these "benchmarking" practices on the asset manager's investment behavior. The first part examines the properties of the optimal dynamic strategy for an asset manager who is concerned with the difference in return between their portfolio and a fixed or stochastic benchmark (over- or under-performance). Several asset manager types are considered, defined by different utility functions and different downside-risk constraints. In particular, the link between investment problems with aversion to under-performance and those with explicit risk-management constraints is shown. In the second part, the case of the asset manager who benefits from an incentive compensation scheme (variable asset management fees, over-performance bonuses or an additional commission on assets under management) is investigated. We study how, depending on the choice of financial incentive structure and loss-aversion level, the asset manager's strategy differs from that of the investor (or the strategy of the asset manager receiving no incentive remuneration). This study shows that the change in the asset manager's investment behavior can lead either to a reduction in the risk taken relative to the strategy without financial incentives or, conversely, to an increase in that risk. Finally, we show that the existence of downside-risk constraints, imposed on the asset manager or corresponding to their aversion to under-performance, can be beneficial to the investor mandating the financial management.
413

[en] STOCHASTIC ANALYSIS OF ECONOMIC VIABILITY OF PHOTOVOLTAIC PANELS INSTALLATION IN LARGE CONSUMERS / [pt] ANÁLISE ESTOCÁSTICA DA VIABILIDADE ECONÔMICA DA INSTALAÇÃO DE PAINÉIS FOTOVOLTAICOS EM GRANDES CONSUMIDORES

ANDRES MAURICIO CESPEDES GARAVITO 25 May 2018
[pt] A geração distribuída (GD) vem crescendo nos últimos anos no Brasil, particularmente a geração fotovoltaica, permitindo a pequenos e grandes consumidores ter um papel ativo no sistema elétrico, podendo investir em um sistema próprio de geração. Para os consumidores cativos, além da redução do custo de energia, o consumidor também pode ter uma redução no custo de demanda, que é calculado a partir de um contrato com a distribuidora que o atende. Assim, considerando a possibilidade de instalação de painéis fotovoltaicos, o desafio dos consumidores é estimar com maior acurácia possível sua energia, a energia gerada pelos painéis e as demandas máximas futuras de forma a determinar a quantidade ótima de painéis, bem como o contrato de demanda com a distribuidora. Nesta dissertação, propõe-se resolver este problema a partir da simulação de cenários futuros de consumo de energia, demanda máxima e correlacionando-os com cenários futuros de geração de energia. Em seguida, a partir de um modelo de otimização linear inteiro misto, calcula-se a quantidade ótima de painéis fotovoltaicos e a demanda a ser contratada. Na primeira parte da dissertação, a modelagem Box e Jenkins é utilizada para estimar os parâmetros do modelo estatístico de energia consumida e demanda combinados com a geração de energia dos painéis. Na segunda parte, é utilizado um modelo de otimização estocástica que utiliza uma combinação convexa de Valor Esperado (VE) e Conditional Value-at-Risk (CVaR) como métricas de risco para avaliar o número ótimo de painéis e a melhor contratação de demanda. Para ilustrar a abordagem proposta, é apresentado um estudo de caso real para um grande consumidor considerado na modalidade Verde A4 no Ambiente de Contratação Regulado. Os resultados obtidos mostraram que a utilização de painéis fotovoltaicos em um grande consumidor reduz o custo anual de energia em até 20 por cento, comparado com o valor real faturado.
/ [en] Distributed generation (DG) has been growing in recent years in Brazil, particularly photovoltaic generation, allowing small and large consumers to play an active role in the electric system by investing in their own generation systems. For regulated consumers, besides the reduction in energy cost, there may also be a reduction in demand cost, which is computed from the peak-demand contract with the supplying utility company. Therefore, when considering the installation of photovoltaic panels, the consumers' challenge is to estimate, as accurately as possible, their energy consumption, the energy generated by the panels, and future peak demands, in order to determine the optimal number of panels as well as the peak-demand contract with the utility. One way to solve this problem is to simulate future scenarios of energy consumption and peak demand and correlate them with future scenarios of energy generation. Then, from a mixed-integer linear stochastic optimization model, the optimal number of panels and the peak demand to be contracted are computed. In the first part, Box-Jenkins modeling is used to estimate the parameters of the statistical models for energy consumption and peak demand, combined with the energy generation of the panels. In the second part, a stochastic optimization model is applied using a convex combination of the Expected Value (EV) and Conditional Value-at-Risk (CVaR) as risk metrics to determine the optimal number of panels and the best peak-demand contract. To illustrate the proposed approach, a real case study of a large consumer is presented, considering the Green Tariff group A4 in the Regulated Environment. The results show that using photovoltaic panels can reduce the annual energy cost by up to 20 percent compared with the actual billed value.
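The convex combination of Expected Value and CVaR used as the risk metric can be sketched directly on simulated cost scenarios. This is a simplified, equal-probability empirical estimator for illustration, not the dissertation's full mixed-integer model:

```python
def cvar(costs, alpha=0.95):
    """Empirical Conditional Value-at-Risk: mean of the worst (1 - alpha) share
    of equally likely cost scenarios."""
    srt = sorted(costs, reverse=True)
    k = max(1, int(round(len(srt) * (1 - alpha))))
    return sum(srt[:k]) / k

def risk_objective(costs, lam=0.5, alpha=0.95):
    """Convex combination lam * E[cost] + (1 - lam) * CVaR_alpha(cost); lam = 1
    is risk-neutral, lam = 0 is maximally risk-averse at level alpha."""
    ev = sum(costs) / len(costs)
    return lam * ev + (1 - lam) * cvar(costs, alpha)
```

In the thesis's setting, an objective of this shape would be minimized over the contract decision variables; here it is shown only as an evaluator of a fixed scenario set.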
414

Simulações Financeiras em GPU / Finance and Stochastic Simulation on GPU

Thársis Tuani Pinto Souza 26 April 2013
É muito comum modelar problemas em finanças com processos estocásticos, dada a incerteza de suas variáveis de análise. Além disso, problemas reais nesse domínio são, em geral, de grande custo computacional, o que sugere a utilização de plataformas de alto desempenho (HPC) em sua implementação. As novas gerações de arquitetura de hardware gráfico (GPU) possibilitam a programação de propósito geral enquanto mantêm alta banda de memória e grande poder computacional. Assim, esse tipo de arquitetura vem se mostrando como uma excelente alternativa em HPC. Com isso, a proposta principal desse trabalho é estudar o ferramental matemático e computacional necessário para modelagem estocástica em finanças com a utilização de GPUs como plataforma de aceleração. Para isso, apresentamos a GPU como uma plataforma de computação de propósito geral. Em seguida, analisamos uma variedade de geradores de números aleatórios, tanto em arquitetura sequencial quanto paralela. Além disso, apresentamos os conceitos fundamentais de Cálculo Estocástico e de método de Monte Carlo para simulação estocástica em finanças. Ao final, apresentamos dois estudos de casos de problemas em finanças: "Stops Ótimos" e "Cálculo de Risco de Mercado". No primeiro caso, resolvemos o problema de otimização de obtenção do ganho ótimo em uma estratégia de negociação de ações de "Stop Gain". A solução proposta é escalável e de paralelização inerente em GPU. Para o segundo caso, propomos um algoritmo paralelo para cálculo de risco de mercado, bem como técnicas para melhorar a solução obtida. Nos nossos experimentos, houve uma melhora de 4 vezes na qualidade da simulação estocástica e uma aceleração de mais de 50 vezes. / Given the uncertainty of their variables, it is common to model financial problems with stochastic processes. Furthermore, real problems in this area have a high computational cost. This suggests the use of High Performance Computing (HPC) to handle them.
New generations of graphics hardware (GPU) enable general-purpose computing while maintaining high memory bandwidth and large computing power. Therefore, this type of architecture is an excellent alternative for HPC and computational finance. The main purpose of this work is to study the computational and mathematical tools needed for stochastic modeling in finance using GPUs. We present GPUs as a platform for general-purpose computing. We then analyze a variety of random number generators, both in sequential and parallel architectures, and introduce the fundamental mathematical tools of stochastic calculus and Monte Carlo simulation. With this background, we present two case studies in finance: "Optimal Trading Stops" and "Market Risk Management". In the first case, we solve the problem of obtaining the optimal gain on a "stop gain" stock trading strategy. The proposed solution is scalable and has inherent parallelism on GPU. For the second case, we propose a parallel algorithm to compute market risk, as well as techniques for improving the quality of the solutions. In our experiments, there was a 4-fold improvement in the quality of the stochastic simulation and an acceleration of over 50 times.
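The market-risk case study rests on Monte Carlo simulation of many independent price paths, which is exactly the structure that maps well onto GPUs. A CPU-only sketch in plain Python (standing in for the CUDA kernels the thesis targets, with made-up market parameters) of a 10-day 99% VaR estimate:

```python
import math
import random

def simulate_losses(s0=100.0, mu=0.05, sigma=0.2, horizon=10 / 252, n=50_000, seed=42):
    """One-step geometric Brownian motion losses; each path is independent,
    which is what makes the method embarrassingly parallel on a GPU."""
    rng = random.Random(seed)
    drift = (mu - 0.5 * sigma ** 2) * horizon
    vol = sigma * math.sqrt(horizon)
    return [s0 - s0 * math.exp(drift + vol * rng.gauss(0.0, 1.0)) for _ in range(n)]

def empirical_var(losses, alpha=0.99):
    """Empirical alpha-quantile of the simulated loss distribution."""
    srt = sorted(losses)
    return srt[int(alpha * len(srt)) - 1]
```

On a GPU, the path loop would be replaced by one thread per path and the quantile by a parallel sort or selection; the statistical logic is unchanged.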
415

Direct optimization of dose-volume histogram metrics in intensity modulated radiation therapy treatment planning / Direkt optimering av dos-volym histogram-mått i intensitetsmodulerad strålterapiplanering

Zhang, Tianfang January 2018
In optimization of intensity-modulated radiation therapy treatment plans, dose-volume histogram (DVH) functions are often used as objective functions to minimize the violation of dose-volume criteria. Neither DVH functions nor dose-volume criteria, however, are ideal for gradient-based optimization, as the former are not continuously differentiable and the latter are discontinuous functions of dose, apart from both being nonconvex. In particular, DVH functions often work poorly when used in constraints due to their being identically zero when feasible and having vanishing gradients on the boundary of feasibility. In this work, we present a general mathematical framework allowing for direct optimization on all DVH-based metrics. By regarding voxel doses as sample realizations of an auxiliary random variable and using kernel density estimation to obtain explicit formulas, one arrives at formulations of volume-at-dose and dose-at-volume which are infinitely differentiable functions of dose. This is extended to DVH functions and so-called volume-based DVH functions, as well as to min/max-dose functions and mean-tail-dose functions. Explicit expressions for evaluation of function values and corresponding gradients are presented. The proposed framework has the advantages of depending on only one smoothness parameter, of approximation errors to conventional counterparts being negligible for practical purposes, and of a general consistency between derived functions. Numerical tests, which were performed for illustrative purposes, show that smooth dose-at-volume works better than quadratic penalties when used in constraints and that smooth DVH functions in certain cases have a significant advantage over their conventional counterparts. The results of this work have been successfully applied to lexicographic optimization in a fluence map optimization setting.
/ Vid optimering av behandlingsplaner i intensitetsmodulerad strålterapi används dos-volym-histogram-funktioner (DVH-funktioner) ofta som målfunktioner för att minimera avståndet till dos-volymkriterier. Varken DVH-funktioner eller dos-volymkriterier är emellertid idealiska för gradientbaserad optimering då de förstnämnda inte är kontinuerligt deriverbara och de sistnämnda är diskontinuerliga funktioner av dos, samtidigt som båda också är ickekonvexa. Speciellt fungerar DVH-funktioner ofta dåligt i bivillkor då de är identiskt noll i tillåtna områden och har försvinnande gradienter på randen till tillåtenhet. I detta arbete presenteras ett generellt matematiskt ramverk som möjliggör direkt optimering på samtliga DVH-baserade mått. Genom att betrakta voxeldoser som stickprovsutfall från en stokastisk hjälpvariabel och använda ickeparametrisk densitetsskattning för att få explicita formler, kan måtten volume-at-dose och dose-at-volume formuleras som oändligt deriverbara funktioner av dos. Detta utökas till DVH-funktioner och så kallade volymbaserade DVH-funktioner, såväl som till mindos- och maxdosfunktioner och medelsvansdos-funktioner. Explicita uttryck för evaluering av funktionsvärden och tillhörande gradienter presenteras. Det föreslagna ramverket har fördelarna av att bero på endast en mjukhetsparameter, av att approximationsfelen till konventionella motsvarigheter är försumbara i praktiska sammanhang, och av en allmän konsistens mellan härledda funktioner. Numeriska tester genomförda i illustrativt syfte visar att slät dose-at-volume fungerar bättre än kvadratiska straff i bivillkor och att släta DVH-funktioner i vissa fall har betydlig fördel över konventionella sådana. Resultaten av detta arbete har med framgång applicerats på lexikografisk optimering inom fluensoptimering.
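The kernel-smoothing idea can be sketched as follows: replace the sharp indicator "voxel dose ≥ d" by a Gaussian CDF, so that volume-at-dose becomes an infinitely differentiable function of dose. This is an illustrative approximation with an arbitrary smoothness parameter h, not the thesis's exact formulation:

```python
import math

def smooth_volume_at_dose(voxel_doses, d, h=0.5):
    """Smooth estimate of the volume fraction receiving at least dose d: the
    indicator 1{v >= d} is replaced by the Gaussian CDF of (v - d) / h, making
    the result infinitely differentiable in d (h is the smoothness parameter)."""
    gauss_cdf = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return sum(gauss_cdf((v - d) / h) for v in voxel_doses) / len(voxel_doses)
```

As h approaches 0 this recovers the ordinary (nonsmooth) DVH value; dose-at-volume can then be obtained by numerically inverting this function in d.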
416

[pt] ANÁLISE ESTOCÁSTICA DA CONTRATAÇÃO DE ENERGIA ELÉTRICA DE GRANDES CONSUMIDORES NO AMBIENTE DE CONTRATAÇÃO LIVRE CONSIDERANDO CENÁRIOS CORRELACIONADOS DE PREÇOS DE CURTO PRAZO, ENERGIA E DEMANDA / [en] STOCHASTIC ANALYSIS OF ENERGY CONTRACTING IN THE FREE CONTRACT ENVIRONMENT FOR BIG CONSUMERS CONSIDERING CORRELATED SCENARIOS OF SPOT PRICES, ENERGY AND POWER DEMAND

DANIEL NIEMEYER TEIXEIRA PAULA 27 October 2020
[pt] No Brasil, grandes consumidores podem estabelecer seus contratos de energia elétrica em dois ambientes: Ambiente de Contratação Regulado e Ambiente de Contratação Livre. Grandes consumidores são aqueles que possuem carga igual ou superior a 2 MW e podem ser atendidos sob contratos firmados em qualquer um desses ambientes. Já os consumidores com demanda contratada inferior a 2 MW e superior a 500 kW podem ter seu contrato de energia estabelecido no Ambiente de Contratação Livre proveniente de geração de energia renovável ou no Ambiente de Contratação Regulada através das distribuidoras de energia. A principal vantagem do Ambiente de Contratação Livre é a possibilidade de negociar contratos com diferentes parâmetros, como, por exemplo, preço, quantidade de energia e prazo. Eventuais diferenças entre a energia contratada e a consumida são liquidadas ao preço de energia de curto prazo, que pode ser bastante volátil. Neste caso, o desafio é estabelecer uma estratégia de contratação que minimize os riscos associados a este ambiente. Esta dissertação propõe uma metodologia que envolve a simulação estatística de cenários correlacionados de energia, demanda máxima e preço de curto prazo (também chamado de PLD – Preço de Liquidação das Diferenças) para serem inseridos em um modelo matemático de otimização estocástica, que define os parâmetros ótimos da contratação de energia e demanda. Na parte estatística, um modelo Box e Jenkins é usado para estimar os parâmetros das séries históricas de energia e demanda máxima com o objetivo de simular cenários correlacionados com o PLD. Na parte de otimização, emprega-se uma combinação convexa entre Valor Esperado (VE) e Conditional Value-at-Risk (CVaR) como medidas de risco para encontrar os valores ótimos dos parâmetros contratuais, como a demanda máxima contratada, o volume mensal de energia a ser contratado, além das flexibilidades inferior e superior da energia contratada.
Para ilustrar a abordagem proposta, essa metodologia é aplicada a um estudo de caso real para um grande consumidor no Ambiente de Contratação Livre. Os resultados indicaram que a metodologia proposta pode ser uma ferramenta eficiente para consumidores no Ambiente de Contratação Livre e, dada a natureza do modelo, pode ser generalizada para diferentes contratos e mercados de energia. / [en] In Brazil, big consumers can choose between two energy contracting environments: the Regulated Contract Environment and the Free Contract Environment. Big consumers are characterized by an installed load capacity equal to or greater than 2 MW and can sign an energy contract under either of these environments. Consumers with an installed load lower than 2 MW and higher than 500 kW can have their energy contracts signed in the Free Contract Environment using renewable energy generation, or in the Regulated Contract Environment through local distribution companies. The main advantage of the Free Contract Environment is the possibility of negotiating contracts with different parameters such as price, energy quantity, and term. Any differences between contracted and consumed energy are settled at the spot price, which can be rather volatile. In this case, the challenge is to establish a contracting strategy that minimizes the risks associated with this environment. This thesis proposes a methodology that involves the statistical simulation of correlated energy, peak-demand, and spot-price scenarios to be used in a stochastic optimization model that defines the optimal energy and demand contract parameters. In the statistical part, a Box-Jenkins model is used to estimate parameters for energy and peak demand in order to simulate scenarios correlated with the spot price.
In the optimization part, a convex combination of the Expected Value (EV) and Conditional Value-at-Risk (CVaR) is used as the risk measure to find the optimal contract parameters, such as the contracted peak demand and the monthly energy volumes to be contracted, in addition to the upper and lower bounds on the contracted energy. To illustrate this approach, the methodology is applied in a real case study of a big consumer with an active Free Contract Environment contract. The results indicate that the proposed methodology can be an efficient tool for consumers in the Free Contract Environment and, due to the nature of the model, can be generalized to different energy contracts and markets.
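The peak-demand contracting decision can be sketched with a toy expected-cost model (a stylized penalty rule and made-up numbers, not the thesis's actual tariff structure): pay the tariff on the contracted level in every scenario, plus a penalty multiple on any exceedance, and search for the contract level with the lowest expected cost over peak-demand scenarios:

```python
def expected_cost(d_contract, peak_scenarios, tariff=30.0, penalty_mult=2.0):
    """Expected annual demand cost under a stylized rule: the contracted level
    is always billed, and any exceedance is billed at a penalty multiple."""
    total = 0.0
    for peak in peak_scenarios:
        exceed = max(0.0, peak - d_contract)
        total += tariff * d_contract + penalty_mult * tariff * exceed
    return total / len(peak_scenarios)

def best_contract(peak_scenarios, lo, hi, step=1.0, **kwargs):
    """Grid search for the contracted demand minimizing expected cost."""
    candidates = []
    d = lo
    while d <= hi:
        candidates.append((expected_cost(d, peak_scenarios, **kwargs), d))
        d += step
    return min(candidates)[1]
```

With a 2x penalty multiple and symmetric peak scenarios, the optimum contracts a level between the median and the maximum peak, trading fixed cost against penalty exposure; the thesis's model adds CVaR weighting and contract flexibilities on top of this basic trade-off.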
417

Simulation-Based Portfolio Optimization with Coherent Distortion Risk Measures / Simuleringsbaserad portföljoptimering med koherenta distortionsriskmått

Prastorfer, Andreas January 2020
This master's thesis studies portfolio optimization using linear programming algorithms. Its contribution is an extension of the convex framework for portfolio optimization with Conditional Value-at-Risk introduced by Rockafellar and Uryasev. The extended framework considers risk measures belonging to the intersection of the classes of coherent risk measures and distortion risk measures, known as coherent distortion risk measures. The risk measures considered from this class are the Conditional Value-at-Risk, the Wang Transform, the Block Maxima and the Dual Block Maxima measures. The extended portfolio optimization framework is applied to a reference portfolio consisting of stocks, options and a bond index, all from the Swedish market. The returns of the assets in the reference portfolio are modelled with elliptical distributions and with normal copulas with asymmetric marginal return distributions. The portfolio optimization framework is a simulation-based framework that measures risk using scenarios simulated from the assumed portfolio distribution model. To model the return data with asymmetric distributions, the tails of the marginal distributions are fitted with generalized Pareto distributions, and the dependence structure between the assets is captured using a normal copula. The results obtained from the optimizations are compared across the different distributional return assumptions of the portfolio and the four risk measures. A Markowitz solution to the problem is computed using the mean average deviation as the risk measure; this solution is the benchmark to which the optimal solutions using the coherent distortion risk measures are compared. The coherent distortion risk measures have the tractable property of being able to assign user-defined weights to different parts of the loss distribution and hence value increasing loss severities as greater risks.
The user-defined loss-weighting property and the asymmetric return distribution models are used to find optimal portfolios that account for extreme losses. An important finding of this project is that optimal solutions for asset returns simulated from asymmetric distributions are associated with greater risks, which is a consequence of more accurate modelling of distribution tails. Furthermore, weighting larger losses with increasingly larger weights shows that the measured portfolio risk is greater and that a safer position is taken. / Denna masteruppsats behandlar portföljoptimering med linjära programmeringsalgoritmer. Bidraget av uppsatsen är en utvidgning av det konvexa ramverket för portföljoptimering med Conditional Value-at-Risk, som introducerades av Rockafellar och Uryasev. Det utvidgade ramverket behandlar riskmått som tillhör snittet av den koherenta riskmåttklassen och distortionsriskmåttklassen. Denna klass benämns som koherenta distortionsriskmått. De riskmått som tillhör denna klass och behandlas i uppsatsen är Conditional Value-at-Risk, Wang Transformen, Block Maxima och Dual Block Maxima måtten. Det utvidgade portföljoptimeringsramverket appliceras på en referensportfölj bestående av aktier, optioner och ett obligationsindex från den svenska aktiemarknaden. Tillgångarnas avkastningar, i referensportföljen, modelleras med både elliptiska fördelningar och normal-copula med asymmetriska marginalfördelningar. Portföljoptimeringsramverket är ett simuleringsbaserat ramverk som mäter risk baserat på scenarion simulerade från fördelningsmodellen som antagits för portföljen. För att modellera tillgångarnas avkastningar med asymmetriska fördelningar modelleras marginalfördelningarnas svansar med generaliserade Paretofördelningar och en normal-copula modellerar det ömsesidiga beroendet mellan tillgångarna. Resultatet av portföljoptimeringarna jämförs sinsemellan för de olika portföljernas avkastningsantaganden och de fyra riskmåtten.
Problemet löses även med Markowitz-optimering där "mean average deviation" används som riskmått. Denna lösning är den benchmarklösning som de optimala lösningarna, vilka beräknas i optimeringen med de koherenta distortionsriskmåtten, jämförs mot. De koherenta distortionsriskmåtten har den speciella egenskapen att användarspecificerade vikter kan anges för olika delar av förlustfördelningen, vilket gör det möjligt att värdera mer extrema förluster som större risker. Den användardefinierade viktningsegenskapen hos riskmåtten studeras i kombination med den asymmetriska fördelningsmodellen för att utforska portföljer som tar extrema förluster i beaktande. En viktig upptäckt är att optimala lösningar för avkastningar som modellerats med asymmetriska fördelningar är associerade med ökad risk, vilket är en konsekvens av mer exakt modellering av tillgångarnas fördelningssvansar. En annan upptäckt är att om större vikter läggs på högre förluster så ökar portföljrisken och en säkrare portföljstrategi antas.
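A distortion risk measure can be sketched on an empirical loss sample: sort losses in decreasing order and weight them by increments of a distortion function g, so that g encodes the user-defined emphasis on large losses. The identity g(u) = u recovers the mean, and the Wang transform shifts the normal quantile by a parameter lambda. A minimal sketch with equal-probability scenarios, for illustration only:

```python
from statistics import NormalDist

def distortion_risk(losses, g):
    """Distortion risk measure on equal-probability scenarios: losses sorted in
    decreasing order, loss i weighted by g((i + 1) / n) - g(i / n)."""
    srt = sorted(losses, reverse=True)
    n = len(srt)
    return sum((g((i + 1) / n) - g(i / n)) * srt[i] for i in range(n))

def wang_g(lam):
    """Wang transform distortion g(u) = Phi(Phi^(-1)(u) + lam); lam > 0 shifts
    weight toward the largest losses."""
    nd = NormalDist()
    def g(u):
        if u <= 0.0:
            return 0.0
        if u >= 1.0:
            return 1.0
        return nd.cdf(nd.inv_cdf(u) + lam)
    return g
```

With a concave g the largest losses receive more than their natural 1/n weight, which is exactly the "value increasing loss severities as greater risks" property described in the abstract.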
418

Evaluating Markov Chain Monte Carlo Methods for Estimating Systemic Risk Measures Using Vine Copulas / Utvärdering av Markov Chain Monte Carlo-metoder vid estimering av systemisk risk under portföljmodellering baserad på Vine Copulas

Guterstam, Rasmus, Trojenborg, Vidar January 2021
This thesis attempts to evaluate the Markov Chain Monte Carlo (MCMC) methods Metropolis-Hastings (MH) and the No-U-Turn Sampler (NUTS) for estimating systemic risk measures. The subject of analysis is an equity portfolio provided by a Nordic asset management firm, which is modelled using a vine copula. The evaluation considers three different crisis outcomes at the portfolio level, and the results are compared with a Monte Carlo (MC) benchmark. The MCMC samplers attempt to increase sampling efficiency by sampling from these crisis events directly, which is impossible for an MC sampler. The resulting systemic risk measures are evaluated at both the portfolio and marginal levels. The results are divided. On the one hand, the MCMC samplers proved to be efficient in terms of accepted samples, with NUTS outperforming MH. On the other hand, due to the practical implementation of the MCMC samplers and the vine copula model, the computational time required outweighed the gains in sampler efficiency, causing the MC sampler to outperform both MCMC samplers in certain settings. For NUTS, there seems to be great potential in the context of estimating systemic risk measures, as it explores high-dimensional and multimodal joint distributions efficiently with low autocorrelation. It is concluded that asset management companies can benefit both from using vine copulas to model portfolio risk and from using MC or MCMC methods to evaluate systemic risk. However, for the MCMC samplers to be of practical relevance, it is recommended to further investigate efficient implementations of vine copulas in the context of MCMC sampling.
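The conditioning idea described in this abstract — sampling directly from a crisis event that a plain MC sampler would almost never hit — can be sketched with a minimal random-walk Metropolis-Hastings sampler. This is an illustrative toy only (a standard normal conditioned on the "crisis" region X > 2), not the thesis's vine-copula model; the function and parameter names are hypothetical.

```python
import math
import random


def metropolis_hastings(log_density, x0, n_samples, step=0.5, seed=0):
    """Random-walk MH: propose x' = x + N(0, step^2), accept with
    probability min(1, p(x') / p(x)). Returns the chain and the
    acceptance rate."""
    rng = random.Random(seed)
    x = x0
    samples, accepted = [], 0
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        log_alpha = log_density(proposal) - log_density(x)
        # exp(min(0, log_alpha)) handles log_alpha = -inf (out-of-region
        # proposals) without ever taking log(0).
        if rng.random() < math.exp(min(0.0, log_alpha)):
            x = proposal
            accepted += 1
        samples.append(x)
    return samples, accepted / n_samples


# Toy "crisis" target: standard normal density restricted to X > 2.
# A plain MC sampler of N(0, 1) would land here only ~2.3% of the time;
# MH samples the conditional distribution directly.
def log_tail_density(x):
    return -0.5 * x * x if x > 2.0 else float("-inf")


samples, acc_rate = metropolis_hastings(log_tail_density, x0=2.5,
                                        n_samples=20000)
```

Every retained state lies in the crisis region, which is the efficiency argument the abstract makes for MCMC; the trade-off it reports (per-sample cost of the vine-copula density evaluation) is not visible in a toy density like this one.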
419

Value at risk and expected shortfall : traditional measures and extreme value theory enhancements with a South African market application

Dicks, Anelda 12 1900 (has links)
Thesis (MComm)--Stellenbosch University, 2013. / ENGLISH ABSTRACT: Accurate estimation of Value at Risk (VaR) and Expected Shortfall (ES) is critical in the management of extreme market risks. These risks occur with small probability, but their financial impact can be large. Traditional models for estimating VaR and ES are investigated. Following usual practice, 99% 10-day VaR and ES measures are calculated. A comprehensive theoretical background is first provided, and the models are then applied to the Africa Financials Index from 29/01/1996 to 30/04/2013. The models considered include independent, identically distributed (i.i.d.) models and Generalized Autoregressive Conditional Heteroscedasticity (GARCH) stochastic volatility models. Extreme Value Theory (EVT) models that focus specifically on extreme market returns are also investigated, following the Peaks Over Threshold (POT) approach to EVT. For the calculation of VaR, various methods of scaling from one day to ten days are considered and their performance evaluated. The GARCH models fail to converge during periods of extreme returns; during these periods, EVT forecasts may be used instead. As a novel approach, this study considers augmenting the GARCH models with EVT forecasts. The two-step procedure of pre-filtering with a GARCH model and then applying EVT, as suggested by McNeil (1999), is also investigated. This study identifies some of the practical issues in model fitting. It is shown that no single forecasting model is universally optimal and that the choice depends on the nature of the data. For this data series, the best approach was to augment the GARCH stochastic volatility models with EVT forecasts during the periods in which the former fail to converge. Model performance is judged by comparing the actual number of VaR and ES violations with the expected number.
The expected number is taken as the number of return observations over the entire sample period, multiplied by 0.01 for the 99% VaR and ES calculations.
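The POT approach mentioned in this abstract yields closed-form VaR and ES estimates once a Generalized Pareto Distribution GPD(ξ, β) has been fitted to the excesses over a threshold u: with n observations of which N_u exceed u, VaR_p = u + (β/ξ)[((n/N_u)(1 − p))^(−ξ) − 1] and, for ξ < 1, ES_p = VaR_p/(1 − ξ) + (β − ξu)/(1 − ξ). A minimal sketch with made-up parameter values (not the thesis's fitted estimates):

```python
def pot_var_es(u, beta, xi, n, n_exceed, p=0.99):
    """VaR and ES at confidence level p from a GPD(xi, beta) fitted to
    the excesses over threshold u (peaks-over-threshold method).
    Assumes xi > 0 (heavy tail) and xi < 1 so that ES is finite."""
    tail = (n / n_exceed) * (1.0 - p)          # tail probability scaled by exceedance rate
    var_p = u + (beta / xi) * (tail ** (-xi) - 1.0)
    es_p = var_p / (1.0 - xi) + (beta - xi * u) / (1.0 - xi)
    return var_p, es_p


# Illustrative numbers: threshold 2.0, GPD scale 0.5, shape 0.2,
# 1000 return observations of which 50 exceed the threshold.
var99, es99 = pot_var_es(u=2.0, beta=0.5, xi=0.2, n=1000, n_exceed=50)
```

As expected for a heavy-tailed fit, ES exceeds VaR at the same level; the violation-counting evaluation described above would then compare realised exceedances of `var99` against n × 0.01.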
420

Essays in long memory : evidence from African stock markets

Thupayagale, Pako January 2010 (has links)
This thesis explores various aspects of long memory behaviour in African stock markets (ASMs). First, we examine long memory in both equity returns and volatility, using the weak-form version of the efficient market hypothesis (EMH) as a criterion. The results show that these markets largely display a predictable component in returns, while evidence of long memory in volatility is mixed. In general, these findings contradict the precepts of the EMH, and a variety of remedial policies are suggested. Next, we re-examine the evidence of volatility persistence and long memory in light of potentially neglected breaks in the stock return volatility data. Our results indicate that a failure to account for time variation in the unconditional mean variance can lead to spurious conclusions. Furthermore, a modification of the GARCH model that allows for mean variation is introduced, which generates improved volatility forecasts for a selection of ASMs. To further evaluate the quality of volatility forecasts, we compare the performance of a number of long memory models against a variety of alternatives. The results generally suggest that over short horizons simple statistical models and short memory GARCH models provide superior forecasts of volatility, while at longer horizons we find some evidence in favour of long memory models. However, the various model rankings are shown to be sensitive to the choice of error statistic used to assess forecast accuracy. Finally, a wide range of volatility forecasting models are evaluated in order to ascertain which method delivers the most accurate value-at-risk (VaR) estimates in the context of the Basle risk framework. The results show that both asymmetric and long memory attributes are important considerations in delivering accurate VaR measures.
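A standard diagnostic for the long memory this abstract investigates is the Hurst exponent, estimated here via the classical rescaled-range (R/S) statistic: H ≈ 0.5 indicates no long memory, H > 0.5 persistence. This is a rough illustrative sketch, not the estimators used in the thesis (serious work would prefer bias-corrected variants such as Lo's modified R/S or the GPH log-periodogram estimator).

```python
import math
import random


def rescaled_range(window):
    """R/S statistic of one window: range of the cumulative
    mean-adjusted series divided by its standard deviation."""
    n = len(window)
    mean = sum(window) / n
    cum, s = [], 0.0
    for x in window:
        s += x - mean
        cum.append(s)
    r = max(cum) - min(cum)
    sd = math.sqrt(sum((x - mean) ** 2 for x in window) / n)
    return r / sd


def hurst_rs(series, window_sizes):
    """Estimate the Hurst exponent H as the least-squares slope of
    log(mean R/S) against log(window size)."""
    xs, ys = [], []
    for w in window_sizes:
        rs_vals = [rescaled_range(series[i:i + w])
                   for i in range(0, len(series) - w + 1, w)]
        xs.append(math.log(w))
        ys.append(math.log(sum(rs_vals) / len(rs_vals)))
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den


# White noise has no long memory, so H should land near 0.5
# (small-sample R/S bias typically pushes it slightly above).
rng = random.Random(42)
white_noise = [rng.gauss(0.0, 1.0) for _ in range(2048)]
h = hurst_rs(white_noise, [16, 32, 64, 128, 256])
```

Applied to squared or absolute returns instead of raw returns, the same diagnostic speaks to the long memory in volatility that the abstract finds mixed evidence for.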
