401 |
Market Risk Modelling Of Commodity Futures: Implementing commodity futures product type into Swedbanks risk system. Lindqvist, Julia, January 2024.
Risk management within a bank is important given its status as a pivotal component of the capital adequacy framework stipulated in the Basel Accords. Proficiently assessing, monitoring and managing the market risk that the bank undertakes is therefore part of the daily activities at Swedbank. For the majority of measures and models, the bank employs a full revaluation approach, meaning that each position is revalued under diverse market conditions specified across various scenarios in order to estimate risk. Prior to this thesis, Swedbank lacked a full revaluation approach for the commodity futures product in its portfolio. Commodity futures need to be treated differently from other futures because their underlying is a physical product that is produced, stored and transported. To help Swedbank calculate and measure a diversified set of risk measures for commodity futures with high accuracy and in line with market practice, and to implement the valuation model whose results lie closest to market practice into its risk system, various valuation models have been replicated and compared in Python. The focus has been on investigating different variations of a model derived from the theory of storage and no arbitrage (the Cost of Carry model), as well as a more advanced model built on the belief of mean-reverting short-term prices and an uncertain long-term equilibrium price (the Schwartz and Smith two-factor model). These models were replicated on three different commodity types in Swedbank's portfolio, Wheat, Rapeseed and Gasoil, to determine which valuation model estimated prices closest to the real prices on the market. The findings revealed that one variation of the Cost of Carry model could be matched exactly to the mark-to-market price because the real price is known. The Schwartz and Smith two-factor model was clearly the second-best model, estimating prices very well but not always exactly. The best-suited model, the one matching the price exactly, was chosen for implementation into Swedbank's risk system, with interest rate, exchange rate and the underlying spot price identified as risk factors. With VaR simulations shifting the chosen risk factors, it could be shown that the commodity futures are traded back-to-back, since all positions offset each other. Since Swedbank is an intermediary whose business consists of providing its customers access to the market, back-to-back trading was something that Swedbank had assumed but previously could not prove. Furthermore, the back-testing revealed that the convenience yield, a characteristic specific to commodities, could potentially be considered a risk factor in the future, which would become relevant if Swedbank's business model for commodity futures were to change. / Riskhanteringen inom en bank är en viktig del med tanke på dess roll som en avgörande komponent inom kapitaltäckningsramverket som föreskrivs i Basel-avtalen. Att noggrant bedöma, övervaka och hantera den marknadsrisk som banken åtar sig är därför en del av de dagliga aktiviteterna på Swedbank. För de flesta åtgärder och modeller använder banken en fullständig omvärderingsmetod, vilket innebär en omvärdering av varje position under olika marknadsförhållanden specificerade över olika scenarier för att uppskatta risken. Innan det här projektet har Swedbank saknat den fullständiga omvärderingsmetoden för produkten råvaruterminer i sin portfölj.
Råvaruterminer måste behandlas annorlunda än andra terminer på grund av att deras underliggande är en fysisk produkt som produceras, lagras och transporteras. För att hjälpa Swedbank att kunna beräkna och mäta en diversifierad uppsättning riskmått för råvaruterminer med hög noggrannhet och enligt marknadspraxis samt implementera värderingsmodellen med resultat som ligger närmast marknadspraxis i deras risksystem har olika värderingsmodeller replikerats och jämförts i Python. Fokuset har legat på att undersöka olika variationer av en modell som härstammar från teorin om lagring och inget arbitrage (Cost of Carry-modell) samt en mer avancerad modell som utvecklats från en tro om ett genomsnittligt återgående kortsiktigt pris och ett osäkert långsiktigt jämviktspris (Schwartz och Smith Two Factor-modell). Dessa modeller replikerades för tre olika typer av råvaror i Swedbanks portfölj: Vete, Raps och Gasol, för att avgöra vilken värderingsmodell som kunde uppskatta priser närmast de verkliga priserna på marknaden. Resultaten visade att en variation av Cost of Carry-modellen kunde matchas exakt med marknadsvärdet eftersom det verkliga priset var känt. Schwartz och Smith Two Factor-modellen var tydligt den näst bästa modellen, vilket uppskattade priserna mycket bra men inte alltid exakt. Den mest lämpade modellen som kunde matcha priset exakt valdes för att implementeras i Swedbanks risksystem och hade identifierade riskfaktorer som ränta, växelkurs och underliggande spotpris. Genom VaR-simuleringar som skiftade de valda riskfaktorerna kunde det bevisas att råvaruterminerna handlas back-to-back eftersom alla positioner neutraliserade varandra. Eftersom Swedbank är en mellanhand och affärsmodellen handlar om att ge Swedbanks kunder tillgång till marknaden, var back-to-back-handel något som Swedbank antog men tidigare inte kunde bevisa. Vidare visade backtestingen att den särskilda karaktären convenience yield eventuellt skulle kunna betraktas som en riskfaktor i framtiden och att detta skulle vara aktuellt om affärsmodellen för råvaruterminer på Swedbank skulle förändras.
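As a rough illustration of the cost-of-carry relationship discussed in this abstract, the sketch below prices a commodity future from the spot price, interest rate, storage cost and convenience yield under continuous compounding. The function name and parameter values are hypothetical and this is not Swedbank's or the thesis's implementation.

```python
import math

def cost_of_carry_future(spot, r, storage_cost, convenience_yield, t):
    """Theoretical futures price under continuous compounding.

    F = S * exp((r + u - y) * T), where u is the storage cost rate and
    y is the convenience yield (both expressed as continuous rates).
    """
    return spot * math.exp((r + storage_cost - convenience_yield) * t)

# Hypothetical wheat example: 200 EUR/tonne spot, 3% rate, 2% storage cost,
# 1% convenience yield, 6-month maturity.
print(cost_of_carry_future(200.0, 0.03, 0.02, 0.01, 0.5))
```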
|
402 |
Novel Approaches for Some Stochastic and Deterministic Scheduling Problems. Liao, Lingrui, 01 July 2011.
In this dissertation, we develop novel approaches to independently address two issues that are commonly encountered in machine scheduling problems: uncertainty of problem parameters (in particular, due to job processing times), and batching of jobs for processing on capacitated machines.
Our approach to address the uncertainty issue regards the indeterminate parameters as random variables and explicitly considers the resulting variability of a performance measure. To incorporate variability into the schedule selection process, we develop a method to evaluate both the expectation and the variance of various performance measures for a given schedule. Our method is based on the use of mixture models to approximate a variety of distribution types. The Expectation-Maximization algorithm of Dempster et al. (1977) is applied to derive mixture models of processing time distributions. Our method then utilizes these mixture models to calculate the distributions of other random variables in order to derive the expectation and variance of various scheduling performance measures, assuming that the job sequencing decisions are known a priori. To make our method more computationally efficient, we adapt a mixture reduction method to control the number of mixture components used in the intermediate steps. We apply our method to two different scheduling problems, the job shop makespan scheduling problem and the single machine total weighted tardiness scheduling problem, and compare its performance with that of the Monte Carlo method. The results show the efficacy of our mixture approximation method: it generates fairly accurate results while requiring significantly less CPU time. The proposed method offers a good compromise between the Monte Carlo method, which requires extensive effort, and the use of a simple normal approximation, which produces lower-quality results.
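A minimal sketch of the mixture-fitting step described above, using scikit-learn's Gaussian mixture model (which is fitted by the EM algorithm) and the standard mixture moment formulas. The simulated lognormal processing times and the choice of three components are illustrative assumptions, not the data or settings of the dissertation.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical job processing times (e.g., lognormally distributed).
processing_times = rng.lognormal(mean=2.0, sigma=0.4, size=500).reshape(-1, 1)

# Fit a finite Gaussian mixture via the EM algorithm.
gmm = GaussianMixture(n_components=3, random_state=0).fit(processing_times)

# Mixture moments: E[X] = sum_k w_k * mu_k,
# Var[X] = sum_k w_k * (sigma_k^2 + mu_k^2) - E[X]^2.
w = gmm.weights_
mu = gmm.means_.ravel()
var = gmm.covariances_.ravel()
mean_x = np.sum(w * mu)
var_x = np.sum(w * (var + mu**2)) - mean_x**2
print(mean_x, var_x)
```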
Next, we introduce and demonstrate, for the first time in the literature, the use of conditional value-at-risk (CVaR) as a criterion for stochastic scheduling problems in order to obtain risk-averse solutions. This criterion tends to minimize both the expectation and the variance of a performance measure simultaneously, which is an attractive feature in the scheduling area, as most of the literature considers the expectation and variance of a performance measure separately. The CVaR also has the added advantage of maintaining a linear objective function. We develop a scenario-based mixed integer programming formulation to minimize CVaR for the general scheduling problem involving various performance measures, and employ a decomposition-based approach for its solution. Furthermore, a set of valid inequalities is incorporated to strengthen the relaxed master problem of this decomposition scheme. The proposed approach is demonstrated on the single machine total weighted tardiness scheduling problem. Our computational investigation reveals the efficacy of the proposed decomposition approach and the effectiveness of using CVaR as an optimization criterion for scheduling problems. Besides providing an exact approach to solve our stochastic scheduling problem, we also develop an efficient heuristic method to enable the use of CVaR for large-sized problems. To that end, we modify the Dynasearch method of Grosso et al. (2004) to minimize CVaR for a stochastic scheduling problem. Furthermore, we extend the application of CVaR to a parallel-machine total weighted tardiness problem. The use of CVaR appears to be quite promising for simultaneously controlling both the expected value and the variability of a performance measure in a stochastic scheduling environment.
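The scenario-based CVaR criterion can be illustrated with the standard Rockafellar-Uryasev representation, CVaR_alpha(L) = min_t { t + E[(L - t)^+] / (1 - alpha) }, which is what keeps the objective linear in a scenario formulation. The sketch below simply evaluates it on hypothetical, equally likely tardiness scenarios; it is not the decomposition code of the dissertation.

```python
import numpy as np

def cvar(losses, alpha=0.95):
    """Empirical CVaR of equally likely scenario losses.

    Uses the Rockafellar-Uryasev form: the threshold t is the empirical
    alpha-quantile (VaR), and CVaR = t + mean excess over t / (1 - alpha).
    """
    losses = np.asarray(losses, dtype=float)
    var = np.quantile(losses, alpha)
    return var + np.mean(np.maximum(losses - var, 0.0)) / (1.0 - alpha)

# Hypothetical total weighted tardiness of one schedule under 1000 scenarios.
rng = np.random.default_rng(1)
scenario_tardiness = rng.gamma(shape=2.0, scale=50.0, size=1000)
print(cvar(scenario_tardiness, alpha=0.95))
```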
Scenario-based formulations have frequently been used for stochastic scheduling problems. However, determining a lower bound can be a time-consuming task in this approach. Next, we develop a new method for scenario generation that is computationally competitive and that guarantees attainment of an exact lower bound. Our approach is based on discretization of the random distributions of job processing times. We use the idea of recursive stratified sampling to partition the probability space, so that the conditional expectations in each region yield scenario-wise parameter values. These scenarios are then used to formulate a two-stage stochastic program, which yields a lower bound for the original stochastic problem. We provide the theoretical basis of our bounding approach for both the expectation and CVaR objectives. Our discrete bounding method generates exact lower bounds, in contrast to the probabilistic bounds generated by Sample Average Approximation. We also present the results of our numerical experimentation comparing the performance of these two approaches in terms of the bound value obtained and the CPU time required.
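A simplified sketch of the discretization idea described above, for a single normally distributed processing time: the probability space is split into equal-probability strata and each scenario value is the conditional mean within its stratum. The normal distribution, its parameters and the use of ten strata are illustrative assumptions, not the dissertation's construction.

```python
import numpy as np
from scipy.stats import norm

def stratified_scenarios(mu, sigma, n_strata=10):
    """Scenario values as conditional means of N(mu, sigma^2) on equal-probability strata.

    For a stratum (a, b] with z_a = (a - mu)/sigma and z_b = (b - mu)/sigma,
    E[X | a < X <= b] = mu + sigma * (pdf(z_a) - pdf(z_b)) / (cdf(z_b) - cdf(z_a)),
    and here cdf(z_b) - cdf(z_a) = 1/n_strata by construction.
    """
    probs = np.linspace(0.0, 1.0, n_strata + 1)
    z = norm.ppf(probs)  # stratum boundaries in standard-normal space
    cond_means = mu + sigma * (norm.pdf(z[:-1]) - norm.pdf(z[1:])) / (1.0 / n_strata)
    return cond_means, np.full(n_strata, 1.0 / n_strata)

values, weights = stratified_scenarios(mu=10.0, sigma=2.0, n_strata=10)
print(values)            # scenario-wise processing times
print(values @ weights)  # recovers mu, so the expectation is preserved
```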
The problem of integrated batching and scheduling of jobs on capacitated parallel machines that we consider arises in the primary manufacturing sector of a pharmaceutical supply chain. We first develop a comprehensive mathematical programming model that can accommodate various realistic features of this problem, including batch production, sequence-dependent setup time/cost, and inter-period carryover of setup status. We further derive several valid inequalities that are based on the embedded subproblem structure. We also consider an alternative formulation (termed the Plant Location model) based on the lot-sizing perspective of the problem. Noting the resemblance of the campaign sequencing subproblem to the high multiplicity asymmetric traveling salesman problem (HMATSP), we adapt various ideas from the HMATSP to enforce the connectivity of the sequencing graph. Due to the complexity of this problem, we also explore the possibility of applying a column generation technique for its solution. Various schemes of problem decomposition are considered, along with the use of a dual stabilization technique to improve the convergence of the column generation procedure. We also develop heuristic methods to generate initial feasible solutions that further enhance the performance of the column generation method. A computational experimentation has been conducted on a data set that mimics real-life problem instances; it illustrates the effectiveness of the proposed column generation method. / Ph. D.
|
403 |
[en] ANALYSIS OF EXTREME VALUES THEORY AND MONTE CARLO SIMULATION FOR THE CALCULATION OF VALUE-AT-RISK IN STOCK PORTFOLIOS / [pt] ANÁLISE DA TEORIA DOS VALORES EXTREMOS E DA SIMULAÇÃO DE MONTE CARLO PARA O CÁLCULO DO VALUE-AT-RISK EM CARTEIRAS DE INVESTIMENTOS DE ATIVOS DE RENDA VARIÁVEL. GUSTAVO JARDIM DE MORAIS, 16 July 2018.
[pt] Após as recentes crises financeiras que se abateram sobre os mercados financeiros de todo o mundo, com mais propriedade a de 2008/2009, mas ainda a crise no Leste Europeu em Julho/2007, a moratória Russa em Outubro/1998, e, no âmbito nacional, a mudança no regime cambial brasileiro, em Janeiro/1999, as instituições financeiras incorreram em grandes perdas em cada um desses eventos e uma das principais questões levantadas acerca dos modelos financeiros diziam respeito ao gerenciamento de risco. Os diversos métodos de cálculo do Value-at-Risk, bem como as simulações e cenários traçados por analistas não puderam prever sua magnitude nem tampouco evitar que a crise se agravasse. Em função disso, proponho-me à questão de estudar os sistemas de gerenciamento de risco financeiro, na medida em que este pode e deve ser aprimorado, sob pena de catástrofes financeiras ainda maiores. Embora seu conteúdo se mostre tão vasto na literatura, as metodologias para cálculo de valor em risco não são exatas e livres de falhas. Nesse contexto, coloca-se necessário o desenvolvimento e aprimoramento de ferramentas de gestão de risco que sejam capazes de auxiliar na melhor alocação dos recursos disponíveis, avaliando o nível de risco à que um investimento está exposto e sua compatibilidade com seu retorno esperado. / [en] After the recent financial crises that hit financial markets around the world, most notably that of 2008/2009, but also the Eastern Europe crisis in July 2007, the Russian moratorium in October 1998 and, at the national level, the change in the Brazilian exchange rate regime in January 1999, financial institutions incurred large losses in each of these events, and one of the main questions raised about financial models concerned risk management. The various methods for calculating Value-at-Risk, as well as the simulations and scenarios drawn up by analysts, could neither predict the magnitude of these crises nor prevent them from worsening. As a result, I set out to study financial risk management systems, since they can and must be improved, lest even bigger financial disasters occur. Although the literature on the subject is vast, the methodologies for calculating value at risk are neither exact nor free of flaws. In this context, it is necessary to develop and improve risk management tools capable of assisting in a better allocation of the available resources, assessing the level of risk to which an investment is exposed and its compatibility with the expected return.
|
404 |
[pt] VALOR EM RISCO: UMA COMPARAÇÃO ENTRE MÉTODOS DE ESCOLHA DA FRAÇÃO AMOSTRAL NA ESTIMAÇÃO DO ÍNDICE DE CAUDA DE DISTRIBUIÇÕES GEV / [en] VALUE AT RISK: A COMPARISON OF METHODS TO CHOOSE THE SAMPLE FRACTION IN TAIL INDEX ESTIMATION OF GENERALIZED EXTREME VALUE DISTRIBUTION. CHRISTIAM MIGUEL GONZALES CHAVEZ, 28 August 2002.
[pt] Valor em Risco - VaR - já é parte das ferramentas habituais que um analista financeiro utiliza para estimar o risco de mercado. Na implementação do VaR é necessário que sejam estimados quantis de baixa probabilidade para a distribuição condicional dos retornos dos portfólios. A metodologia tradicional para o cálculo do VaR requer a estimação de um modelo tipo GARCH com distribuição normal. Entretanto, a hipótese de normalidade condicional nem sempre é adequada, principalmente quando se deseja estimar o VaR em períodos atípicos, caracterizados pela ocorrência de eventos extremos. Nestas situações a distribuição condicional deve apresentar excesso de curtose. O uso de distribuições derivadas do Teorema dos Valores Extremos - TVE -, conhecidas coletivamente como GEV, associadas aos modelos tipo GARCH, tornou possível o cálculo do VaR nestas situações. Um parâmetro chave nas distribuições da família GEV é o índice de cauda, o qual pode ser estimado através do estimador de Hill. Entretanto este estimador apresenta muita sensibilidade em termos de variância e viés com respeito à fração amostral utilizada na sua estimação. O objetivo principal desta dissertação foi fazer uma comparação entre três métodos de escolha da fração amostral, recentemente sugeridos na literatura: o método bootstrap duplo (Danielsson, de Haan, Peng e de Vries, 1999), o método threshold (Guillou e Hall, 2001) e o Hill plot alternativo (Drees, de Haan e Resnick, 2000). A avaliação dos métodos foi feita através do teste de cobertura condicional de Christoffersen (1998), o qual foi aplicado às séries de retornos dos índices NASDAQ, NIKKEY, MERVAL e IBOVESPA. Os nossos resultados indicam que os três métodos apresentam aproximadamente o mesmo desempenho, com uma ligeira vantagem dos métodos bootstrap duplo e threshold sobre o Hill plot alternativo, porque este último tem um componente normativo na determinação do índice de cauda ótimo. / [en] Value at Risk (VaR) is already part of the toolkit of financial analysts assessing market risk. In order to implement VaR one needs to estimate low quantiles of the portfolio returns distribution. Traditional methodologies combine a normal conditional distribution with ARCH-type models to accomplish this goal. Albeit successful in evaluating risk for typical periods, this methodology has not been able to accommodate events that occur with very low probabilities. For these situations one needs conditional distributions with excess kurtosis. The use of distributions derived from Extreme Value Theory (EVT), collectively known as the Generalized Extreme Value (GEV) distribution, together with ARCH-type models has made it possible to address this problem in a proper framework. A key parameter in the GEV distribution is the tail index, which can be estimated by Hill's estimator. Hill's estimator is very sensitive, in terms of bias and RMSE, to the sample fraction that is used in its estimation. The objective of this dissertation is to compare three recently suggested methods from the statistical literature: the double bootstrap method (Danielsson, de Haan, Peng and de Vries, 1999), the threshold method (Guillou and Hall, 2001) and the alternative Hill plot (Drees, de Haan and Resnick, 2000). The methods have been evaluated with respect to the conditional coverage test of Christoffersen (1998), which has been applied to the following return series: NASDAQ, NIKKEY, MERVAL and IBOVESPA. Our empirical findings suggest that, overall, the three methods have the same performance, with some advantage of the bootstrap and threshold methods over the alternative Hill plot, which has a normative component in the determination of the optimal tail index.
|
405 |
Allocation dynamique de portefeuille avec profil de gain asymétrique : risk management, incitations financières et benchmarking / Dynamic asset allocation with asymmetric payoffs : risk management, financial incentives, and benchmarking. Tergny, Guillaume, 31 May 2011.
Les gérants de portefeuille pour compte de tiers sont souvent jugés par leur performance relative à celle d'un portefeuille benchmark. A ce titre, ils sont amenés très fréquemment à utiliser des modèles internes de "risk management" pour contrôler le risque de sous-performer le benchmark. Par ailleurs, ils sont de plus en plus nombreux à adopter une politique de rémunération incitative, en percevant une commission de sur-performance par rapport au benchmark. En effet, cette composante variable de leur rémunération leur permet d'augmenter leur revenu en cas de sur-performance sans contrepartie en cas de sous-performance. Or de telles pratiques ont fait récemment l'objet de nombreuses polémiques : la période récente de crise financière mondiale a fait apparaître certaines carences de plusieurs acteurs financiers en terme de contrôle de risque ainsi que des niveaux de prise de risque et de rémunération jugés excessifs. Cependant, l'étude des implications de ces pratiques reste un thème encore relativement peu exploré dans le cadre de la théorie classique des choix dynamiques de portefeuille en temps continu. Cette thèse analyse, dans ce cadre théorique, les implications de ces pratiques de "benchmarking" sur le comportement d'investissement de l'asset manager. La première partie étudie les propriétés de la stratégie dynamique optimale pour l'asset manager concerné par l'écart entre la rentabilité de son portefeuille et celle d'un benchmark fixe ou stochastique (sur ou sous-performance). Nous considérons plusieurs types d'asset managers, caractérisés par différentes fonctions d'utilité et qui sont soumis à différentes contraintes de risque de sous-performance. Nous montrons en particulier quel est le lien entre les problèmes d'investissement avec prise en compte de l'aversion à la sous-performance et avec contrainte explicite de "risk management". Dans la seconde partie, on s'intéresse à l'asset manager bénéficiant d'une rémunération incitative (frais de gestion variables, bonus de sur-performance ou commission sur encours additionnelle). On étudie, selon la forme de ses incitations financières et son degré d'aversion à la sous-performance, comment sa stratégie d'investissement s'écarte de celle de l'investisseur (ou celle de l'asset manager sans rémunération incitative). Nous montrons que le changement de comportement de l'asset manager peut se traduire soit par une réduction du risque pris par rapport à la stratégie sans incitation financière soit au contraire par une augmentation de celui-ci. Finalement, nous montrons en quoi la présence de contraintes de risque de sous-performance, imposées au gérant ou traduisant son aversion à la sous-performance, peut être bénéfique à l'investisseur donnant mandat de gestion financière. / It is common practice to judge third-party asset managers by looking at their financial performance relative to a benchmark portfolio. For this reason, they often choose to rely on internal risk-management models to control the downside risk of their portfolio relative to the benchmark. Moreover, an increasing number are adopting an incentive-based scheme, by charging an over-performance commission relative to the benchmark. Indeed, including this variable component in their global remuneration allows them to increase their revenue in case of over-performance without any penalty in the event of underperforming the benchmark. 
However, such practices have recently been at the heart of several polemics: the recent global financial crisis uncovered shortcomings in the internal risk controls of several financial players, as well as risk-taking and compensation levels judged to be excessive. Nevertheless, analyzing the impact of these practices remains a relatively unexplored issue in continuous-time dynamic asset allocation theory. Within this theoretical framework, this thesis analyses the implications of these "benchmarking" practices for the asset manager's investment behavior. The first part examines the properties of the optimal dynamic strategy for an asset manager concerned with the difference in return between their portfolio and a fixed or stochastic benchmark (over- or under-performance). Several asset manager types are considered, defined by different utility functions and different downside-risk constraints. In particular, the link between investment problems with aversion to under-performance and those with explicit risk management constraints is shown. In the second part, the case of an asset manager who benefits from an incentive compensation scheme (variable asset management fees, over-performance bonuses or an additional commission on assets under management) is investigated. We study how, depending on the form of the financial incentive structure and the degree of loss aversion, the asset manager's strategy differs from that of the investor (or from that of an asset manager receiving no incentive remuneration). This study shows that the change in the asset manager's investment behavior can lead either to a reduction in the risk taken relative to the strategy without financial incentives or, conversely, to an increase thereof. Finally, we show that the existence of downside-risk constraints, whether imposed on the asset manager or reflecting their aversion to under-performance, can be beneficial to the investor who grants the investment mandate.
|
406 |
Value at risk et expected shortfall pour des données faiblement dépendantes : estimations non-paramétriques et théorèmes de convergences / Value at risk and expected shortfall for weak dependent random variables : nonparametric estimations and limit theorems. Kabui, Ali, 19 September 2012.
Quantifier et mesurer le risque dans un environnement partiellement ou totalement incertain est probablement l'un des enjeux majeurs de la recherche appliquée en mathématiques financières. Cela concerne l'économie, la finance, mais d'autres domaines comme la santé via les assurances par exemple. L'une des difficultés fondamentales de ce processus de gestion des risques est de modéliser les actifs sous-jacents, puis d'approcher le risque à partir des observations ou des simulations. Comme dans ce domaine, l'aléa ou l'incertitude joue un rôle fondamental dans l'évolution des actifs, le recours aux processus stochastiques et aux méthodes statistiques devient crucial. Dans la pratique l'approche paramétrique est largement utilisée. Elle consiste à choisir le modèle dans une famille paramétrique, de quantifier le risque en fonction des paramètres, et d'estimer le risque en remplaçant les paramètres par leurs estimations. Cette approche présente un risque majeur, celui de mal spécifier le modèle, et donc de sous-estimer ou sur-estimer le risque. Partant de ce constat et dans une perspective de minimiser le risque de modèle, nous avons choisi d'aborder la question de la quantification du risque avec une approche non-paramétrique qui s'applique à des modèles aussi généraux que possible. Nous nous sommes concentrés sur deux mesures de risque largement utilisées dans la pratique et qui sont parfois imposées par les réglementations nationales ou internationales. Il s'agit de la Value at Risk (VaR) qui quantifie le niveau de perte maximum avec un niveau de confiance élevé (95% ou 99%). La seconde mesure est l'Expected Shortfall (ES) qui nous renseigne sur la perte moyenne au delà de la VaR. / To quantify and measure risk in a partially or completely uncertain environment is probably one of the major issues of applied research in financial mathematics. It concerns economics and finance, but also many other fields, such as health through insurance, for example. One of the fundamental difficulties of this risk management process is to model the underlying assets and then to approximate the risk from observations or simulations. Since randomness and uncertainty play a fundamental role in the evolution of these assets, recourse to stochastic processes and statistical methods becomes crucial. In practice the parametric approach is widely used. It consists in choosing a model within a parametric family, quantifying the risk as a function of the parameters, and estimating the risk by replacing the parameters with their estimates. This approach carries a major risk: that of misspecifying the model, and thus of underestimating or overestimating the risk. Based on this observation, and with a view to minimizing model risk, we choose to tackle the question of risk quantification with a nonparametric approach that applies to models that are as general as possible. We concentrate on two risk measures that are widely used in practice and that are sometimes imposed by national or international regulations. The first is the Value at Risk (VaR), which quantifies the maximum level of loss at a high confidence level (95% or 99%). The second measure is the Expected Shortfall (ES), which gives the average loss beyond the VaR.
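The two risk measures discussed above can be illustrated with their simplest nonparametric (plug-in) estimators. The sketch below is a plain empirical estimate on a simulated i.i.d. loss series; it is not the weak-dependence estimators or the convergence results developed in the thesis, and the Student-t losses are an illustrative assumption.

```python
import numpy as np

def empirical_var_es(losses, alpha=0.99):
    """Plug-in estimates of VaR and Expected Shortfall at level alpha.

    VaR_alpha is the empirical alpha-quantile of the loss distribution;
    ES_alpha is the average of the losses at or above VaR_alpha.
    """
    losses = np.asarray(losses, dtype=float)
    var = np.quantile(losses, alpha)
    es = losses[losses >= var].mean()
    return var, es

# Hypothetical daily losses (in percent), Student-t to mimic heavy tails.
rng = np.random.default_rng(3)
losses = rng.standard_t(df=4, size=10_000)
print(empirical_var_es(losses, alpha=0.99))
```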
|
407 |
Risk preferences and their robust representation. Drapeau, Samuel, 16 June 2010.
Ziel dieser Dissertation ist es, den Begriff des Risikos unter den Aspekten seiner Quantifizierung durch robuste Darstellungen zu untersuchen. In einem ersten Teil wird Risiko anhand Kontext-invarianter Merkmale betrachtet: Diversifizierung und Monotonie. Wir führen die drei Schlüsselkonzepte Risikoordnung, Risikomaß und Risikoakzeptanzfamilien ein und studieren deren Eins-zu-eins-Beziehung. Unser Hauptresultat stellt eine eindeutige duale robuste Darstellung jedes unterhalbstetigen Risikomaßes auf topologischen Vektorräumen her. Wir zeigen auch automatische Stetigkeitsergebnisse und robuste Darstellungen für Risikomaße auf diversen Arten von konvexen Mengen. Diese Herangehensweise lässt bei der Wahl der konvexen Menge viel Spielraum und erlaubt damit eine Vielfalt von Interpretationen von Risiko: Modellrisiko im Falle von Zufallsvariablen, Verteilungsrisiko im Falle von Lotterien, Abdiskontierungsrisiko im Falle von Konsumströmen... Diverse Beispiele sind dann in diesen verschiedenen Situationen explizit berechnet (Sicherheitsäquivalent, ökonomischer Risikoindex, VaR für Lotterien, "variational preferences"...). Im zweiten Teil betrachten wir Präferenzordnungen, die möglicherweise zusätzliche Informationen benötigen, um ausgedrückt zu werden. Hierzu führen wir einen axiomatischen Rahmen in Form von bedingten Präferenzordnungen ein, die lokal mit der Information kompatibel sind. Dies erlaubt die Konstruktion einer bedingten numerischen Darstellung. Wir erhalten eine bedingte Variante der von-Neumann-und-Morgenstern-Darstellung für messbare stochastische Kerne und erweitern dieses Ergebnis zu einer bedingten Version der "variational preferences". Abschließend klären wir das Zusammenspiel zwischen Modellrisiko und Verteilungsrisiko auf der axiomatischen Ebene. / The goal of this thesis is the conceptual study of risk and its quantification via robust representations. In a first part we concentrate on context-invariant features related to this notion: diversification and monotonicity. We introduce and study the general properties of three key concepts, risk order, risk measure and risk acceptance family, and their one-to-one relations. Our main result is a uniquely characterized dual robust representation of lower semicontinuous risk orders on a topological vector space. We also provide automatic continuity and robust representation results on specific convex sets. This approach allows multiple interpretations of risk depending on the setting: model risk in the case of random variables, distributional risk in the case of lotteries, discounting risk in the case of consumption streams... Various explicit computations in these different settings are then treated (economic index of riskiness, certainty equivalent, VaR on lotteries, variational preferences...). In the second part, we consider preferences which might require additional information in order to be expressed. We provide a mathematical framework for this idea in terms of preorders, called conditional preference orders, which are locally compatible with the available information. This allows us to construct conditional numerical representations of conditional preferences. We obtain a conditional version of the von Neumann and Morgenstern representation for measurable stochastic kernels and then extend it to a conditional version of the variational preferences. We finally clarify the interplay between model risk and distributional risk on the axiomatic level.
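For readers unfamiliar with the terminology, the classical dual robust representation of a convex, cash-additive, lower semicontinuous risk measure on bounded random variables takes the form below. This is the standard textbook statement, quoted here only as context; the thesis itself establishes more general representation results for quasiconvex risk orders on other convex sets.

```latex
% Standard convex-duality representation (context only, not the thesis's
% more general quasiconvex result).
\[
  \rho(X) \;=\; \sup_{Q \in \mathcal{M}_1}
    \Bigl( E_Q[-X] \;-\; \alpha_{\min}(Q) \Bigr),
  \qquad
  \alpha_{\min}(Q) \;=\; \sup_{X \,:\, \rho(X) \le 0} E_Q[-X],
\]
where $\mathcal{M}_1$ denotes the set of (finitely additive) probability
measures and $\alpha_{\min}$ is the minimal penalty function.
```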
|
408 |
Simulações Financeiras em GPU / Finance and Stochastic Simulation on GPU. Souza, Thársis Tuani Pinto, 26 April 2013.
É muito comum modelar problemas em finanças com processos estocásticos, dada a incerteza de suas variáveis de análise. Além disso, problemas reais nesse domínio são, em geral, de grande custo computacional, o que sugere a utilização de plataformas de alto desempenho (HPC) em sua implementação. As novas gerações de arquitetura de hardware gráfico (GPU) possibilitam a programação de propósito geral enquanto mantêm alta banda de memória e grande poder computacional. Assim, esse tipo de arquitetura vem se mostrando como uma excelente alternativa em HPC. Com isso, a proposta principal desse trabalho é estudar o ferramental matemático e computacional necessário para modelagem estocástica em finanças com a utilização de GPUs como plataforma de aceleração. Para isso, apresentamos a GPU como uma plataforma de computação de propósito geral. Em seguida, analisamos uma variedade de geradores de números aleatórios, tanto em arquitetura sequencial quanto paralela. Além disso, apresentamos os conceitos fundamentais de Cálculo Estocástico e de método de Monte Carlo para simulação estocástica em finanças. Ao final, apresentamos dois estudos de casos de problemas em finanças: "Stops Ótimos" e "Cálculo de Risco de Mercado". No primeiro caso, resolvemos o problema de otimização de obtenção do ganho ótimo em uma estratégia de negociação de ações de "Stop Gain". A solução proposta é escalável e de paralelização inerente em GPU. Para o segundo caso, propomos um algoritmo paralelo para cálculo de risco de mercado, bem como técnicas para melhorar a solução obtida. Nos nossos experimentos, houve uma melhora de 4 vezes na qualidade da simulação estocástica e uma aceleração de mais de 50 vezes. / Given the uncertainty of their variables, it is common to model financial problems with stochastic processes. Furthermore, real problems in this area have a high computational cost, which suggests the use of High Performance Computing (HPC) to handle them. New generations of graphics hardware (GPU) enable general-purpose computing while maintaining high memory bandwidth and large computing power. Therefore, this type of architecture is an excellent alternative for HPC and computational finance. The main purpose of this work is to study the computational and mathematical tools needed for stochastic modeling in finance using GPUs. We present GPUs as a platform for general-purpose computing. We then analyze a variety of random number generators, in both sequential and parallel architectures, and introduce the fundamental mathematical tools for stochastic calculus and Monte Carlo simulation. With this background, we present two case studies in finance: "Optimal Trading Stops" and "Market Risk Management". In the first case, we solve the problem of obtaining the optimal gain in a "Stop Gain" stock trading strategy. The proposed solution is scalable and inherently parallel on the GPU. For the second case, we propose a parallel algorithm to compute market risk, as well as techniques for improving the quality of the solutions. In our experiments, there was a 4-fold improvement in the quality of the stochastic simulation and an acceleration of over 50 times.
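As a rough sketch of the Monte Carlo market-risk computation discussed above, the NumPy code below simulates correlated normal log-returns of a few risk factors and reads VaR off the simulated P&L distribution; the same array code could, for instance, be run on a GPU with a drop-in library such as CuPy. The portfolio weights, drifts and covariance values are hypothetical, and this is not the dissertation's implementation.

```python
import numpy as np

def monte_carlo_var(weights, mu, cov, horizon=1.0, n_paths=100_000, alpha=0.99, seed=0):
    """One-step Monte Carlo VaR for a portfolio whose asset log-returns are
    multivariate normal over the given horizon (returns the loss quantile)."""
    rng = np.random.default_rng(seed)
    log_ret = rng.multivariate_normal(mu * horizon, cov * horizon, size=n_paths)
    pnl = (np.exp(log_ret) - 1.0) @ weights   # portfolio P&L per path
    return -np.quantile(pnl, 1.0 - alpha)     # loss at the alpha level

weights = np.array([0.5, 0.3, 0.2])           # hypothetical portfolio weights
mu = np.array([0.0002, 0.0001, 0.0003])       # hypothetical daily drifts
cov = np.array([[4e-4, 1e-4, 5e-5],
                [1e-4, 2.5e-4, 4e-5],
                [5e-5, 4e-5, 9e-4]])          # hypothetical daily covariance
print(monte_carlo_var(weights, mu, cov))
```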
|
409 |
保險公司因應死亡率風險之避險策略 / Hedging strategy against mortality risk for insurance company. 莊晉國 (Chuang, Chin Kuo), Unknown Date.
本篇論文主要討論在死亡率改善不確定性之下的避險策略。當保險公司負債面的人壽保單是比年金商品來得多的時候，公司會處於死亡率的風險之下。我們假設死亡率和利率都是隨機的情況，部分的死亡率風險可以經由自然避險而消除，而剩下的死亡率風險和利率風險則由零息債券和保單貼現商品來達到最適避險效果。我們考慮mean variance、VaR和CTE當成目標函數時的避險策略，其中在mean variance的最適避險策略可以導出公式解。由數值結果我們可以得知保單貼現的確是死亡率風險的有效避險工具。 / This paper proposes hedging strategies to deal with the uncertainty of mortality improvement. When an insurance company holds more life insurance contracts than annuities on the liability side, it is exposed to mortality risk. We assume that both mortality and interest rates are stochastic. Part of the mortality risk is eliminated by natural hedging, and the remaining mortality risk and interest rate risk are optimally hedged with zero-coupon bonds and life settlement contracts. We consider hedging strategies with mean-variance, value-at-risk and conditional tail expectation objective functions. A closed-form optimal hedging formula is derived under the mean-variance objective, and the numerical results show that the life settlement is indeed an effective hedging instrument against mortality risk.
|
410 |
Risks in Commodity and Currency Markets. Bozovic, Milos, 17 April 2009.
This thesis analyzes market risk factors in commodity and currency markets. It focuses on the impact of extreme events on the prices of financial products traded in these markets, and on the overall market risk faced by investors. The first chapter develops a simple two-factor jump-diffusion model for the valuation of contingent claims on commodities in order to investigate the pricing implications of shocks that are exogenous to this market. The second chapter analyzes the nature and pricing implications of abrupt changes in exchange rates, as well as the ability of these changes to explain the shapes of option-implied volatility "smiles". Finally, the third chapter employs the notion that key results of univariate extreme value theory can be applied separately to the principal components of the ARMA-GARCH residuals of a multivariate return series. The proposed approach yields more precise Value at Risk forecasts than conventional multivariate methods, while maintaining the same efficiency. / El objetivo de esta tesis es analizar los factores del riesgo del mercado de las materias primas y las divisas. Está centrada en el impacto de los eventos extremos tanto en los precios de los productos financieros como en el riesgo total de mercado al cual se enfrentan los inversores. En el primer capítulo se introduce un modelo simple de difusión y saltos (jump-diffusion) con dos factores para la valuación de activos contingentes sobre las materias primas, con el objetivo de investigar las implicaciones de shocks en los precios que son exógenos a este mercado. En el segundo capítulo se analiza la naturaleza e implicaciones para la valuación de los saltos en los tipos de cambio, así como la capacidad de éstos para explicar las formas de sonrisa en la volatilidad implicada. Por último, en el tercer capítulo se utiliza la idea de que los resultados principales de la Teoría de Valores Extremos univariada se pueden aplicar por separado a los componentes principales de los residuos de un modelo ARMA-GARCH de series multivariadas de retorno. El enfoque propuesto produce pronósticos de Value at Risk más precisos que los métodos multivariados convencionales, manteniendo la misma eficiencia.
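A stripped-down sketch of the structure of the third chapter's approach: univariate EVT applied component-wise to the principal components of (already filtered) ARMA-GARCH residuals. Here the residuals are simply simulated Student-t noise, the ARMA-GARCH filtering step is omitted, and only per-component tail quantiles are computed rather than the full portfolio VaR forecast, so this illustrates the pipeline's shape, not the thesis implementation.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(4)
# Stand-in for standardized ARMA-GARCH residuals of 3 return series (T x 3).
resid = rng.standard_t(df=5, size=(2000, 3))

# Principal components of the residuals (eigval holds the component variances).
cov = np.cov(resid, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)
pcs = resid @ eigvec

# Fit a generalized Pareto distribution to the lower tail of each component
# (peaks-over-threshold), then read off a high loss quantile per component.
q = 0.99
for j in range(pcs.shape[1]):
    losses = -pcs[:, j]                      # work with losses (left tail)
    u = np.quantile(losses, 0.95)            # tail threshold
    exceedances = losses[losses > u] - u
    xi, loc, beta = genpareto.fit(exceedances, floc=0.0)
    p_u = np.mean(losses > u)                # empirical exceedance probability
    # POT quantile: VaR_q = u + (beta/xi) * ((p_u / (1 - q))**xi - 1), xi != 0.
    var_q = u + (beta / xi) * ((p_u / (1.0 - q)) ** xi - 1.0)
    print(f"component {j}: shape={xi:.3f}, scale={beta:.3f}, {q:.0%} loss quantile={var_q:.3f}")
```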
|