Pricing Inflation Derivatives: A survey of short rate- and market models (Tewolde Berhan, Damr, January 2012)
This thesis presents an overview of strategies for pricing inflation derivatives. The paper is structured as follows. First, basic definitions and concepts such as nominal, real and inflation rates are introduced. We introduce the benchmark contracts of the inflation derivatives market and, using standard results from no-arbitrage pricing theory, derive pricing formulas for linear contracts on inflation. In addition, the risk profile of inflation contracts is illustrated, and we highlight how it is captured in the models studied in the paper. We then move on to the main objective of the thesis and present three approaches for pricing inflation derivatives, focusing in particular on two popular models. The first is a so-called HJM approach that models the nominal and real forward curves and relates the two by an analogy with domestic and foreign exchange rates. By the choice of volatility functions in the HJM framework, we produce nominal and real term structures similar to those of the popular Hull-White interest-rate model. This approach was first suggested by Jarrow and Yildirim [1], and its main attraction is that it yields analytic pricing formulas for both linear and non-linear benchmark inflation derivatives. The second approach is a so-called market model, independently proposed by Mercurio [2] and Belgrade, Benhamou, and Koehler [4]. Just as in the well-known Libor Market Model, the modeled quantities are observable market entities, namely the respective forward inflation indices. It is shown how this model, too, can produce analytic formulas for both linear and non-linear benchmark inflation derivatives by the use of certain approximations. The advantages and shortcomings of the respective models are evaluated. In particular, we focus on how well the models calibrate to market data. To this end, model parameters are calibrated to market prices of year-on-year inflation floors, and we evaluate how well market prices can be recovered by theoretical pricing with the calibrated parameters. The thesis concludes with suggestions for possible extensions and improvements.
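As a pointer to the kind of no-arbitrage result used for the linear contracts, the value of a zero-coupon inflation-indexed swap (ZCIIS) is model-independent given the nominal and real discount curves. A brief sketch in standard notation, with P_n(0,T) and P_r(0,T) denoting nominal and real discount factors and N the notional (notation assumed here, not taken from the thesis itself):

```latex
% Inflation leg pays I(T)/I(0) - 1 at maturity T, fixed leg pays (1+K)^T - 1.
\mathrm{ZCIIS}(0,T,K) \;=\; N\big[P_r(0,T) - P_n(0,T)\big]
                     \;-\; N\,P_n(0,T)\big[(1+K)^T - 1\big],
\qquad
\big(1+K_{\mathrm{fair}}\big)^T \;=\; \frac{P_r(0,T)}{P_n(0,T)}.
```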
The Market Graph: A study of its characteristics, structure & dynamics (Budai, Daniel; Jallo, David, January 2011)
In this thesis we consider three different market graphs: one based solely on stock returns, one based on stock returns with vertices weighted by a liquidity measure, and one based on correlations of volume fluctuations. The study is conducted on two different markets: the Swedish and the American stock market. We introduce graph theory as a method for representing the stock market, in order to show that one can more fully understand its structural properties and dynamics by studying the market graph. We found many signs of increased globalization by studying the clustering coefficient and the correlation distribution. The structure of the market graph is such that it pinpoints specific sectors as the correlation threshold is increased, and different sectors emerge in the two markets. For low correlation thresholds we found groups of independent stocks that can be used as diversified portfolios. Furthermore, the dynamics revealed that the daily absolute change in edge density can be used as an indicator of when the market is about to make a downturn; this could be an interesting topic for further studies. We had hoped for additional results from the volume correlations, but that did not turn out to be the case. Regardless, we think volume-based market graphs deserve further study.
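As a sketch of the construction described above, a market graph joins every pair of stocks whose return correlation exceeds a threshold. The outline below assumes numpy and networkx; the function name and threshold are illustrative, not the thesis's implementation:

```python
# Market graph: vertices are stocks, an edge joins two stocks whose
# return correlation is at least the threshold theta.
import numpy as np
import networkx as nx

def market_graph(returns: np.ndarray, tickers: list[str], theta: float) -> nx.Graph:
    """returns: (n_days, n_stocks) matrix of daily log returns."""
    corr = np.corrcoef(returns, rowvar=False)
    g = nx.Graph()
    g.add_nodes_from(tickers)
    n = len(tickers)
    for i in range(n):
        for j in range(i + 1, n):
            if corr[i, j] >= theta:
                g.add_edge(tickers[i], tickers[j], weight=corr[i, j])
    return g

# Diagnostics of the kind the study tracks: edge density 2|E|/(n(n-1))
# via nx.density(g), and the clustering coefficient via nx.average_clustering(g).
# At low theta, independent sets of this graph play the role of the
# diversified portfolios mentioned above.
```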
Reject Inference in Online Purchases (Mumm, Lennart, January 2012)
As accurately as possible, creditors wish to determine whether a potential debtor will repay the borrowed sum. To achieve this, mathematical models known as credit scorecards, which quantify the risk of default, are used. This study investigates whether the scorecard can be improved by using reject inference, thereby including the characteristics of the rejected population when refining the scorecard. The reject inference method used is parcelling. Logistic regression is used to estimate the probability of default based on applicant characteristics. Two models, one with and one without reject inference, are compared using the Gini coefficient and estimated profitability. The results show that the model with reject inference has both a slightly higher Gini coefficient and higher estimated profitability. Thus, this study suggests that reject inference does improve the predictive power of the scorecard, but additional testing on a larger calibration set is needed to verify the results.
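A minimal sketch of parcelling under its usual textbook description: score the rejects with the accepts-only model, then assign good/bad labels bin-wise at the bin's expected bad rate and refit on the combined sample. scikit-learn and all names below are assumptions for illustration, not the thesis's implementation:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def parcelling(X_acc, y_acc, X_rej, n_bins=10, scale=1.0):
    """Fit on accepts, score rejects, and simulate reject labels per
    score bin at the bin's expected bad rate times a prudence factor."""
    base = LogisticRegression(max_iter=1000).fit(X_acc, y_acc)
    p_acc = base.predict_proba(X_acc)[:, 1]
    p_rej = base.predict_proba(X_rej)[:, 1]
    edges = np.quantile(p_acc, np.linspace(0, 1, n_bins + 1))
    edges[0], edges[-1] = 0.0, 1.0          # cover the whole score range
    y_rej = np.zeros(len(X_rej))
    for k in range(n_bins):
        in_bin = (p_rej >= edges[k]) & (p_rej <= edges[k + 1])
        if in_bin.any():
            bad_rate = min(1.0, scale * p_rej[in_bin].mean())
            y_rej[in_bin] = rng.random(in_bin.sum()) < bad_rate
    # Refit the scorecard on accepts plus label-inferred rejects.
    X_all = np.vstack([X_acc, X_rej])
    y_all = np.concatenate([y_acc, y_rej])
    return LogisticRegression(max_iter=1000).fit(X_all, y_all)
```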
A Model Implementation of Incremental Risk Charge (Forsman, Mikael, January 2012)
In 2009 the Basel Committee on Banking Supervision released the final guidelines for computing capital for the Incremental Risk Charge, a complement to traditional Value at Risk intended to measure migration risk and default risk in the trading book. Before Basel III, banks will have to develop their own Incremental Risk Charge models following these guidelines. This thesis describes the development of such a model for computing the capital charge for a portfolio of corporate bonds. Essential input parameters such as the credit ratings of the underlying issuers, credit spreads, recovery rates at default, liquidity horizons and correlations among the positions in the portfolio are discussed. The model also requires a transition matrix with the probabilities of migrating between credit states, estimated from historical data from the rating agency Moody's. Several sensitivity analyses and stress tests are then performed by generating different scenarios and running them through the model, and the results are compared to a base case. As it turns out, default risk accounts for the largest part of the Incremental Risk Charge.
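To make the mechanics concrete, here is a stylized sketch of the kind of one-year migration-and-default simulation such a model builds on: a one-factor Gaussian copula drives joint rating moves, and the charge is read off as a high quantile of the simulated loss distribution. The transition matrix, LGD and exposures are placeholders, not Moody's data or the thesis's calibration, and liquidity horizons and migration P&L are omitted for brevity:

```python
import numpy as np
from scipy.stats import norm

ratings = ["A", "B", "C", "D"]            # D = default (absorbing)
T = np.array([[0.90, 0.08, 0.015, 0.005], # hypothetical 1y transition matrix
              [0.05, 0.85, 0.08,  0.02 ],
              [0.01, 0.06, 0.83,  0.10 ],
              [0.00, 0.00, 0.00,  1.00 ]])
thresholds = norm.ppf(np.cumsum(T, axis=1).clip(0, 1))  # migration bands per rating

def simulate_losses(start, exposures, rho=0.3, n_sims=100_000, seed=1):
    rng = np.random.default_rng(seed)
    n = len(start)
    Z = rng.standard_normal(n_sims)                    # systematic factor
    eps = rng.standard_normal((n_sims, n))             # idiosyncratic shocks
    X = np.sqrt(rho) * Z[:, None] + np.sqrt(1 - rho) * eps
    losses = np.zeros(n_sims)
    for i, r in enumerate(start):
        new = np.searchsorted(thresholds[r], X[:, i])  # new rating index
        losses += np.where(new == 3, exposures[i] * 0.6, 0.0)  # LGD 60%, default only
    return losses

loss = simulate_losses(start=[0, 1, 2], exposures=[100.0, 100.0, 100.0])
irc = np.quantile(loss, 0.999)  # the charge is the 99.9% quantile of losses
```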
CPPI Structures on Funds Derivatives (Gallais, Arnaud, January 2011)
With the ever-increasing complexity of financial markets and financial products, many investors now choose to benefit from a manager's expertise by investing in a fund. This has fueled rapid growth of the fund industry over the past decades, and the recent emergence of complex derivative products written on underlying funds. The diversity (hedge funds, mutual funds, funds of funds, managed accounts…) and the particularities (liquidity, specific risks) of funds call for adapted models and suitable risk management. This thesis aims at understanding the issues and difficulties met when dealing with such products. In particular, we deal to a great extent with CPPI (Constant Proportion Portfolio Insurance) structures written on funds, which combine the specificities of funds with the particularities of such structures. Correctly assessing the corresponding market risks is a challenging issue and the subject of many investigations.
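For reference, the core CPPI allocation rule is simple: keep exposure to the risky fund equal to a multiplier times the cushion (portfolio value minus the guaranteed floor). A minimal sketch assuming numpy; the multiplier, floor and rate are illustrative, and the gap risk that makes fund-based CPPI hard (infrequent NAVs, liquidity) is precisely what this sketch ignores:

```python
import numpy as np

def cppi(nav: np.ndarray, v0: float = 100.0, floor0: float = 85.0,
         m: float = 4.0, r: float = 0.0) -> np.ndarray:
    """Portfolio value path when risky exposure is kept at m * cushion."""
    v = np.empty(len(nav))
    v[0] = v0
    floor = floor0
    for t in range(1, len(nav)):
        floor *= 1.0 + r                        # floor accrues at the safe rate
        cushion = max(v[t - 1] - floor, 0.0)
        exposure = min(m * cushion, v[t - 1])   # cap at 100%: no leverage here
        risky_ret = nav[t] / nav[t - 1] - 1.0
        v[t] = v[t - 1] + exposure * risky_ret + (v[t - 1] - exposure) * r
    return v

# Example on a simulated fund NAV path:
nav_path = 100.0 * np.cumprod(1.0 + 0.01 * np.random.default_rng(0).standard_normal(250))
values = cppi(nav_path)
```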
Implementation of CoVaR, A Measure for Systemic Risk (Bjarnadottir, Frida, January 2012)
In recent years we have witnessed how distress can spread quickly through the financial system and threaten financial stability. Hence there has been increased focus on developing systemic risk indicators that can be used by central banks and others as monitoring tools. For Sveriges Riksbank it is of great value to be able to quantify the risks that can threaten the Swedish financial system, and CoVaR is a systemic risk measure implemented here with that purpose. CoVaR, which stands for conditional Value at Risk, measures a financial institution's contribution to systemic risk and to the risk of other financial institutions. The conclusion is that CoVaR, together with other systemic risk indicators, can help provide a better understanding of the risks threatening the stability of the Swedish financial system.
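For the reader's orientation, the standard definition (due to Adrian and Brunnermeier, in notation assumed here rather than taken from the thesis) conditions the system's q-quantile on an institution being at its own Value at Risk level:

```latex
\Pr\!\left(X^{\mathrm{system}} \le \mathrm{CoVaR}^{\mathrm{system}|i}_q
  \,\middle|\, X^{i} = \mathrm{VaR}^{i}_q\right) = q,
\qquad
\Delta\mathrm{CoVaR}^{\mathrm{system}|i}_q
  = \mathrm{CoVaR}^{\mathrm{system}|i}_q
  - \mathrm{CoVaR}^{\mathrm{system}|i,\,\mathrm{median}}_q ,
```

where the last term instead conditions on institution i being at its median state; the difference, ΔCoVaR, is the institution's marginal contribution to systemic risk.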
Higher Criticism Testing for Signal Detection in Rare and Weak Models (Blomberg, Niclas, January 2012)
In many modern applications we need models for selecting a small subset of useful features from high-dimensional data, where the useful features are both rare and weak; this is crucial for e.g. supervised classification of sparse high-dimensional data. A preceding step is to detect the presence of useful features at all: signal detection. This problem amounts to testing a very large number of hypotheses, where the proportion of false null hypotheses is assumed to be very small. However, reliable signal detection is only possible in certain regions of the two-dimensional sparsity-strength parameter space, the phase space. In this report we focus on two families of distributions, N and χ². In the former case, features are assumed independent and normally distributed. In the latter, in search of a more sophisticated model, we assume that features depend in blocks, whose empirical separation strength asymptotically follows the non-central χ²_ν distribution. Our search for informative features uses Tukey's higher criticism (HC), a second-level significance testing procedure that compares the fraction of observed significances to the expected fraction under the global null. Throughout the phase space we investigate the estimated error rate, Err = (#falsely rejected H0 + #falsely rejected H1) / #simulations, where H0 denotes absence and H1 presence of informative signals, in both the N case and the χ²_ν case, for ν = 2, 10, 30. In particular, using a feature vector of approximately the same size as in genomic applications, we find that the analytically derived detection boundary is too optimistic, in the sense that signal detection still fails close to it; one needs to move far from the boundary into the success region to ensure reliable detection. We demonstrate that Err grows fast and irregularly as we approach the detection boundary from the success region. In the χ²_ν case, ν > 2, no analytical detection boundary has been derived, but we show that the empirical success region there is smaller than in the N case, especially as ν increases.
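A compact sketch of the HC statistic itself, assuming numpy; the truncation fraction alpha0 = 0.5 follows common practice and is an assumption here:

```python
import numpy as np

def higher_criticism(pvals: np.ndarray, alpha0: float = 0.5) -> float:
    """HC = max standardized excess of observed significances over the
    uniform expectation i/n, taken over the smallest alpha0*n p-values."""
    n = len(pvals)
    p = np.clip(np.sort(pvals), 1e-12, 1 - 1e-12)  # guard the denominator
    i = np.arange(1, n + 1)
    hc = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1 - p))
    k = max(1, int(alpha0 * n))
    return float(hc[:k].max())

# Under the global null H0 the statistic stays near sqrt(2 log log n);
# a large value indicates the presence of rare/weak informative features (H1).
```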
Classification of Probability of Default and Rating Philosophies (Gobeljic, Persa, January 2012)
Basel II consists of international recommendations on banking regulation, mainly concerning how much capital banks and other financial institutions should be required to set aside in order to protect themselves from various types of risk. Implementing Basel II involves estimating risks; one of the main measures is the Probability of Default. Both firm-specific and macroeconomic risks cause obligors to default, and separating the two risk factors makes it possible to determine which of them drives the Probability of Default over the years. The aim of this thesis is to enable a separation of the risk variables in the structure of the Probability of Default in order to classify the rating philosophy.
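One common way to formalize this separation (an illustrative convention, not necessarily the thesis's exact model) is a one-factor decomposition of the obligor's latent creditworthiness:

```latex
% Z_t: systematic (macroeconomic) factor; eps_{i,t}: firm-specific shock.
Y_{i,t} = \sqrt{\rho}\,Z_t + \sqrt{1-\rho}\,\varepsilon_{i,t},
\qquad
\mathrm{PD}_{i,t}(z_t) = \Phi\!\left(\frac{c_i - \sqrt{\rho}\,z_t}{\sqrt{1-\rho}}\right).
```

A point-in-time (PIT) rating conditions the PD on the current realization z_t, while a through-the-cycle (TTC) rating averages over it; this is exactly the rating-philosophy distinction at issue.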
Money Management Principles for Mechanical Traders (Datye, Shlok, January 2012)
In his five books published during 1990-2009, starting with Portfolio Management Formulas, Ralph Vince made accessible to mechanical traders with limited backgrounds in mathematics various important concepts in the field of money management. In the process, he coined and popularized the terms "optimal f" and "leverage space trading model". This thesis provides a sound mathematical understanding of these concepts and adds various extensions and insights of its own. It also provides practical examples of how mechanical traders can use these concepts to their advantage. Although the material benefits all mechanical traders, the examples involve trading futures contracts, and practical details such as the back-adjustment of futures prices are provided along the way.
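For orientation, Vince's optimal f is the fixed fraction of the largest historical loss that maximizes the terminal wealth relative (TWR) over a set of trade outcomes. A minimal grid-search sketch assuming numpy; the trade P&L figures are invented for illustration:

```python
import numpy as np

def optimal_f(trades: np.ndarray, grid: int = 10_000) -> tuple[float, float]:
    """trades: P&L per contract/unit; must contain at least one loss."""
    biggest_loss = trades.min()
    assert biggest_loss < 0, "optimal f is undefined without a losing trade"
    fs = np.linspace(1e-4, 1.0, grid)
    # Holding-period returns: HPR_i(f) = 1 + f * (-trade_i / biggest_loss)
    hpr = 1.0 + fs[:, None] * (trades[None, :] / -biggest_loss)
    twr = np.where((hpr > 0).all(axis=1), np.prod(hpr, axis=1), -np.inf)
    k = int(np.argmax(twr))
    return fs[k], twr[k]

f_star, twr_star = optimal_f(np.array([500.0, -200.0, 300.0, -400.0, 150.0]))
```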
Analysis of Swedish pollutants (Berglund, David, January 2012)
Today's environmental reports contain gaps and flaws in the reported data. This master's thesis aims to facilitate the estimation of that missing data, which originates from Swedish industrial facilities. The thesis involves data treatment by statistical analysis, fitting a model by means of analysis of variance and multilevel modeling. It also involves gathering and working with data from databases, as well as systematic treatment, sorting, categorization and evaluation of the data material. Calculations are made in the SAS statistical analysis program, yielding estimates of fixed, linear and random effects. The results are presented through graphs and numerical estimates in the later part of the report. Estimates of the grand pollutant totals are computed and compared to the observed data for relevance. Alternative ways of approaching the problem are discussed, as well as problems that appeared during the work on the thesis. The relevant code and calculations are attached at the end.
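As an indicative sketch of the multilevel fit described above, translated from SAS to Python's statsmodels purely for illustration; the file and column names are hypothetical, not the thesis's data:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical columns: facility, sector, year, log_emission
df = pd.read_csv("emissions.csv")

# Random intercept per facility, fixed effects for sector and a linear year trend.
model = smf.mixedlm("log_emission ~ C(sector) + year", data=df, groups=df["facility"])
result = model.fit()
print(result.summary())

# Gaps in the reporting can then be filled from the fitted fixed + random effects:
df["predicted"] = result.fittedvalues
```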