About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
161

The Use of Opioid Substances after undergoing Hematopoietic Stem Cell Transplant

Olcina, Elias, Arvid, Larsson January 2020 (has links)
In this paper, we study the use of opioid substances after undergoing a hematopoietic stem cell transplant in Sweden. The purpose of the study is divided into two parts: the first objective is to describe the use of opioids within the population using descriptive statistics, and the second is to model the effect of opioid substances on survival using Cox proportional hazards regression. From the descriptive part, we can see that women tend to take opioids to a slightly greater extent than men, and that there are great differences among age groups, where younger patients tend to withdraw more opioids. We create three different models measuring the effect of opioid use within the first 43 days after transplant on survival, correcting for four potential confounders: sex, age at transplant, source of transplant and relationship to the donor. The first two models are fitted with different measurements of survival time, and the third model is a stratified Cox regression based on model 2. The three models differ somewhat from each other in terms of estimated hazard ratios; however, we cannot show a statistically significant effect of opioid use within the first 43 days on survival in any of the models.
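As a concrete illustration of the modelling step, a minimal sketch of a Cox proportional hazards fit using the lifelines library; the synthetic data and column names are hypothetical stand-ins for the Swedish registry data, which is not public:

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    # Synthetic stand-in for the registry: survival time in days, an event
    # indicator, early opioid use, and the four confounders from the abstract.
    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "time": rng.exponential(1000, n),
        "event": rng.integers(0, 2, n),
        "opioid_43d": rng.integers(0, 2, n),      # opioid use in the first 43 days
        "sex": rng.integers(0, 2, n),
        "age_at_transplant": rng.integers(18, 75, n),
        "graft_source": rng.integers(0, 2, n),
        "related_donor": rng.integers(0, 2, n),
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="event")
    print(cph.summary[["coef", "exp(coef)", "p"]])  # exp(coef) is the hazard ratio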
162

Simulation-driven reinforcement learning : Improving synthetic enemies in flight simulators

Lindberg, Jesper January 2020 (has links)
This project focuses on how to implement an artificial intelligence (AI) agent in a Tactical Simulator (Tacsi). Tacsi is a simulator used by Saab AB, among other things for pilot training. In this work, Tacsi is used to simulate air-to-air combat. The agent uses reinforcement learning (RL) to explore and learn how the simulator behaves. This knowledge is then exploited when the agent tries to beat a computer-controlled synthetic enemy. The results of this study may be used to produce better synthetic enemies for pilot training. The RL algorithm used in this work is deep Q-learning, a well-known algorithm in the field. The results show that it is possible to implement an RL agent in Tacsi that can learn from the environment and, in some scenarios, defeat the enemy. The results produced by the algorithm verify that an RL agent works within Tacsi at Saab AB. Although the performance of the agent in this work is not impressive, there is great opportunity for further development of the agent as well as of the working environment.
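Tacsi itself is proprietary, but the deep Q-learning update the abstract refers to is generic. A minimal PyTorch sketch with hypothetical state and action dimensions standing in for the simulator interface:

    import torch
    import torch.nn as nn

    state_dim, n_actions = 8, 4          # hypothetical observation/action sizes
    q_net = nn.Sequential(nn.Linear(state_dim, 64), nn.ReLU(), nn.Linear(64, n_actions))
    target_net = nn.Sequential(nn.Linear(state_dim, 64), nn.ReLU(), nn.Linear(64, n_actions))
    target_net.load_state_dict(q_net.state_dict())
    optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)
    gamma = 0.99                         # discount factor

    def dqn_update(s, a, r, s_next, done):
        # Q-learning target: r + gamma * max_a' Q_target(s', a'), zeroed after terminal states
        with torch.no_grad():
            target = r + gamma * target_net(s_next).max(dim=1).values * (1 - done)
        q = q_net(s).gather(1, a.unsqueeze(1)).squeeze(1)
        loss = nn.functional.mse_loss(q, target)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    # One update on a dummy batch of transitions, as sampled from a replay buffer
    batch = 32
    dqn_update(torch.randn(batch, state_dim),
               torch.randint(0, n_actions, (batch,)),
               torch.randn(batch),
               torch.randn(batch, state_dim),
               torch.zeros(batch))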
163

On the Snell envelope approach to optimal switching and pricing Bermudan options

Hamdi, Ali January 2011 (has links)
This thesis consists of two papers related to systems of Snell envelopes. The first paper uses a system of Snell envelopes to formulate the problem of two-modes optimal switching for the full balance sheet in finite horizon. This means that the switching problem is formulated in terms of trade-off strategies between expected profit and cost yields, which act as obstacles to each other. Existence of a minimal solution of this system is obtained by using an approximation scheme. Furthermore, the optimal switching strategies are fully characterized. The second paper uses the Snell envelope to formulate the fair price of Bermudan options. To evaluate this formulation of the price, the optimal stopping strategy for such a contract must be estimated. This may be done recursively if some method of estimating conditional expectations is available. The paper focuses on nonparametric estimation of such expectations, using regularization of a least-squares minimization with a Tikhonov-type smoothing term based on the partial differential equation which characterizes the underlying price processes. This approach can hence be viewed as a combination of the Monte Carlo method and the PDE method for the estimation of conditional expectations. The estimation method turns out to be robust with regard to the size of the smoothing parameter.
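The recursive estimation of conditional expectations that the second paper regularizes can be illustrated with a plain (unregularized) least-squares Monte Carlo sketch for a Bermudan put under geometric Brownian motion; the thesis replaces the polynomial regression below with a Tikhonov-smoothed fit:

    import numpy as np

    rng = np.random.default_rng(0)
    S0, K, r, sigma, T, steps, paths = 100.0, 100.0, 0.03, 0.2, 1.0, 50, 20000
    dt = T / steps
    Z = rng.standard_normal((paths, steps))
    S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * Z, axis=1))

    cash = np.maximum(K - S[:, -1], 0.0)           # payoff if held to maturity
    for t in range(steps - 2, -1, -1):             # backward recursion on the Snell envelope
        itm = K - S[:, t] > 0                      # regress only on in-the-money paths
        X = S[itm, t]
        Y = cash[itm] * np.exp(-r * dt)            # discounted continuation values
        continuation = np.polyval(np.polyfit(X, Y, deg=2), X)
        exercise = K - X
        cash[itm] = np.where(exercise > continuation, exercise, Y)
        cash[~itm] *= np.exp(-r * dt)
    price = np.mean(cash) * np.exp(-r * dt)        # discount from first exercise date
    print(f"Bermudan put estimate: {price:.3f}")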
164

Pricing Inflation Derivatives : A survey of short-rate and market models

Tewolde Berhan, Damr January 2012 (has links)
This thesis presents an overview of strategies for pricing inflation derivatives. The paper is structured as follows. First, basic definitions and concepts such as nominal, real and inflation rates are introduced. We introduce the benchmark contracts of the inflation derivatives market and, using standard results from no-arbitrage pricing theory, derive pricing formulas for linear contracts on inflation. In addition, the risk profile of inflation contracts is illustrated and we highlight how it is captured in the models studied in the paper. We then move on to the main objective of the thesis and present three approaches for pricing inflation derivatives, focusing in particular on two popular models. The first is a so-called HJM approach that models the nominal and real forward curves and relates the two by making an analogy to domestic and foreign FX rates. By the choice of volatility functions in the HJM framework, we produce nominal and real term structures similar to the popular interest-rate derivatives model of Hull-White. This approach was first suggested by Jarrow and Yildirim [1], and its main attraction is that it results in analytic pricing formulas for both linear and non-linear benchmark inflation derivatives. The second approach is a so-called market model, independently proposed by Mercurio [2] and Belgrade, Benhamou, and Koehler [4]. Just like the famous Libor Market Model, the modelled quantities are observable market entities, namely the respective forward inflation indices. It is shown how this model as well, by the use of certain approximations, can produce analytic formulas for both linear and non-linear benchmark inflation derivatives. The advantages and shortcomings of the respective models are evaluated. In particular, we focus on how well the models calibrate to market data. To this end, model parameters are calibrated to market prices of year-on-year inflation floors, and it is evaluated how well market prices can be recovered by theoretical pricing with the calibrated model parameters. The thesis is concluded with suggestions for possible extensions and improvements.
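As a worked instance of the linear-contract pricing derived from no-arbitrage arguments, the fair fixed rate K of a zero-coupon inflation-indexed swap is model independent: by the foreign-currency analogy (real bonds as foreign bonds, with the CPI I(t) as the exchange rate), the floating leg paying I(T)/I(0) - 1 at T is worth P_r(0,T) - P_n(0,T) today, so equating it to the fixed leg gives

\[
P_n(0,T)\big[(1+K)^{T}-1\big] \;=\; P_r(0,T) - P_n(0,T)
\quad\Longleftrightarrow\quad
(1+K)^{T} \;=\; \frac{P_r(0,T)}{P_n(0,T)},
\]

where P_n(0,T) and P_r(0,T) denote nominal and real zero-coupon bond prices.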
165

The Market Graph : A study of its characteristics, structure & dynamics / Marknadsgrafen : En studie av dess karakteristika, struktur & dynamik

Budai, Daniel, Jallo, David January 2011 (has links)
In this thesis we consider three different market graphs: one based solely on stock returns, one based on stock returns with vertices weighted by a liquidity measure, and one based on correlations of volume fluctuations. The research is conducted on two markets: the Swedish and the American stock market. We introduce graph theory as a method for representing the stock market, in order to show that one can more fully understand the structural properties and dynamics of the stock market by studying the market graph. We found many signs of increased globalization by studying the clustering coefficient and the correlation distribution. The structure of the market graph is such that it pinpoints specific sectors when the correlation threshold is increased, and different sectors emerge in the two markets. For low correlation thresholds we found groups of independent stocks that can be used as diversified portfolios. Furthermore, the dynamics revealed that it is possible to use the daily absolute change in edge density as an indicator for when the market is about to make a downturn; this could be an interesting topic for further study. We had hoped to obtain additional results by considering volume correlations, but that did not turn out to be the case. Regardless, we think it would be interesting to study volume-based market graphs further.
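A minimal sketch of the market-graph construction on synthetic returns (real price histories would replace the random data), using networkx for the graph statistics studied above:

    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(1)
    returns = rng.standard_normal((250, 30))        # 250 days, 30 hypothetical stocks
    corr = np.corrcoef(returns.T)                   # pairwise return correlations

    threshold = 0.5                                 # correlation threshold
    G = nx.Graph()
    G.add_nodes_from(range(corr.shape[0]))
    for i in range(corr.shape[0]):
        for j in range(i + 1, corr.shape[0]):
            if corr[i, j] >= threshold:
                G.add_edge(i, j, weight=corr[i, j])

    # Edge density and clustering coefficient, two statistics from the study
    print(f"edges={G.number_of_edges()}, "
          f"density={nx.density(G):.3f}, "
          f"clustering={nx.average_clustering(G):.3f}")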
166

Reject Inference in Online Purchases

Mumm, Lennart January 2012 (has links)
As accurately as possible, creditors wish to determine whether a potential debtor will repay the borrowed sum. To achieve this, mathematical models known as credit scorecards, which quantify the risk of default, are used. This study investigates whether the scorecard can be improved by using reject inference, thereby including the characteristics of the rejected population when refining the scorecard. The reject inference method used is parcelling. Logistic regression is used to estimate the probability of default based on applicant characteristics. Two models, one with and one without reject inference, are compared using the Gini coefficient and estimated profitability. The results show that the model with reject inference has both a slightly higher Gini coefficient and an increase in profitability. Thus, this study suggests that reject inference does improve the predictive power of the scorecard, but in order to verify the results, additional testing on a larger calibration set is needed.
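A minimal sketch of the scorecard step on synthetic data: logistic regression for the probability of default, with the Gini coefficient computed as 2·AUC − 1. The parcelling step, which assigns inferred good/bad labels to the rejected population before refitting, is omitted here:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    # Synthetic stand-in for accepted applicants: characteristics X, default flag y
    rng = np.random.default_rng(2)
    X = rng.standard_normal((1000, 5))
    y = (X @ np.array([0.8, -0.5, 0.3, 0.0, 0.2]) + rng.standard_normal(1000) > 0.5).astype(int)

    model = LogisticRegression().fit(X, y)
    pd_scores = model.predict_proba(X)[:, 1]        # estimated probability of default
    gini = 2 * roc_auc_score(y, pd_scores) - 1      # Gini coefficient = 2*AUC - 1
    print(f"Gini: {gini:.3f}")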
167

A Model Implementation of Incremental Risk Charge

Forsman, Mikael January 2012 (has links)
In 2009 the Basel Committee on Banking Supervision released the final guidelines for computing capital for the Incremental Risk Charge, a complement to the traditional Value at Risk intended to measure the migration risk and the default risk in the trading book. Before Basel III takes effect, banks will have to develop their own Incremental Risk Charge model following these guidelines. This thesis describes the development of such a model for computing the capital charge for a portfolio of corporate bonds. Essential input parameters such as the credit ratings of the underlying issuers, credit spreads, recovery rates at default, liquidity horizons and correlations among the positions in the portfolio are discussed. Also required in the model is the transition matrix with probabilities of migrating between different credit states, which is estimated from historical data from Moody's rating institute. Several sensitivity analyses and stress tests are then made by generating different scenarios, running them in the model and comparing the results to a base case. As it turns out, the default risk accounts for most of the Incremental Risk Charge.
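The migration step at the heart of such a model can be sketched with a toy four-state transition matrix (the numbers below are illustrative, not Moody's estimates):

    import numpy as np

    rng = np.random.default_rng(3)
    ratings = ["A", "B", "C", "D"]                  # D = default; toy 4-state system
    P = np.array([
        [0.90, 0.08, 0.015, 0.005],
        [0.05, 0.85, 0.08, 0.02],
        [0.01, 0.09, 0.80, 0.10],
        [0.00, 0.00, 0.00, 1.00],                   # default is absorbing
    ])

    def simulate_migrations(start_state, n_sims):
        # Sample end-of-horizon ratings from the row of cumulative probabilities
        cum = np.cumsum(P[start_state])
        return np.searchsorted(cum, rng.random(n_sims))

    end_states = simulate_migrations(start_state=1, n_sims=100_000)
    print(f"P(default | start B) ~ {np.mean(end_states == 3):.4f}")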
168

CPPI Structures on Funds Derivatives

Gallais, Arnaud January 2011 (has links)
With the ever-increasing complexity of financial markets and financial products, many investors now choose to benefit from a manager's expertise by investing in a fund. This has fueled a rapid growth of the fund industry over the past decades, and the recent emergence of complex derivatives products written on underlying funds. The diversity (hedge funds, mutual funds, funds of funds, managed accounts…) and the particularities (liquidity, specific risks) of funds call for adapted models and suitable risk management. This thesis aims at understanding the issues and difficulties met when dealing with such products. In particular, we deal to a great extent with CPPI (Constant Proportion Portfolio Insurance) structures written on funds, which combine the specificities of funds with the particularities of such structures. Correctly assessing the corresponding market risks is a challenging issue, and is the subject of many investigations.
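The constant-proportion rule itself is simple; a minimal sketch on one synthetic risky fund, with a static floor and none of the liquidity or gap-risk features discussed above (all parameters hypothetical):

    import numpy as np

    rng = np.random.default_rng(4)
    V, floor, multiplier, r_f = 100.0, 85.0, 4.0, 0.0002   # daily risk-free rate
    for fund_return in rng.normal(0.0003, 0.01, 250):      # one year of daily fund returns
        cushion = max(V - floor, 0.0)                      # distance to the protected floor
        exposure = min(multiplier * cushion, V)            # allocation to the risky fund
        V = exposure * (1 + fund_return) + (V - exposure) * (1 + r_f)
    print(f"terminal value: {V:.2f} (floor {floor})")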
169

Implementation of CoVaR, A Measure for Systemic Risk

Bjarnadottir, Frida January 2012 (has links)
In recent years we have witnessed how distress can spread quickly through the financial system and threaten financial stability. Hence there has been increased focus on developing systemic risk indicators that can be used by central banks and others as a monitoring tool. For Sveriges Riksbank it is of great value to be able to quantify the risks that can threaten the Swedish financial system, and CoVaR is a systemic risk measure implemented here with that purpose. CoVaR, which stands for conditional Value at Risk, measures a financial institution's contribution to systemic risk and its contribution to the risk of other financial institutions. The conclusion is that CoVaR can, together with other systemic risk indicators, help provide a better understanding of the risks threatening the stability of the Swedish financial system.
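For reference, in the Adrian-Brunnermeier formulation, CoVaR at level q for the system conditional on institution i being at its VaR is defined by

\[
\Pr\!\left( X^{\mathrm{system}} \le \mathrm{CoVaR}^{\mathrm{system}\mid i}_{q} \;\middle|\; X^{i} = \mathrm{VaR}^{i}_{q} \right) = q,
\qquad
\Delta\mathrm{CoVaR}^{i}_{q} = \mathrm{CoVaR}^{\mathrm{system}\mid i}_{q} - \mathrm{CoVaR}^{\mathrm{system}\mid i}_{50\%},
\]

where X denotes returns (or losses) and ΔCoVaR measures institution i's marginal contribution to systemic risk.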
170

Higher Criticism Testing for Signal Detection in Rare And Weak Models

Blomberg, Niclas January 2012 (has links)
In several application fields today - genomics is one example - we need models for selecting a small subset of useful features from high-dimensional data, where the useful features are both rare and weak; this is crucial for e.g. supervised classification of sparse high-dimensional data. A preceding step is to detect the presence of useful features: signal detection. This problem is related to testing a very large number of hypotheses, where the proportion of false null hypotheses is assumed to be very small. However, reliable signal detection will only be possible in certain areas of the two-dimensional sparsity-strength parameter space, the phase space. In this report, we focus on two families of distributions, N and χ². In the former case, features are supposed to be independent and normally distributed. In the latter, in search of a more sophisticated model, we suppose that features depend in blocks, whose empirical separation strength asymptotically follows the non-central χ²ν-distribution. Our search for informative features uses Tukey's higher criticism (HC), a second-level significance testing procedure that compares the fraction of observed significances to the expected fraction under the global null. Throughout the phase space we investigate the estimated error rate, Err = (#falsely rejected H0 + #falsely rejected H1) / #simulations, where H0: absence of informative signals and H1: presence of informative signals, in both the N-case and the χ²ν-case, for ν = 2, 10, 30. In particular, we find, using a feature vector of approximately the same size as in genomic applications, that the analytically derived detection boundary is too optimistic, in the sense that close to it signal detection still fails, and we need to move far from the boundary into the success region to ensure reliable detection. We demonstrate that Err grows fast and irregularly as we approach the detection boundary from the success region. In the χ²ν-case, ν > 2, no analytical detection boundary has been derived, but we show that the empirical success region there is smaller than in the N-case, especially as ν increases.
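A minimal sketch of the higher criticism statistic in the Donoho-Jin form, evaluated on synthetic p-values drawn under the global null:

    import numpy as np

    def higher_criticism(pvals, alpha0=0.5):
        # HC = max over the smallest alpha0*n p-values of the standardized
        # deviation between the empirical and expected fraction of significances.
        p = np.sort(pvals)
        n = len(p)
        i = np.arange(1, n + 1)
        hc = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1 - p))
        k = max(int(alpha0 * n), 1)
        return hc[:k].max()

    rng = np.random.default_rng(5)
    print(f"HC under the global null: {higher_criticism(rng.random(10_000)):.2f}")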
