281

MANOVA I PRAKTIKEN : Studie med multivariat analys i fokus och praktisk tillämpning / MANOVA in practice : A study focusing on multivariate analysis and its practical application

Magnusson, Anna January 2020 (has links)
No description available.
282

Blir det sannolikt en snöfylld jul? : En statistisk prognos med hjälp av Markovkedjor / Will there likely be a snow-filled Christmas? : A statistical forecast using Markov chains

Tyni, Niklas January 2020 (has links)
Syftet med denna uppsats var att med hjälp av Markovkedjor göra en statistisk prognos för en snöfylld jul för tre svenska väderstationer. Datamängderna som ligger till grund för skattningarna av övergångssannolikheterna sorterades så att endast perioden 17 december – 31 december för varje år användes. Fundamentet för prognosen bygger på den underliggande teorin, framför allt den så kallade Markovegenskapen och Chapman-Kolmogorovs ekvation. Prognosen har två utgångspunkter: antingen är det snö den 17 december eller inte. Uppsatsen undersöker också kedjornas ergodicitet och deras stationära fördelningar, både experimentellt och med en teoretisk lösning. Sannolikheten för snö på julafton ansågs som god, förutsatt att det är snö den 17 december. Att tillämpa Markovkedjor på denna sorts problem fungerade bra, vilket beror på det beroende som finns mellan dagens väder och vädret i morgon. / The purpose of this paper was to make a statistical forecast of a snow-filled Christmas for three Swedish weather stations with the help of Markov chains. The data on which the transition probability estimates were based were sorted so that only the period December 17 – December 31 of each year was used. The forecast rests on the underlying theory, in particular the so-called Markov property and the Chapman-Kolmogorov equation. The forecast has two starting points: either there is snow on December 17 or there is not. The thesis also examines the ergodicity and the stationary distributions of the Markov chains, both experimentally and with a theoretical solution. The probability of snow on Christmas Eve was considered good, provided that there is snow on December 17. Applying Markov chains to this kind of problem worked well, owing to the dependence between today's weather and tomorrow's weather.
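A minimal sketch of the approach described above, assuming the station records have been reduced to a 0/1 daily snow indicator for December 17–31 of each year; the series below is synthetic and every parameter is an assumption, since the real weather-station data is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 0/1 snow indicator for Dec 17-31 (15 days) over 30 years;
# in the thesis this comes from the weather-station records.
snow = (rng.random((30, 15)) < 0.5).astype(int)

# Estimate the 2x2 transition matrix P[i, j] = P(tomorrow = j | today = i)
# by counting day-to-day transitions within each December period.
counts = np.zeros((2, 2))
for year in snow:
    for today, tomorrow in zip(year[:-1], year[1:]):
        counts[today, tomorrow] += 1
P = counts / counts.sum(axis=1, keepdims=True)

# Chapman-Kolmogorov: the 7-step transition matrix is P^7, so the Christmas Eve
# (Dec 24) forecast conditional on the Dec 17 state is read off its rows.
P7 = np.linalg.matrix_power(P, 7)
print("P(snow Dec 24 | snow Dec 17)    =", P7[1, 1])
print("P(snow Dec 24 | no snow Dec 17) =", P7[0, 1])

# Stationary distribution: the left eigenvector of P for eigenvalue 1, normalised.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi /= pi.sum()
print("stationary distribution (no snow, snow):", pi)
```

Raising the estimated transition matrix to the seventh power is exactly the Chapman-Kolmogorov step that carries the December 17 state forward to Christmas Eve.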
283

Overcoming a financial crisis : A study of which factors predict the impact of a rapid economic change

Artman, Arvid January 2020 (has links)
This paper investigates which factors best predict the economic state of a Swedish municipality after the 2008 crisis by constructing a linear model that regresses the change in the unemployment rate on a set of variables. The variables were taken from a dataset compiled from a government data service and were selected for the model using the Bayesian information criterion. From this procedure, a model with six independent variables was estimated. The model's statistics were examined, and the model was subsequently tested against the five multiple linear regression assumptions. It was concluded that the model did not fulfil the assumption of homoscedasticity, and because of this the dependent variable was log-transformed, yielding a log-lin model. This model fulfilled every assumption and had higher explanatory power than the previous model. It is concluded that the variables denoting the number of newly registered businesses per 1,000 residents, the share of residents with higher education, the fraction of net commuters, the number of refugees received with a residence permit per 1,000 residents, total net investments per person, the share of long-term unemployed residents and the population size all prove significant when included together in a log-lin model of the change in the unemployment rate.
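A hedged sketch of the kind of log-lin specification the abstract describes, with made-up stand-ins for the municipal covariates; the real dataset from the government service is not reproduced, and every variable name and value below is an assumption:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 290  # roughly the number of Swedish municipalities

# Hypothetical stand-ins for the covariates named in the abstract.
df = pd.DataFrame({
    "new_businesses_per_1000": rng.gamma(4.0, 1.5, n),
    "share_high_education": rng.uniform(0.15, 0.55, n),
    "net_commuter_fraction": rng.normal(0.0, 0.1, n),
    "refugees_per_1000": rng.gamma(2.0, 1.0, n),
    "net_investment_per_person": rng.gamma(3.0, 2.0, n),
    "share_long_term_unemployed": rng.uniform(0.01, 0.08, n),
    "log_population": rng.normal(9.5, 1.0, n),
})
# Synthetic positive response standing in for the change in the unemployment rate.
df["delta_unemployment"] = np.exp(
    0.5 - 2.0 * df["share_high_education"] + rng.normal(0.0, 0.3, n)
)

y = df["delta_unemployment"]
X = sm.add_constant(df.drop(columns="delta_unemployment"))

# Level-level specification versus the log-lin specification; in the thesis the
# log transform is introduced after the homoscedasticity assumption fails.
lin = sm.OLS(y, X).fit()
loglin = sm.OLS(np.log(y), X).fit()

# statsmodels reports BIC per fitted model, which is what the thesis uses to
# compare candidate variable subsets during selection.
print("level-level BIC:", round(lin.bic, 1))
print("log-lin     BIC:", round(loglin.bic, 1))
print(loglin.params)
```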
284

The Use of Opioid Substances after undergoing Hematopoietic Stem Cell Transplant

Olcina, Elias, Larsson, Arvid January 2020 (has links)
In this paper, we study the use of opioid substances after a hematopoietic stem cell transplant in Sweden. The purpose of the study is divided into two parts: the first objective is to describe the use of opioids within the population using descriptive statistics, and the second is to model the effect of opioid substances on survival using Cox proportional hazards regression. From the descriptive part, we see that women tend to take opioids to a slightly greater extent than men, and that there are great differences among age groups, where younger patients tend to withdraw more opioids. We create three different models measuring the effect of opioid use within the first 43 days after transplant on survival, adjusting for four potential confounders: sex, age at transplant, source of transplant and relationship to the donor. The first two models are fitted with different measurements of survival time, and the third model is a stratified Cox regression based on model 2. The three models differ somewhat from each other in terms of estimated hazard ratios; however, we cannot show a statistically significant effect of opioid use within the first 43 days on survival in any of the models.
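A small illustration of the Cox proportional hazards and stratified Cox fits mentioned above, using the lifelines package on synthetic data; the register data, variable coding, follow-up times and covariate effects are all placeholders, not the study's actual cohort:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 500

# Synthetic stand-in for the transplant cohort; the real register data is not public.
df = pd.DataFrame({
    "opioid_use_43d": rng.integers(0, 2, n),           # exposure of interest
    "female": rng.integers(0, 2, n),                   # confounder: sex
    "age_at_transplant": rng.normal(45, 15, n),        # confounder: age
    "peripheral_blood_source": rng.integers(0, 2, n),  # confounder: graft source
    "related_donor": rng.integers(0, 2, n),            # confounder / stratification variable
})
baseline = rng.exponential(1500, n)
df["time"] = baseline * np.exp(-0.01 * (df["age_at_transplant"] - 45))
df["event"] = rng.integers(0, 2, n)                    # 1 = death observed, 0 = censored

# Analogue of models 1 and 2: ordinary Cox proportional hazards regression.
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.summary[["coef", "exp(coef)", "p"]])

# Analogue of model 3: stratified Cox regression, letting the baseline hazard
# differ by donor relationship instead of estimating a coefficient for it.
cph_strat = CoxPHFitter()
cph_strat.fit(df, duration_col="time", event_col="event", strata=["related_donor"])
print(cph_strat.summary[["coef", "exp(coef)", "p"]])
```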
285

Simulation driven reinforcement learning : Improving synthetic enemies in flight simulators

Lindberg, Jesper January 2020 (has links)
This project focuses on how to implement an artificial intelligence (AI) agent in a tactical simulator (Tacsi). Tacsi is a simulator used by Saab AB, among other things for pilot training. In this work, Tacsi is used to simulate air-to-air combat. The agent uses reinforcement learning (RL) to explore and learn how the simulator behaves, and this knowledge is then exploited when the agent tries to beat a computer-controlled synthetic enemy. The results of this study may be used to produce better synthetic enemies for pilot training. The RL algorithm used in this work is deep Q-learning, a well-known algorithm in the field. The results show that it is possible to implement an RL agent in Tacsi that can learn from the environment and defeat the enemy in some scenarios, which verifies that an RL agent works within Tacsi at Saab AB. Although the performance of the agent in this work is not impressive, there is great opportunity for further development of both the agent and the working environment.
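The Tacsi simulator is proprietary, so the sketch below only illustrates the deep Q-learning loop the abstract refers to, against a stand-in environment with a reset/step interface; the state size, action set, reward signal, network shapes and hyperparameters are all assumptions, not Saab's setup:

```python
import random
from collections import deque

import torch
import torch.nn as nn

class DummyAirCombatEnv:
    """Stand-in for an air-to-air scenario: 8 state features, 4 discrete actions."""
    def reset(self):
        return torch.randn(8)
    def step(self, action):
        return torch.randn(8), random.uniform(-1.0, 1.0), random.random() < 0.05

q_net = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 4))
target_net = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 4))
target_net.load_state_dict(q_net.state_dict())
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)
replay = deque(maxlen=10_000)
gamma, epsilon = 0.99, 0.1

env = DummyAirCombatEnv()
state = env.reset()
for step in range(2_000):
    # Epsilon-greedy exploration of the simulator.
    if random.random() < epsilon:
        action = random.randrange(4)
    else:
        action = q_net(state).argmax().item()
    next_state, reward, done = env.step(action)
    replay.append((state, action, reward, next_state, done))
    state = env.reset() if done else next_state

    if len(replay) >= 64:
        batch = random.sample(replay, 64)
        s = torch.stack([b[0] for b in batch])
        a = torch.tensor([b[1] for b in batch])
        r = torch.tensor([b[2] for b in batch], dtype=torch.float32)
        s2 = torch.stack([b[3] for b in batch])
        d = torch.tensor([b[4] for b in batch], dtype=torch.float32)

        # Standard deep Q-learning target: r + gamma * max_a' Q_target(s', a').
        with torch.no_grad():
            target = r + gamma * (1 - d) * target_net(s2).max(dim=1).values
        q = q_net(s).gather(1, a.unsqueeze(1)).squeeze(1)
        loss = nn.functional.mse_loss(q, target)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    if step % 500 == 0:
        target_net.load_state_dict(q_net.state_dict())  # periodic target sync
```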
286

On the Snell envelope approach to optimal switching and pricing Bermudan options

Hamdi, Ali January 2011 (has links)
This thesis consists of two papers related to systems of Snell envelopes. The first paper uses a system of Snell envelopes to formulate the problem of two-modes optimal switching for the full balance sheet in finite horizon. This means that the switching problem is formulated in terms of trade-off strategies between expected profit and cost yields, which act as obstacles to each other. Existence of a minimal solution of this system is obtained by using an approximation scheme. Furthermore, the optimal switching strategies are fully characterized. The second paper uses the Snell envelope to formulate the fair price of Bermudan options. To evaluate this formulation of the price, the optimal stopping strategy for such a contract must be estimated. This may be done recursively if some method of estimating conditional expectations is available. The paper focuses on nonparametric estimation of such expectations, by using regularization of a least-squares minimization, with a Tikhonov-type smoothing put on the partial differential equation which characterizes the underlying price processes. This approach can hence be viewed as a combination of the Monte Carlo method and the PDE method for the estimation of conditional expectations. The estimation method turns out to be robust with regard to the size of the smoothing parameter. / QC 20111013
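As a rough illustration of how the Snell envelope recursion and regression-based conditional expectations fit together, the sketch below prices a Bermudan put with a plain polynomial least-squares estimator on simulated paths; the thesis's Tikhonov-regularised, PDE-based estimator is not reproduced here, and all market parameters are assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate GBM paths for the underlying of a Bermudan put (illustrative parameters).
S0, K, r, sigma, T = 100.0, 100.0, 0.03, 0.2, 1.0
n_paths, n_ex = 50_000, 12            # monthly exercise dates
dt = T / n_ex
Z = rng.standard_normal((n_paths, n_ex))
S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * Z, axis=1))

payoff = np.maximum(K - S, 0.0)
value = payoff[:, -1].copy()          # Snell envelope at the last exercise date

# Backward recursion: at each date compare immediate exercise with the estimated
# continuation value E[discounted future value | S_t], here via a cubic polynomial
# least-squares fit on in-the-money paths.
for t in range(n_ex - 2, -1, -1):
    value *= np.exp(-r * dt)
    itm = payoff[:, t] > 0
    coeffs = np.polyfit(S[itm, t], value[itm], deg=3)
    continuation = np.polyval(coeffs, S[itm, t])
    exercise = payoff[itm, t] > continuation
    idx = np.where(itm)[0][exercise]
    value[idx] = payoff[idx, t]

price = np.exp(-r * dt) * value.mean()   # discount back from the first exercise date
print("estimated Bermudan put price:", round(price, 3))
```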
287

Pricing Inflation Derivatives : A survey of short-rate and market models

Tewolde Berhan, Damr January 2012 (has links)
This thesis presents an overview of strategies for pricing inflation derivatives. The paper is structured as follows. Firstly, the basic definitions and concepts such as nominal, real and inflation rates are introduced. We introduce the benchmark contracts of the inflation derivatives market and, using standard results from no-arbitrage pricing theory, derive pricing formulas for linear contracts on inflation. In addition, the risk profile of inflation contracts is illustrated and we highlight how it is captured in the models studied in the paper. We then move on to the main objective of the thesis and present three approaches for pricing inflation derivatives, focusing in particular on two popular models. The first is a so-called HJM approach that models the nominal and real forward curves and relates the two by an analogy with domestic and foreign exchange rates. By the choice of volatility functions in the HJM framework, we produce nominal and real term structures similar to the popular Hull-White interest-rate model. This approach was first suggested by Jarrow and Yildirim [1], and its main attraction is that it results in analytic pricing formulas for both linear and non-linear benchmark inflation derivatives. The second approach is a so-called market model, independently proposed by Mercurio [2] and Belgrade, Benhamou and Koehler [4]. Just like the famous Libor Market Model, the modeled quantities are observable market entities, namely the respective forward inflation indices. It is shown how this model too, by the use of certain approximations, can produce analytic formulas for both linear and non-linear benchmark inflation derivatives. The advantages and shortcomings of the respective models are evaluated. In particular, we focus on how well the models calibrate to market data. To this end, model parameters are calibrated to market prices of year-on-year inflation floors, and it is evaluated how well market prices can be recovered by theoretical pricing with the calibrated parameters. The thesis is concluded with suggestions for possible extensions and improvements.
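The pricing of the linear benchmark contract mentioned above can be illustrated in a few lines: the fair rate of a zero-coupon inflation-indexed swap follows from the nominal and real discount curves by no-arbitrage alone, before any choice between the HJM and market-model approaches is made. The curves below are made-up inputs:

```python
import numpy as np

# Hypothetical nominal and real zero-coupon curves (continuously compounded rates).
maturities = np.array([1.0, 2.0, 5.0, 10.0])
nominal_rates = np.array([0.020, 0.022, 0.027, 0.030])
real_rates = np.array([0.002, 0.004, 0.007, 0.010])

P_n = np.exp(-nominal_rates * maturities)   # nominal discount factors P_n(0, T)
P_r = np.exp(-real_rates * maturities)      # real discount factors P_r(0, T)

# Model-independent fair rate K(T) of a zero-coupon inflation-indexed swap:
# the fixed leg pays (1 + K)^T - 1, the inflation leg pays I(T)/I(0) - 1, and
# no-arbitrage gives (1 + K)^T = P_r(0, T) / P_n(0, T).
K = (P_r / P_n) ** (1.0 / maturities) - 1.0

for T, k in zip(maturities, K):
    print(f"ZCIIS fair rate, T = {T:4.1f}y : {100 * k:.3f} %")
```

Non-linear contracts such as year-on-year caps and floors are where the model choice (Jarrow-Yildirim versus the market model) starts to matter.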
288

The Market Graph : A study of its characteristics, structure & dynamics / Marknadsgrafen : En studie av dess karakteristika, struktur & dynamik

Budai, Daniel, Jallo, David January 2011 (has links)
In this thesis we have considered three different market graphs: one based solely on stock returns, another based on stock returns with vertices weighted by a liquidity measure, and lastly one based on correlations of volume fluctuations. Research is conducted on two different markets: the Swedish and the American stock markets. We want to introduce graph theory as a method for representing the stock market, in order to show that one can more fully understand the structural properties and dynamics of the stock market by studying the market graph. We found many signs of increased globalization by studying the clustering coefficient and the correlation distribution. The structure of the market graph is such that it pinpoints specific sectors when the correlation threshold is increased, and different sectors are found in the two markets. For low correlation thresholds we found groups of independent stocks that can be used as diversified portfolios. Furthermore, the dynamics revealed that it is possible to use the daily absolute change in edge density as an indicator of when the market is about to make a downturn. This could be an interesting topic for further studies. We had hoped to get additional results by considering volume correlations, but that did not turn out to be the case. Regardless, we think that it would be interesting to study volume-based market graphs further. / I denna uppsats har vi tittat på tre olika marknadsgrafer: en enbart baserad på avkastning, en baserad på avkastning med likviditetsviktade noder och slutligen en baserad på volymkorrelationer. Studien är gjord på två olika marknader: den svenska och den amerikanska aktiemarknaden. Vi vill introducera grafteori som ett verktyg för att representera aktiemarknaden och visa att man bättre kan förstå aktiemarknadens strukturella egenskaper och dynamik genom att studera marknadsgrafen. Vi fann många tecken på en ökad globalisering genom att titta på klusterkoefficienten och korrelationsfördelningen. Marknadsgrafens struktur är sådan att den lokaliserar specifika sektorer när korrelationströskeln ökas, och olika sektorer hittas för de två olika marknaderna. För låga korrelationströsklar fann vi grupper av oberoende aktier som kan användas som diversifierade portföljer. Vidare avslöjar dynamiken att det är möjligt att använda den dagliga absoluta förändringen i bågdensiteten som en indikator för när marknaden är på väg att gå ner. Detta kan vara ett intressant ämne för vidare studier. Vi hade hoppats på att erhålla ytterligare resultat genom att titta på volymkorrelationer, men det visade sig att så inte var fallet. Trots det tycker vi att det skulle vara intressant att studera volymbaserade marknadsgrafer vidare.
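A minimal sketch of how a return-based market graph can be built and summarised, assuming a table of daily returns; the tickers and data below are synthetic, and the liquidity-weighted and volume-based variants are omitted:

```python
import numpy as np
import pandas as pd
import networkx as nx

rng = np.random.default_rng(4)

# Synthetic daily returns for 50 hypothetical stocks; the thesis uses Swedish
# and American market data instead.
returns = pd.DataFrame(
    rng.normal(0.0, 0.02, size=(500, 50)),
    columns=[f"STOCK_{i}" for i in range(50)],
)
corr = returns.corr()

def market_graph(corr: pd.DataFrame, threshold: float) -> nx.Graph:
    """Vertices are stocks; an edge connects two stocks whose return
    correlation is at or above the chosen threshold."""
    g = nx.Graph()
    g.add_nodes_from(corr.columns)
    for i, a in enumerate(corr.columns):
        for b in corr.columns[i + 1:]:
            if corr.loc[a, b] >= threshold:
                g.add_edge(a, b)
    return g

# Structural summaries studied in the thesis, swept over correlation thresholds.
for theta in (0.05, 0.3, 0.6):
    g = market_graph(corr, theta)
    print(
        f"threshold {theta:4.2f}: "
        f"edge density {nx.density(g):.4f}, "
        f"average clustering {nx.average_clustering(g):.4f}"
    )
```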
289

Reject Inference in Online Purchases

Mumm, Lennart January 2012 (has links)
As accurately as possible, creditors wish to determine whether a potential debtor will repay the borrowed sum. To achieve this, mathematical models known as credit scorecards, which quantify the risk of default, are used. This study investigates whether the scorecard can be improved by using reject inference, thereby including the characteristics of the rejected population when refining the scorecard. The reject inference method used is parcelling. Logistic regression is used to estimate the probability of default based on applicant characteristics. Two models, one with and one without reject inference, are compared using the Gini coefficient and estimated profitability. The results show that the model with reject inference has both a slightly higher Gini coefficient and higher estimated profitability. Thus, this study suggests that reject inference does improve the predictive power of the scorecard, but in order to verify the results, additional testing on a larger calibration set is needed.
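A hedged sketch of the parcelling step described above: a known-good/bad logistic scorecard is fitted on the accepted population, rejects are bucketed by predicted default probability, labels are drawn within each bucket at the bucket's bad rate, and the scorecard is refitted on the combined sample. All data below are synthetic and the exact labelling and inflation rules used in the study are assumptions:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
features = ["income", "age", "debt"]

def synthetic_applicants(n):
    x = rng.normal(size=(n, 3))                        # applicant characteristics
    p_default = 1 / (1 + np.exp(-(-1.0 + x @ np.array([0.8, -0.5, 0.3]))))
    return pd.DataFrame(x, columns=features).assign(
        default=(rng.random(n) < p_default).astype(int)
    )

accepted = synthetic_applicants(5_000)                           # outcome observed
rejected = synthetic_applicants(2_000).drop(columns="default")   # outcome unknown

# Step 1: known-good/bad scorecard fitted on the accepted population only.
kgb = LogisticRegression(max_iter=1_000).fit(accepted[features], accepted["default"])

# Step 2: parcelling - score the rejects, bucket them into score bands, and
# draw good/bad labels within each band at the band's observed bad rate.
accepted["pd"] = kgb.predict_proba(accepted[features])[:, 1]
rejected["pd"] = kgb.predict_proba(rejected[features])[:, 1]
bands = np.quantile(accepted["pd"], np.linspace(0, 1, 11))
accepted["band"] = np.digitize(accepted["pd"], bands[1:-1])
rejected["band"] = np.digitize(rejected["pd"], bands[1:-1])
band_bad_rate = accepted.groupby("band")["default"].mean()
rejected["default"] = (
    rng.random(len(rejected)) < rejected["band"].map(band_bad_rate)
).astype(int)

# Step 3: refit on the combined population and compare Gini = 2 * AUC - 1.
combined = pd.concat([accepted, rejected], ignore_index=True)
refit = LogisticRegression(max_iter=1_000).fit(combined[features], combined["default"])
for name, model in [("accepts only", kgb), ("with reject inference", refit)]:
    auc = roc_auc_score(accepted["default"], model.predict_proba(accepted[features])[:, 1])
    print(f"{name:24s} Gini = {2 * auc - 1:.3f}")
```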
290

A Model Implementation of Incremental Risk Charge

Forsman, Mikael January 2012 (has links)
In 2009 the Basel Committee on Banking Supervision released the final guidelines for computing capital for the Incremental Risk Charge, a complement to the traditional Value at Risk intended to measure the migration risk and the default risk in the trading book. Before Basel III, banks will have to develop their own Incremental Risk Charge model following these guidelines. This thesis describes the development of such a model, which computes the capital charge for a portfolio of corporate bonds. Essential input parameters such as the credit ratings of the underlying issuers, credit spreads, recovery rates at default, liquidity horizons and correlations among the positions in the portfolio are discussed. Also required in the model is the transition matrix of probabilities of migrating between different credit states, which is estimated from historical data from the rating agency Moody's. Several sensitivity analyses and stress tests are then performed by generating different scenarios, running them in the model and comparing the results to a base case. As it turns out, the default risk accounts for the largest part of the Incremental Risk Charge.
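A simplified sketch of this type of simulation: issuers migrate according to a one-year transition matrix, positions are revalued through spread changes or a default loss, and the Incremental Risk Charge is read off as the 99.9% loss quantile. The ratings, spreads, recovery rate and matrix below are illustrative assumptions, and issuer correlations and liquidity horizons are deliberately omitted:

```python
import numpy as np

rng = np.random.default_rng(6)

ratings = ["A", "BBB", "BB", "D"]               # D = default (absorbing state)
# Hypothetical one-year migration matrix in the spirit of historical rating data.
T = np.array([
    [0.92, 0.06, 0.015, 0.005],
    [0.05, 0.88, 0.05,  0.02 ],
    [0.01, 0.06, 0.88,  0.05 ],
    [0.00, 0.00, 0.00,  1.00 ],
])
spreads = {"A": 0.008, "BBB": 0.015, "BB": 0.040}   # credit spread per rating
duration, recovery = 4.0, 0.40                      # spread duration and recovery rate

# Portfolio of corporate bonds: (current rating index, notional).
portfolio = [(0, 1_000_000.0), (1, 1_000_000.0), (2, 500_000.0)]

n_sims = 100_000
losses = np.zeros(n_sims)
for rating_idx, notional in portfolio:
    # Simulate the one-year migration of each issuer independently.
    new_idx = rng.choice(len(ratings), size=n_sims, p=T[rating_idx])
    defaulted = new_idx == 3
    # Default loss: notional * (1 - recovery); migration loss: spread change * duration.
    old_spread = spreads[ratings[rating_idx]]
    new_spread = np.array([spreads.get(ratings[i], 0.0) for i in new_idx])
    migration_loss = notional * duration * (new_spread - old_spread)
    losses += np.where(defaulted, notional * (1 - recovery), migration_loss)

irc = np.quantile(losses, 0.999)   # capital charge: 99.9 % loss quantile, 1-year horizon
print(f"Incremental Risk Charge estimate: {irc:,.0f}")
```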
