  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
351

Modelling Apartment Prices with the Multiple Linear Regression Model / Modellering av lägenhetspriser med multipel linjär regression

Gustafsson, Alexander, Wogenius, Sebastian January 2014 (has links)
This thesis examines which factors are of most statistical significance for the sales prices of apartments in the Stockholm City Centre. The factors examined are address, area, balcony, construction year, elevator, fireplace, floor number, maisonette, monthly fee, penthouse and number of rooms. On the basis of this examination, a model for predicting apartment prices is constructed. In order to evaluate how the factors influence the price, the thesis analyses sales statistics; the mathematical method used is the multiple linear regression model. A minor case study and literature review, included in the thesis, examines the relationship between proximity to public transport and apartment prices in Stockholm. The result is that it is possible to construct a model, from the factors analysed, which predicts the prices of apartments in the Stockholm City Centre with a coefficient of determination of 91% and a 95% confidence interval of two million SEK. Furthermore, the model predicts lower-priced apartments more accurately. In the case study and literature review, the results support the hypothesis that proximity to public transport has a positive effect on the price of an apartment. Such a variable should nevertheless be treated with caution, since the purpose of the modelling differs between an individual application and a socio-economic application.
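
As an illustration of the modelling approach described above, here is a minimal multiple-linear-regression sketch in Python. It is not the thesis's actual code: the data, variable names and regressor set are hypothetical, and the reported figures (91% explanation degree, the two million SEK interval) come from the thesis's real Stockholm transaction data.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical sales records; the thesis uses real Stockholm transactions.
df = pd.DataFrame({
    "price":        [4.2e6, 5.1e6, 3.4e6, 6.0e6, 2.9e6, 4.8e6],
    "area_sqm":     [55, 68, 41, 80, 34, 62],
    "monthly_fee":  [2900, 3400, 2500, 4100, 2100, 3100],
    "floor":        [2, 4, 1, 5, 3, 2],
    "has_elevator": [1, 1, 0, 1, 0, 1],
    "has_balcony":  [0, 1, 0, 1, 0, 1],
})

X = sm.add_constant(df.drop(columns="price"))  # intercept + regressors
y = df["price"]

model = sm.OLS(y, X).fit()
print(model.summary())         # coefficients, p-values (significance), R-squared
print("R^2:", model.rsquared)  # the thesis reports roughly 0.91 on its data
```
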
352

The New Standardized Approach for Measuring Counterparty Credit Risk

Jonsson, Sara, Rönnlund, Beatrice January 2014 (has links)
This study investigates the differences in the calculation of exposure at default (EAD) between the current exposure method (CEM) and the new standardized approach for measuring counterparty credit risk exposures (SA-CCR) for over-the-counter (OTC) derivatives. The study analyses the consequences of the two approaches' treatment of netting as well as the differences in EAD between asset classes. After implementing both models and calculating EAD on real trades of a Swedish commercial bank, it was clear that SA-CCR has a higher level of complexity than its predecessor. The results indicate that SA-CCR gives a lower EAD than CEM when netting is allowed, because of its greater recognition of netting, but a higher EAD when netting is not allowed. Foreign exchange derivatives are affected to a greater extent than interest rate derivatives in this particular study: they received lower EAD under SA-CCR both when netting was allowed and when it was not. A change of method for calculating EAD from CEM to SA-CCR could therefore result in lower minimum capital requirements.
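
The sketch below contrasts the two EAD formulas in a deliberately simplified form: CEM as replacement cost plus a notional-based add-on, SA-CCR as alpha times replacement cost plus potential future exposure. Real SA-CCR aggregation (hedging sets, supervisory durations, maturity factors, the PFE multiplier) and CEM's net-to-gross-ratio adjustment are omitted, and the add-on factors and trades are illustrative, not the bank data used in the thesis.

```python
# Simplified EAD comparison for one netting set of OTC trades.
ALPHA = 1.4  # supervisory alpha in SA-CCR

trades = [
    # (asset_class, notional, mark_to_market)
    ("fx", 10_000_000,  120_000),
    ("ir", 25_000_000, -300_000),
    ("fx",  5_000_000,   80_000),
]

CEM_ADDON = {"fx": 0.01, "ir": 0.005}  # illustrative CEM add-on factors
SACCR_SF  = {"fx": 0.04, "ir": 0.005}  # illustrative supervisory factors

def ead_cem(trades, netting=True):
    mtm = [m for _, _, m in trades]
    rc = max(sum(mtm), 0) if netting else sum(max(m, 0) for m in mtm)
    addon = sum(n * CEM_ADDON[ac] for ac, n, _ in trades)
    return rc + addon

def ead_saccr(trades, netting=True):
    mtm = [m for _, _, m in trades]
    rc = max(sum(mtm), 0) if netting else sum(max(m, 0) for m in mtm)
    pfe = sum(n * SACCR_SF[ac] for ac, n, _ in trades)  # no multiplier or hedging sets
    return ALPHA * (rc + pfe)

for netting in (True, False):
    print("netting:", netting,
          "CEM EAD:", round(ead_cem(trades, netting)),
          "SA-CCR EAD:", round(ead_saccr(trades, netting)))
```
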
353

On Lapse risk factors in Solvency II

Boros, Daniel January 2014 (has links)
In the wake of the sub-prime crisis of 2008, the European Insurance and Occupational Pensions Authority issued the Solvency II directive, aiming to replace the obsolete Solvency I framework by 2016. Among the quantitative requirements of Solvency II is a measure of an insurance firm's solvency risk, the solvency capital requirement (SCR). It establishes the amount of equity the company needs to hold to be able to meet its insurance obligations with a probability of 0.995 over the coming year. The SCR of a company is essentially built up from the SCRs induced by a set of quantifiable risks. Among these are risks originating from the take-up rate of contractual options, the lapse risks. In this thesis, the contractual options of a life insurer are identified and risk factors aiming to capture the resulting risks are suggested. It is concluded that a risk factor estimating the size of mass-transfer events captures the risk arising through the resulting rescaling of the balance sheet. Further, a risk factor modelling the deviation from the company's assumption for the yearly transfer rate is introduced to capture the risks induced by the characteristics of traditional life insurance and unit-linked insurance contracts upon transfer. The risk factors are modelled so as to introduce co-dependence with equity returns as well as interest rates of various durations, and the model parameters are estimated with statistical methods from Norwegian transfer-frequency data obtained from Finans Norge. The univariate and multivariate properties of the models are investigated in a scenario setting, and it is concluded that the suggested models give predominantly plausible results for the mass-lapse risk factors. However, the performance of the models for the risk factors aiming to capture deviations in the transfer assumptions is questionable, which is why two means of increasing their validity are proposed.
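
The SCR definition quoted above (enough equity to meet obligations with probability 0.995 over one year) amounts to a 99.5% value-at-risk of the one-year change in own funds. The sketch below illustrates that calculation on simulated data; the loss distribution is a placeholder normal sample, not the scenario model developed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical one-year changes in own funds (assets minus liabilities),
# e.g. as produced by a scenario generator that includes mass-lapse shocks.
own_funds_change = rng.normal(loc=0.0, scale=50.0, size=100_000)  # MSEK

# SCR: the own-funds loss exceeded only with probability 0.5%,
# i.e. the capital needed to stay solvent over one year with p = 0.995.
loss = -own_funds_change
scr = np.quantile(loss, 0.995)
print(f"SCR ~ {scr:.1f} MSEK")
```
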
354

Allocation Methods for Alternative Risk Premia Strategies / Allokeringsmetoder för alternativa riskpremier

Drugge, Daniel January 2014 (has links)
We use regime-switching and regression-tree methods to evaluate performance of the risk premia strategies provided by Deutsche Bank and constructed from U.S. research data from the Fama French library. The regime-switching method uses the Baum-Welch algorithm at its core and splits return data into a normal and a turbulent regime. Each regime is evaluated independently for risk, and the estimates are then weighted together according to the probability of the succeeding regime. The regression-tree methods identify macroeconomic states in which the risk premia perform well or poorly and use these results to allocate between risk premia strategies. The regime-switching method proves mostly unimpressive on its own, but its results are boosted by investing less in risky assets as the probability of an upcoming turbulent regime grows; this proves highly effective for all time periods and for both data sources. The regression-tree method proves most effective under the assumption that all macroeconomic data are known in the same month for which they are valid. Since this assumption is unrealistic, the best approach seems to be to evaluate the performance of the risk premia strategy using macroeconomic data from the previous quarter.
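
A minimal sketch of the regime-switching step, using hmmlearn's GaussianHMM (fitted with the Baum-Welch/EM algorithm) to split returns into a calm and a turbulent regime and to scale down risky exposure as the probability of an upcoming turbulent regime grows. The returns are simulated, and the two-regime one-step-ahead weighting rule is an assumption about the method, not the thesis's exact implementation.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM  # Baum-Welch (EM) under the hood

rng = np.random.default_rng(1)
# Hypothetical daily returns of a risk-premia strategy.
returns = np.concatenate([rng.normal(0.0005, 0.005, 400),   # calm stretch
                          rng.normal(-0.001, 0.02, 100)])    # turbulent stretch
X = returns.reshape(-1, 1)

hmm = GaussianHMM(n_components=2, covariance_type="full", n_iter=200).fit(X)

# Posterior regime probabilities today, pushed one step forward with the
# transition matrix to get the probability of the succeeding regime.
p_today = hmm.predict_proba(X)[-1]
p_next = p_today @ hmm.transmat_

turbulent = int(np.argmax(hmm.covars_.ravel()))  # regime with the higher variance
risky_weight = 1.0 - p_next[turbulent]           # invest less when turbulence looms
print("P(next regime turbulent):", round(p_next[turbulent], 3),
      "-> weight in risky assets:", round(risky_weight, 2))
```
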
355

Estimation of Loss Given Default for Low Default Portfolios

Dahlin, Fredrik, Storkitt, Samuel January 2014 (has links)
The Basel framework allows banks to assess their credit risk using their own estimates of Loss Given Default (LGD). However, for a Low Default Portfolio (LDP), estimating LGD is difficult due to the shortage of default data. This study evaluates different LGD estimation approaches in an LDP setting using pooled industry data obtained from a subset of the PECDC LGD database. Based on the characteristics of an LDP, a workout LGD approach is suggested. Six estimation techniques are tested: OLS regression, Ridge regression, two techniques combining logistic regressions with OLS regressions, and two tree models. All tested models give similar error levels when tested against the data, but the tree models may produce rather different estimates for specific exposures compared with the other models. Using historical averages yields worse results than the tested models within and out of sample, but not considerably worse out of time.
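
A small sketch of the kind of comparison described above: OLS versus Ridge regression for facility-level LGD, benchmarked against the historical average. The features, data and error metric (RMSE) are hypothetical stand-ins; the thesis works with the PECDC data and its own error measures.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(2)
n = 300
# Hypothetical facility-level features (collateralisation, seniority, ...).
X = rng.normal(size=(n, 4))
lgd = np.clip(0.4 - 0.2 * X[:, 0] + 0.1 * X[:, 1] + rng.normal(0, 0.1, n), 0, 1)

X_tr, X_te, y_tr, y_te = train_test_split(X, lgd, test_size=0.3, random_state=0)

for name, model in [("OLS  ", LinearRegression()), ("Ridge", Ridge(alpha=1.0))]:
    pred = model.fit(X_tr, y_tr).predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(name, "out-of-sample RMSE:", round(rmse, 3))

# Benchmark: the historical average, which the thesis finds performs worse
# within and out of sample.
naive = np.full_like(y_te, y_tr.mean())
print("Hist. mean RMSE:", round(mean_squared_error(y_te, naive) ** 0.5, 3))
```
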
356

Semi-Markov modelling in a Gibbs sampling algorithm for NIALM

Monin Nylund, Jean-Alexander January 2014 (has links)
Residential households in the EU are estimated to have a savings potential of around 27% [1]. The question remains how to realize this savings potential. Non-Intrusive Appliance Load Monitoring (NIALM) aims to disaggregate the energy signals of individual household appliances using only measurements of the total household power load. The core of this thesis has been the implementation of an extension to a Gibbs sampling model with Hidden Markov Models for energy disaggregation. The goal has been to improve overall performance by including the duration times of electrical appliances in the probabilistic model. The final algorithm was evaluated against the base algorithm, but the results remained at best inconclusive due to the model's inherent limitations. The work was performed at the Swedish company Watty. Watty develops the first energy data analytics tool that can automate the energy efficiency process in buildings.
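
To make the semi-Markov (explicit-duration) idea concrete, below is a toy generative sketch for a single two-state appliance in which each state carries its own duration distribution, rather than the geometric durations implied by a plain HMM. The appliance, power levels and Poisson durations are illustrative assumptions; the thesis embeds duration modelling inside a Gibbs sampler over real household data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy explicit-duration (semi-Markov) model for one appliance.
states = {"off": {"power": 0.0,    "mean_duration": 120},
          "on":  {"power": 1500.0, "mean_duration": 30}}
transition = {"off": "on", "on": "off"}  # two-state appliance

def simulate(n_minutes, start="off"):
    load, state, t = [], start, 0
    while t < n_minutes:
        # Poisson durations are one simple choice; a Gibbs sampler would
        # instead sample durations and states from their posterior given data.
        d = max(1, int(rng.poisson(states[state]["mean_duration"])))
        load += [states[state]["power"]] * d
        state, t = transition[state], t + d
    return np.array(load[:n_minutes])

print(simulate(10))  # first ten minutes of a simulated appliance load
```
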
357

Prediktering av VD-löner i svenska onoterade aktiebolag / Prediction of CEO salaries in Swedish unlisted companies

Edberg, Erik January 2015 (has links)
In contrast to collectively bargained wages, the CEO's remuneration is set individually and independently of union agreements. The company board determines the remuneration, based on an estimated valuation of variables such as job characteristics, the personal qualities of the CEO, the market valuation of similar positions and the availability of possible candidates. The purpose of this thesis is to construct a model for predicting the market remuneration of a current or forthcoming CEO. Further, the compensation structure is examined with the aim of finding the structure that maximizes the CEO's performance. The thesis shows that it is possible to predict the remuneration of employed CEOs in unlisted companies with an explanation rate of 64%. The variance in CEO remuneration is explained by six covariates: four representing job characteristics and two related to company performance. The highest explanation rate is given by the covariate turnover, which alone explains just below 40% of the remuneration variance. The study also shows that the optimal compensation structure differs between companies. Finally, recommendations are given for what the variable remuneration should be based on in order to maximize the CEO's performance.
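
The sketch below illustrates the reported pattern that a single size covariate (turnover) explains a large share of CEO-pay variance while the full covariate set explains more, using ordinary least squares on simulated data. The covariates, coefficients and resulting R-squared values are hypothetical; the thesis's figures (roughly 40% for turnover alone, 64% for the full model) come from its own data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 500
# Hypothetical firm data: log turnover plus three other illustrative covariates.
log_turnover = rng.normal(17, 1.5, n)
other = rng.normal(size=(n, 3))
log_salary = (0.3 * log_turnover + other @ np.array([0.10, 0.05, 0.08])
              + rng.normal(0, 0.4, n))

# Turnover alone explains a large share of the variance...
single = sm.OLS(log_salary, sm.add_constant(log_turnover)).fit()
# ...and the full covariate set raises the explanation rate further.
full = sm.OLS(log_salary,
              sm.add_constant(np.column_stack([log_turnover, other]))).fit()

print("R^2, turnover only:", round(single.rsquared, 2))
print("R^2, full model:   ", round(full.rsquared, 2))
```
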
358

Predicting Bankruptcy with Machine Learning Models

Åkerblom, Thea January 2022 (has links)
This thesis explores the predictive power of different machine learning algorithms for Swedish firm defaults. Both firm-specific variables and macroeconomic variables are used to estimate probabilities of firm default. Four algorithms are used to predict default: Random Forest, AdaBoost, a feed-forward neural network and a Long Short-Term Memory neural network (LSTM). These models are compared with a classical logistic classification model that acts as a benchmark. The data form a panel of quarterly observations covering the period 2000 to 2018. To evaluate the models, precision and recall are calculated and compared between them. The LSTM model performs best of all five fitted models and correctly classifies 60% of all defaults in the test data. The data are supplied by the Riksbank, the Swedish central bank, and consist of two data sets: one from Upplysningscentralen AB with firm-specific variables, and one from the Riksbank with the macroeconomic variables. Keywords: LSTM, Neural Network, AdaBoost, Random Forest, Machine Learning, Default, Panel Data, Longitudinal Data, Risk, Prediction, Precision, Recall
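
Since the evaluation hinges on precision and recall, here is a minimal sketch of how the two metrics are computed for a default classifier. The labels are made up; the 60% recall figure quoted above is the thesis's own test-set result, not something these toy labels reproduce.

```python
from sklearn.metrics import precision_score, recall_score

# Hypothetical out-of-sample labels: 1 = default, 0 = survival.
y_true = [0, 0, 1, 1, 0, 1, 0, 1, 1, 0]
y_pred = [0, 0, 1, 0, 0, 1, 1, 1, 0, 0]

# Precision: share of predicted defaults that actually defaulted.
# Recall: share of actual defaults that the model caught.
print("precision:", precision_score(y_true, y_pred))
print("recall:   ", recall_score(y_true, y_pred))
```
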
359

Furstenberg's conjecture and measure rigidity for some classes of non-abelian affine actions on tori / Furstenbergs förmodan och måttrigiditet för några klasser av icke-abelska affina verkningar på torusar

Zickert, Gustav January 2015 (has links)
In 1967 Furstenberg proved that the set {2^n 3^m α (mod 1) : n, m ∈ ℕ} is dense in the circle for any irrational α. He also made the following famous measure rigidity conjecture: the only ergodic measures on the circle invariant under both x ↦ 2x and x ↦ 3x are the Lebesgue measure and measures supported on a finite set. In this thesis we discuss both Furstenberg's theorem and his conjecture, as well as the partial solution of the latter given by Rudolph. Following Matheus' presentation of Avila's ideas for a proof of a weak version of Rudolph's theorem, we prove a result on extending measure preservation from a semigroup action to a larger semigroup action. Using this result we obtain restrictions on the set of invariant measures for certain classes of non-abelian affine actions on tori. We also study some general properties of abelian and non-abelian affine actions, and we show that analogues of Furstenberg's theorem hold for affine actions on the circle.
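
For reference, the two statements discussed above can be written out as follows. This is a standard formulation restated from the abstract, not text taken from the thesis.

```latex
% Assumes amsthm with \newtheorem{theorem} and \newtheorem{conjecture} defined.
\begin{theorem}[Furstenberg, 1967]
  For every irrational $\alpha$, the set
  $\{\, 2^{n} 3^{m} \alpha \pmod 1 \;:\; n, m \in \mathbb{N} \,\}$
  is dense in the circle $\mathbb{R}/\mathbb{Z}$.
\end{theorem}

\begin{conjecture}[Furstenberg]
  Let $\mu$ be an ergodic probability measure on $\mathbb{R}/\mathbb{Z}$ that is
  invariant under both $x \mapsto 2x$ and $x \mapsto 3x \pmod 1$.
  Then $\mu$ is either the Lebesgue measure or supported on a finite set.
\end{conjecture}
```
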
360

Spectral Data Processing for Steel Industry / Spektral databehandling för stålindustrin

Bisot, Clémence January 2015 (has links)
For the steel industry, knowing and understanding the characteristics of a steel strip's surface at every step of the production process is key to controlling final product quality. As quality requirements increase, this task becomes ever more important. The surfaces of new steel grades with complex chemical compositions behave in ways that are especially hard to master, and for those grades in particular surface control is critical and difficult. One promising technique for addressing surface quality control is spectral analysis. Over the last few years, ArcelorMittal, the world's leading integrated steel and mining company, has led several projects investigating the possibility of using devices that measure the light spectrum of their products at different stages of production. The large amount of data generated by these devices makes it necessary to develop efficient data treatment pipelines to extract meaningful information from the recorded spectra. In this thesis, we developed mathematical models and statistical tools for treating signals measured with spectrometers within the framework of these research projects.
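
The abstract does not specify which models and tools were developed, so the following is only a generic illustration of the kind of preprocessing pipeline often applied to batches of spectra (smoothing followed by dimension reduction), on synthetic data; the filter, component count and data are all assumptions.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
# Hypothetical batch of spectra: 200 measurements x 512 wavelengths.
wavelengths = np.linspace(400, 900, 512)
spectra = np.sin(wavelengths / 80)[None, :] + 0.05 * rng.normal(size=(200, 512))

# Smooth each spectrum, then compress the batch to a few principal components
# that can be tracked along the production line.
smoothed = savgol_filter(spectra, window_length=11, polyorder=3, axis=1)
scores = PCA(n_components=3).fit_transform(smoothed)
print(scores.shape)  # (200, 3): three summary features per recorded spectrum
```
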
