1. Improved shrunken centroid method for better variable selection in cancer classification with high throughput molecular data / Xukun, Li / January 1900
Master of Science / Department of Statistics / Haiyan Wang / Cancer type classification with high-throughput molecular data has received much attention, and many methods have been published in this area. One of them, PAM (the nearest shrunken centroid algorithm), is simple and efficient and can give very good prediction accuracy. A problem with PAM is that it selects too many genes, some of which may have no influence on cancer type. One reason for this is that PAM assumes all genes have identical distributions and applies a common threshold parameter for gene selection. This may not hold in reality, since expression levels of different genes can follow very different distributions due to complicated biological processes. We propose a new method aimed at improving the ability of PAM to select informative genes. Keeping informative genes while reducing false positive variables can lead to more accurate classification results and help pinpoint target genes for further studies. To achieve this goal, we introduce a variable-specific test based on the Edgeworth expansion to select informative genes. We apply this test to each gene and retain genes based on the test result, so that a large number of genes are excluded. Afterward, soft thresholding with cross-validation can be applied to decide a common threshold value. Simulations and a real application show that our method can reduce irrelevant information and select informative genes more precisely. The simulation results give more insight into where the newly proposed procedure improves accuracy, especially when the data set is skewed or unbalanced. The method can be applied to broad classes of molecular data, including, for example, lipidomic data from mass spectrometry, copy number data from genomics, and eQTL analysis with GWAS data. We expect the proposed method will help life scientists accelerate discoveries with high-throughput data.
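As an illustration of the method being modified, a minimal sketch of PAM's soft-thresholding step is given below, following the standard nearest-shrunken-centroid formulas rather than the thesis code, and assuming a NumPy genes-by-samples matrix. In the proposed method, the per-gene Edgeworth-based test would screen genes before the common threshold `delta` is chosen by cross-validation.

```python
# Minimal sketch of PAM-style soft thresholding of class centroids.
# Illustrative only; X, y, and delta are assumed inputs, not thesis code.
import numpy as np

def shrunken_centroids(X, y, delta):
    """X: genes x samples matrix, y: class label per sample, delta: threshold."""
    classes = np.unique(y)
    n = X.shape[1]
    overall = X.mean(axis=1)                      # overall centroid per gene
    # pooled within-class standard deviation per gene
    ss = np.zeros(X.shape[0])
    for k in classes:
        Xk = X[:, y == k]
        ss += ((Xk - Xk.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)
    s = np.sqrt(ss / (n - len(classes)))
    s0 = np.median(s)                             # guards against tiny variances
    centroids = {}
    for k in classes:
        nk = (y == k).sum()
        mk = np.sqrt(1.0 / nk - 1.0 / n)          # scales contrast to its std error
        d = (X[:, y == k].mean(axis=1) - overall) / (mk * (s + s0))
        d_shrunk = np.sign(d) * np.maximum(np.abs(d) - delta, 0.0)  # soft threshold
        centroids[k] = overall + mk * (s + s0) * d_shrunk
    return centroids   # genes with d_shrunk == 0 in every class drop out
```

Because a single `delta` is applied to every gene, genes whose expressions are far from identically distributed can survive the threshold spuriously, which is the failure mode the variable-specific test targets.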
2. Optimering av lagernivåer vid distributionscentralen Bygg Ole / Optimization of inventory levels at the distribution central of Bygg Ole / Göransson, Gustav; Johnson, Mathias / January 2016
The aim of this thesis was to examine possible improvements in the inventory management and ordering procedures at Bygg Ole Saltsjö-Boo, combining aspects of Systems Engineering with Industrial Engineering and Management. In the report, a Guaranteed Service-Level (GSL) model based on historical sales data was applied in combination with relevant theories of inventory carrying cost. The study was limited to selected high-turnover products from two of Bygg Ole's suppliers; all of these products except one showed little seasonal variation in demand. Special consideration was also given to service level, cost of capital, and variability of demand. The result was that an implementation of the model would yield lower inventory levels and therefore lower inventory carrying costs. The conclusion was that the model could be implemented, although possibly with high administrative costs in the beginning. Bygg Ole also has the possibility of using an ordering system based on the mathematical GSL model in combination with demand forecasts produced by its sales department, which could potentially increase precision in the inventory management. The current inventory carrying charge is set relatively low, which reduces the calculated savings from implementing the model; if the carrying charge were higher, the financial benefit of implementation would be more evident. The recommendation for Bygg Ole is to apply the GSL model, in combination with a demand forecasting system, to a few selected products and then evaluate the result.
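To make the service-level reasoning concrete, the sketch below computes a safety stock and order-up-to level under a normal demand assumption. This is the textbook safety-stock calculation that underlies GSL-type models rather than the full GSL optimization, and every figure (demand, lead time, service level, carrying charge, unit cost) is a hypothetical placeholder, not Bygg Ole data.

```python
# Hedged sketch: service-level-driven safety stock and carrying cost.
# All numbers are made-up assumptions for illustration.
from scipy.stats import norm

mu_d, sigma_d = 40.0, 12.0   # hypothetical mean / std dev of daily demand
lead_time = 5.0              # hypothetical replenishment lead time (days)
service_level = 0.95         # target cycle service level

z = norm.ppf(service_level)                     # safety factor, ~1.645
safety_stock = z * sigma_d * lead_time ** 0.5   # buffer against demand variability
base_stock = mu_d * lead_time + safety_stock    # order-up-to level

carrying_charge = 0.10       # hypothetical carrying rate per year
unit_cost = 250.0            # hypothetical unit cost (SEK)
annual_cost = carrying_charge * unit_cost * safety_stock

print(f"safety stock {safety_stock:.1f}, base stock {base_stock:.1f}, "
      f"carrying cost {annual_cost:.0f} SEK/yr")
```

The last two lines show why the carrying charge matters to the conclusion: the higher the charge, the larger the savings from any reduction in safety stock.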
3. A Comparative Analysis of the Risk Model for a Portfolio that Contains Equity Derivatives / Lin, Wan-Chun / 23 June 2004
No abstract available.
4. Solvency Capital Requirement (SCR) for Market Risks: A quantitative assessment of the Standard formula and its adequacy for a Swedish insurance company / Kapitalbaskrav för marknadsrisker under Solvens II: En kvantitativ utvärdering av Standardformeln och dess lämplighet för ett svenskt försäkringsbolag / Widing, Björn / January 2016
The purpose of this project is to validate the adequacy of the Standard formula, used to calculate the Solvency Capital Requirement (SCR) under Solvency II, with respect to a Swedish insurance company. The sub-modules evaluated are Equity risk (type 1) and Interest rate risk. The validation uses a quantitative assessment based on the concept of Value at Risk (VaR). Additionally, investment strategies for risk-free assets are evaluated through a scenario-based analysis. The findings support that the equity shock of 39%, as prescribed by the Standard formula, is appropriate for a diversified portfolio of global equities, and, to some extent, also sufficient for a diversified global portfolio with an overweight of Swedish equities. Furthermore, the findings show that the Standard formula for Interest rate risk occasionally underestimates the true interest rate risk. Finally, the scenario-based analysis shows that there are advantages to selecting an investment strategy that stabilizes the own funds of an insurance company rather than one that minimizes the SCR.
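The kind of comparison such a validation performs can be sketched as follows: the Standard formula's 39% type-1 equity charge set against an empirical one-year 99.5% VaR. The portfolio value is hypothetical, simulated returns stand in for the historical data used in the thesis, and the symmetric adjustment to the equity stress is ignored.

```python
# Hedged sketch: Standard formula equity SCR vs. empirical 99.5% VaR.
# Portfolio value and return series are placeholders, not thesis data.
import numpy as np

portfolio_value = 100.0e6                   # hypothetical equity holdings (SEK)
scr_equity = 0.39 * portfolio_value         # 39% type-1 equity stress

rng = np.random.default_rng(0)
annual_returns = rng.normal(0.06, 0.18, 10_000)   # stand-in for historical data
losses = -annual_returns * portfolio_value
var_995 = np.quantile(losses, 0.995)        # empirical 99.5% one-year VaR

print(f"Standard formula SCR: {scr_equity:,.0f}")
print(f"Empirical 99.5% VaR:  {var_995:,.0f}")
# The Standard formula is adequate for this sub-module when SCR >= VaR.
```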
5. Cornish-Fisher Expansion and Value-at-Risk method in application to risk management of large portfolios / Sjöstrand, Maria; Aktaş, Özlem / January 2011
One of the major problems faced by banks is how to manage the risk exposure in large portfolios. According to the Basel II regulations, banks have to measure risk using Value-at-Risk at the 99% confidence level. However, the regulations do not specify how Value-at-Risk is to be calculated. The easiest way is to assume that portfolio returns are normally distributed; although this is the most common approach, other methods exist. The recent crisis showed that the standard methods are unfortunately not always enough to prevent bankruptcy. This paper compares the classical methods of estimating risk with alternatives such as the Cornish-Fisher expansion (CFVaR) and the assumption of a generalized hyperbolic distribution. To carry out this study, we estimate the risk in a large portfolio consisting of ten stocks. These stocks are chosen from the NASDAQ-100 list in order to have highly liquid stocks (blue chips), and from different sectors to make the portfolio well-diversified. To investigate the impact of dependence between the stocks in the portfolio, we remove the two most correlated stocks and consider the resulting eight-stock portfolio as well. In both portfolios we assign equal weights to the included stocks. The results show that for a well-diversified large portfolio none of the risk measures is violated. However, for a portfolio consisting of only one highly volatile stock, we show that the classical methods are violated, but not the modern methods mentioned above.
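For reference, the Cornish-Fisher adjustment to the normal VaR quantile can be computed as in the sketch below, assuming SciPy. It applies the standard four-term expansion, which corrects the Gaussian quantile for skewness and excess kurtosis; the simulated fat-tailed returns are a stand-in for the stock data used in the study.

```python
# Minimal sketch of Cornish-Fisher VaR next to classical (normal) VaR.
import numpy as np
from scipy.stats import norm, skew, kurtosis

def cornish_fisher_var(returns, alpha=0.99):
    """One-period VaR at confidence level alpha, reported as a positive loss."""
    mu, sigma = returns.mean(), returns.std(ddof=1)
    s = skew(returns)
    k = kurtosis(returns)            # excess kurtosis by default
    z = norm.ppf(1 - alpha)          # e.g. -2.326 at 99%
    # Cornish-Fisher adjusted quantile: corrects for skew and fat tails
    z_cf = (z + (z**2 - 1) * s / 6
              + (z**3 - 3*z) * k / 24
              - (2*z**3 - 5*z) * s**2 / 36)
    return -(mu + z_cf * sigma)

rng = np.random.default_rng(1)
r = rng.standard_t(df=4, size=2_500) * 0.01   # fat-tailed stand-in returns
var_normal = -(r.mean() + norm.ppf(0.01) * r.std(ddof=1))
print(f"Normal VaR(99%):         {var_normal:.4f}")
print(f"Cornish-Fisher VaR(99%): {cornish_fisher_var(r):.4f}")
```

For symmetric, thin-tailed returns the two numbers coincide; the gap between them grows exactly in the skewed, heavy-tailed settings where the paper finds the classical methods violated.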