411

A Quantitative Risk Optimization of Markowitz Model : An Empirical Investigation on Swedish Large Cap List

Bjärnbo, Oliver, Kheirollah, Amir January 2007 (has links)
This paper is an empirical study of Harry Markowitz's work on Modern Portfolio Theory. The model he introduced assumes normality of asset returns. We examined the OMX Large Cap List by mathematical and statistical methods for normality of asset returns and studied the effect of the parameters skewness and kurtosis for different time series data. We tried to determine which data series is better suited for constructing a portfolio and how these extra parameters can make us better informed in our investments.
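
As a hedged illustration of the kind of normality check described above (not the authors' own code), the sketch below computes the skewness and excess kurtosis of a return series and applies a Jarque-Bera test; the price series is simulated and stands in for an OMX Large Cap stock.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
prices = 100 * np.exp(np.cumsum(rng.normal(0.0005, 0.01, 1000)))  # simulated prices
returns = np.diff(np.log(prices))                                  # daily log-returns

skew = stats.skew(returns)
kurt = stats.kurtosis(returns)               # excess kurtosis; 0 for a normal sample
jb_stat, jb_pvalue = stats.jarque_bera(returns)

print(f"skewness={skew:.3f}, excess kurtosis={kurt:.3f}, JB p-value={jb_pvalue:.3f}")
# A small p-value rejects normality, the assumption behind the basic Markowitz model.
```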
412

Andrahandsmarknaden för småhuspriser i Sverige : En tidsserieanalys av olika makrovariablers påverkan och samvariation för åren 1996- 2006

Comstedt, Wictor, Fredriksson, Robert January 2007 (has links)
The purpose of this thesis is to analyse how a number of macro variables affect and covary with the prices of single-family houses. How the variables react to shocks among themselves is also studied. The time series analysis is based on monthly data from 1996 to 2006, covering all sales of single-family properties in Sweden. The quality index used to analyse the prices is the K/T ratio (purchase price over tax-assessed value). To study regional differences in influence and covariation, the country is divided into the regions Stockholm, Göteborg, Malmö, rural areas and Sweden as a whole. Granger causality tests are used to test the influence of different macro variables on the K/T ratios. The results of these tests show that the K/T ratios are affected differently in the different regions. The analysis continues with the estimation of a simple vector autoregressive model for each region, using the three endogenous variables K/T ratio for the region, industrial orders and new car registrations. Impulse-response shocks are then applied to the three endogenous variables. The strongest effect that can be discerned from these shocks is that the K/T ratios have an impact on household wealth and thereby also on household consumption.
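
A minimal sketch of this workflow, assuming statsmodels and using simulated stand-ins for the K/T ratio, industrial orders and new car registrations (not the thesis data): a Granger causality test followed by a small VAR and its impulse responses.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(1)
n = 132  # monthly observations, roughly 1996-2006
data = pd.DataFrame({
    "kt_ratio": rng.normal(0, 1, n),
    "industrial_orders": rng.normal(0, 1, n),
    "new_car_reg": rng.normal(0, 1, n),
})

# Does industrial_orders Granger-cause the K/T ratio? (tests second column -> first column)
gc_res = grangercausalitytests(data[["kt_ratio", "industrial_orders"]], maxlag=4)

# Simple VAR and impulse-response functions (plotting requires matplotlib)
results = VAR(data).fit(4)
irf = results.irf(12)      # responses over 12 months
irf.plot(orth=True)        # orthogonalized impulse responses
```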
413

Forecasting GDP Growth : The Case of The Baltic States

Pilström, Patrick, Pohl, Sebastian January 2009 (has links)
The purpose of this thesis is to identify a general model to forecast GDP growth for the Baltic States: Estonia, Latvia and Lithuania. If the model provides reliable results for these states, it should also be able to forecast GDP growth for other countries of interest. Forecasts are made using a reduced vector autoregressive (VAR) model. The VAR models use past values of Gross Domestic Product, inflation and unemployment as explanatory variables. The performed forecasts provide good results for horizons up to t+8. The forecasts for 2009 (t+12) are in line with those of several other actors, and it is reasonable to assume that some of the forecasts for t+16 are also reliable. The Lithuanian forecast shows a fall in GDP of 12.51 per cent in 2009 and GDP growth of 4.23 per cent in 2010. The forecast for Estonia shows that GDP will decrease by 1.49 per cent in 2009 and 12.72 per cent in 2010. Finally, the forecast for Latvia shows a fall in GDP of 3.1 per cent in 2009 and 18 per cent in 2010. From the findings it is possible to conclude that the model provided reliable estimates of future levels of GDP for the Baltic States and the benchmark countries, which indicates that the model should be applicable to other countries of interest.
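
A rough sketch of such a reduced-form VAR forecast, assuming statsmodels; the GDP growth, inflation and unemployment series below are simulated placeholders, not the Baltic data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(2)
n = 60  # quarterly observations
data = pd.DataFrame({
    "gdp_growth": rng.normal(1.0, 2.0, n),
    "inflation": rng.normal(2.0, 1.0, n),
    "unemployment": rng.normal(7.0, 1.5, n),
})

results = VAR(data).fit(4)                              # VAR(4) in the three variables
forecast = results.forecast(data.values[-4:], steps=8)  # t+1 ... t+8
print(pd.DataFrame(forecast, columns=data.columns))
```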
414

NIG distribution in modelling stock returns with assumption about stochastic volatility : Estimation of parameters and application to VaR and ETL.

Kucharska, Magdalena, Pielaszkiewicz, Jolanta January 2009 (has links)
We model Normal Inverse Gaussian distributed log-returns under the assumption of stochastic volatility. We consider different methods of parametrization of returns and, following the paper by Lindberg [21], we assume that the volatility is a linear function of the number of trades. In addition to Lindberg's paper, we suggest daily stock volumes and amounts as alternative measures of the volatility. As an application of the models, we perform Value-at-Risk and Expected Tail Loss predictions with Lindberg's volatility model and with our own suggested model. These applications are new and not described in the literature. For a better understanding of our calculations, programmes and simulations, basic information on and properties of the Normal Inverse Gaussian and Inverse Gaussian distributions are provided. Practical applications of the models are implemented on the Nasdaq-OMX, where we have calculated Value-at-Risk and Expected Tail Loss for the Ericsson B stock data during the period 1999 to 2004.
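
As an illustrative sketch (my own, not the thesis code), the snippet below fits a Normal Inverse Gaussian distribution to a simulated return series and reads off the one-day 95% VaR and Expected Tail Loss; the data stand in for the Ericsson B returns.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
returns = 0.02 * rng.standard_t(df=5, size=1500)   # heavy-tailed placeholder log-returns

a, b, loc, scale = stats.norminvgauss.fit(returns)
nig = stats.norminvgauss(a, b, loc=loc, scale=scale)

alpha = 0.05
q = nig.ppf(alpha)                                  # lower 5% quantile of returns
var_95 = -q                                         # VaR as a positive loss number
etl_95 = -nig.expect(lambda x: x, ub=q, conditional=True)   # expected loss beyond VaR
print(f"VaR95={var_95:.4f}, ETL95={etl_95:.4f}")
```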
415

Tre Value at Risk modeller för riskvärdering av köpoptioner

Johansson, Andreas, Johansson, Daniel January 2007 (has links)
Risk valuation has become an increasingly familiar concept during the 1990s. A popular instrument for risk valuation is Value at Risk, since this model creates a common risk measure for different types of portfolios and derivatives. VaR measures the maximum change in value of a portfolio for a predetermined probability and time horizon. In this thesis a confidence level of 95 per cent has been assumed, which means that actual losses should exceed VaR one time in twenty. Non-linear instruments, such as options, are difficult to value for risk since their prices change disproportionately to the underlying. Several models with different properties can be applied to calculate VaR. It is therefore of interest to find out whether the Delta-Normal method, Monte Carlo simulation and Historical simulation give the same answer when assessing the risk of options. Furthermore, this thesis aims to answer whether these three VaR models give a satisfactory result at the 95 per cent confidence level. To answer these questions, two hypothesis tests have been carried out in the empirical section. The first conclusion that can be drawn from the study is that the VaR produced by the Delta-Normal method cannot be distinguished from that produced by Historical simulation. In a hypothesis test for proportions, the result was that only for the Monte Carlo simulation could the null hypothesis not be rejected. This means that there is support for the actual losses exceeding the VaR calculated by the Monte Carlo simulation one time in twenty.
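
A compact sketch of the comparison (my own construction, not the thesis code): the 95% one-day VaR of a single European call option via the Delta-Normal method, historical simulation and Monte Carlo simulation, with Black-Scholes revaluation for the simulation methods and made-up parameter values throughout.

```python
import numpy as np
from scipy import stats

def bs_call(S, K, T, r, sigma):
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * stats.norm.cdf(d1) - K * np.exp(-r * T) * stats.norm.cdf(d2)

S0, K, T, r, sigma = 100.0, 100.0, 0.25, 0.02, 0.25
dt = 1 / 252                          # one-day horizon
price0 = bs_call(S0, K, T, r, sigma)

rng = np.random.default_rng(4)
hist_returns = rng.normal(0, sigma * np.sqrt(dt), 500)   # stand-in for historical returns

# 1) Delta-Normal: linearize the option P&L in the underlying
delta = stats.norm.cdf((np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T)))
var_delta_normal = 1.645 * delta * S0 * sigma * np.sqrt(dt)

# 2) Historical simulation: full revaluation under past one-day returns
pnl_hs = bs_call(S0 * np.exp(hist_returns), K, T - dt, r, sigma) - price0
var_hs = -np.percentile(pnl_hs, 5)

# 3) Monte Carlo simulation: full revaluation under simulated returns
sim_returns = rng.normal(0, sigma * np.sqrt(dt), 100_000)
pnl_mc = bs_call(S0 * np.exp(sim_returns), K, T - dt, r, sigma) - price0
var_mc = -np.percentile(pnl_mc, 5)

print(f"Delta-Normal: {var_delta_normal:.3f}  Historical: {var_hs:.3f}  Monte Carlo: {var_mc:.3f}")
```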
416

Aneuploidy compensatory mechanisms and genome-wide regulation of gene expression in Drosophila melanogaster

Lundberg, Lina January 2013 (has links)
Stimulation or repression of gene expression by genome-wide regulatory mechanisms is an important epigenetic regulatory function which can act to efficiently regulate larger regions or specific groups of genes, for example by compensating for loss or gain of chromosome copy numbers. In Drosophila melanogaster there are two known chromosome-wide regulatory systems: the MSL complex, which mediates dosage compensation of the single male X chromosome, and POF, which stimulates expression from the heterochromatic 4th chromosome. POF also interacts with the heterochromatin-inducing protein HP1a, which represses expression from the 4th chromosome but which has also been assigned stimulatory functions. In addition to these two, there is another, more elusive and less well-characterized genome-wide mechanism called buffering, which can act to balance the transcriptional output of aneuploid regions of the genome (i.e. copy number variation). In my thesis, I describe a novel physical link between dosage compensation and heterochromatin, mediated by two female-specific POF binding sites proximal to roX1 and roX2 (the two non-coding RNAs in the MSL complex) on the X chromosome. These sites can also provide clues to the mechanisms behind the targeting of chromosome-specific proteins. Furthermore, to clarify the conflicting reports about the function of HP1a, I suggest a mechanism in which HP1a has adapted its function to different genomic locations and gene types. Different binding mechanisms at the promoter versus the exons of genes allow HP1a to adopt opposite functions: at the promoter, HP1a binding opens up the chromatin structure and stimulates gene expression, whereas binding to exons condenses the chromatin and thus represses expression. This also causes long genes to be more bound and repressed by HP1a. Moreover, I show that buffering of monosomic regions is a weak but significant response to the loss of chromosomal copy numbers, and that it is mediated via a general mechanism which mainly acts on differentially expressed genes, with a stronger effect for long genes. I also show that POF is the factor which compensates for copy number loss of chromosome 4.
417

An empirical study in risk management: estimation of Value at Risk with GARCH family models

Nyssanov, Askar January 2013 (has links)
In this paper the performance of classical approaches and GARCH family models is evaluated and compared in estimating one-step-ahead VaR. The classical VaR methodology includes historical simulation (HS), RiskMetrics and unconditional approaches. The classical VaR methods and four univariate and two multivariate GARCH models, with Student's t and normal error distributions, have been applied to 5 stock indices and 4 portfolios to determine the best VaR method. We used four evaluation tests to assess the quality of the VaR forecasts: the violation ratio, Kupiec's test, Christoffersen's test and the joint test. The results point out that GARCH-based models produce far more accurate forecasts for both individual and portfolio VaR. RiskMetrics gives reliable VaR predictions but is still substantially inferior to the GARCH models. The choice of an optimal GARCH model depends on the individual asset, and the best model can differ across empirical data sets.
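
Of the four backtests listed, Kupiec's test is easy to show in a few lines; the sketch below (my own, not the paper's code) computes the likelihood-ratio statistic for unconditional coverage on a simulated violation indicator.

```python
import numpy as np
from scipy import stats

def kupiec_test(violations: np.ndarray, p: float):
    """violations: boolean array, True where the loss exceeded VaR; p: nominal level."""
    n = len(violations)
    x = violations.sum()                     # number of violations (assumed > 0 here)
    phat = x / n
    # Likelihood-ratio statistic, asymptotically chi-square with 1 degree of freedom
    lr = -2 * ((n - x) * np.log(1 - p) + x * np.log(p)
               - (n - x) * np.log(1 - phat) - x * np.log(phat))
    return lr, stats.chi2.sf(lr, df=1)

rng = np.random.default_rng(5)
hits = rng.random(1000) < 0.012              # simulated violation indicator at p = 1%
lr, pvalue = kupiec_test(hits, p=0.01)
print(f"LR={lr:.3f}, p-value={pvalue:.3f}")  # a small p-value rejects correct coverage
```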
418

Moment Problems with Applications to Value-At-Risk and Portfolio Management

Tian, Ruilin 07 May 2008 (has links)
My dissertation provides new applications of moment theory and optimization to financial and insurance risk management. In the investment and managerial areas, one often needs to determine some measure of risk, especially the risk of extreme events. However, complete information about the underlying outcomes is usually unavailable; instead one has access to partial information such as the mean, variance, mode, or range. In Chapters 2 and 3, we find the semiparametric upper and lower bounds for the value-at-risk (VaR) with incomplete information, that is, moments of the underlying distribution. When a single variable is concerned, bounds on VaR are computed to obtain a 100% confidence interval. When the sample financial data have a global maximum, we show that a unimodality assumption tightens the optimal bounds. Next we further analyze a function of two correlated random variables; specifically, we find bounds on the probability of two joint extreme events. When three or more variables are involved, the multivariate problem can sometimes be converted to a single-variable problem. In all cases, we use the physical measure rather than the commonly used equivalent pricing probability measure. In addition to solving these problems using the traditional approach based on the geometry of a moment problem, a more efficient method is proposed to solve a general class of moment bounds via semidefinite programming. In the last part of the thesis, we apply optimization techniques to improve financial portfolio risk management. Instead of considering VaR, we work with a coherent risk measure, the conditional VaR (CVaR). As an extension of Krokhmal et al. (2002), we impose CVaR-related functions on the portfolio selection problem. The CVaR approach sets a β-level CVaR as the objective function and maximizes the worst case on the tail of the distribution. The CVaR-like constraints approach adds a set of CVaR-like constraints to the traditional Markowitz problem, reshaping the portfolio distribution. Both methods greatly increase the skewness of portfolios, although the CVaR approach may lose control of the variance. This capability of increasing skewness is very attractive to investors who may prefer a higher probability of obtaining higher returns. We compare the CVaR-related approaches to some other popular portfolio optimization methods. Our numerical analysis provides empirical support for the superiority of the CVaR-like constraints approach in terms of portfolio efficiency.
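
The CVaR optimization mentioned here is commonly formulated as the Rockafellar-Uryasev linear program; the sketch below is my own minimal version on simulated scenario returns (not the dissertation's code), minimizing the β-level CVaR of portfolio losses subject to a target mean return and a full-investment constraint.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(6)
n_scen, n_assets = 500, 4
R = rng.normal(0.001, 0.02, (n_scen, n_assets))      # scenario returns (placeholders)
mu = R.mean(axis=0)
beta, target = 0.95, 0.0005

# Decision variables: weights w (n_assets), VaR level alpha (1), tail excesses u (n_scen)
# CVaR = alpha + 1/((1-beta) n) * sum(u_s),  with  u_s >= -R_s.w - alpha,  u_s >= 0
c = np.concatenate([np.zeros(n_assets), [1.0], np.full(n_scen, 1 / ((1 - beta) * n_scen))])

# Tail constraints: -R_s.w - alpha - u_s <= 0
A_ub = np.hstack([-R, -np.ones((n_scen, 1)), -np.eye(n_scen)])
b_ub = np.zeros(n_scen)
# Mean-return constraint: mu.w >= target  ->  -mu.w <= -target
A_ub = np.vstack([A_ub, np.concatenate([-mu, [0.0], np.zeros(n_scen)])])
b_ub = np.append(b_ub, -target)

A_eq = np.concatenate([np.ones(n_assets), [0.0], np.zeros(n_scen)]).reshape(1, -1)
b_eq = [1.0]                                          # weights sum to one
bounds = [(0, None)] * n_assets + [(None, None)] + [(0, None)] * n_scen

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print("weights:", res.x[:n_assets], "CVaR:", res.fun)
```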
419

Applying Value at Risk (VaR) analysis to Brent Blend Oil prices

Ali Mohamed, Khadar January 2011 (has links)
The purpose of this study is to compare four different VaR models in terms of accuracy, namely Historical Simulation (HS), Simple Moving Average (SMA), Exponentially Weighted Moving Average (EWMA) and Exponentially Weighted Historical Simulation (EWHS). These VaR models are applied to one underlying asset, Brent Blend Oil, using the confidence levels 95%, 99% and 99.9%. Concerning the returns of the asset, the models are studied under two different distributional assumptions, namely the Student's t-distribution and the normal distribution.
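
A brief sketch of the EWMA model listed above (my own, with a simulated stand-in for the Brent Blend return series): a RiskMetrics-style variance recursion and the corresponding one-day normal VaR at the three confidence levels.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
returns = rng.normal(0, 0.02, 1000)     # placeholder daily Brent returns

lam = 0.94                              # RiskMetrics decay factor
var_t = np.empty_like(returns)
var_t[0] = returns[:30].var()           # initialise with a sample variance
for t in range(1, len(returns)):
    var_t[t] = lam * var_t[t - 1] + (1 - lam) * returns[t - 1] ** 2

sigma_next = np.sqrt(var_t[-1])         # one-day-ahead volatility
for level in (0.95, 0.99, 0.999):
    var_estimate = -stats.norm.ppf(1 - level) * sigma_next
    print(f"{level:.1%} one-day VaR: {var_estimate:.4f}")
```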
420

Macroeconomic Shocks and Monetary Policy : Analysis of Sweden and the United Kingdom

Gajic, Ruzica January 2012 (has links)
External economic shocks cause domestic macroeconomic aggregates to fluctuate. This may call for a macroeconomic policy intervention. Since the early 1990s an increasing number of countries have adopted an inflation targeting framework. In reality, inflation targeters do not have perfect information when determining the interest rate in order to maintain their goal of price stability and stable economic growth. Therefore it is relevant to understand how shocks affect the domestic macroeconomic aggregates theoretically and investigate whether the theoretical predictions hold empirically. I use the New Keynesian model by Clarida, Galí and Gertler from 1999 and investigate explicitly the theoretical effects of expected and unexpected supply and demand-side shocks on the monetary policy instrument and the two monetary policy target variables – the interest rate, output gap and inflation rate. By analysing the impulse-response functions of a structural VAR model applied to quarterly Swedish and British data from 1994 to 2011, I test empirically the theoretical predictions according to the New Keynesian model. I find that the empirical results are in line with the theoretical predictions.
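
A minimal sketch of the empirical step, assuming statsmodels: a small VAR in the output gap, inflation and the policy rate with orthogonalized (Cholesky-identified) impulse responses as a stand-in for the structural VAR; the quarterly series are simulated placeholders for the Swedish and UK data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(8)
n = 72  # quarterly data, roughly 1994-2011
data = pd.DataFrame({
    "output_gap": rng.normal(0, 1, n),
    "inflation": rng.normal(2, 0.5, n),
    "policy_rate": rng.normal(3, 1, n),
})

results = VAR(data).fit(2)
irf = results.irf(12)          # responses over 12 quarters
irf.plot(orth=True)            # shock ordering follows the column order above
```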
