1. Constructing efficient portfolios : alternative models and discrete constraints - a computational study (Horniman, Michael David, January 2001)
No description available.
2. A Comprehensive Portfolio Construction Under Stochastic Environment (Elshahat, Ahmed, 21 July 2008)
Prior research has established that the idiosyncratic volatility of security prices exhibits a positive trend. This trend, among other factors, has made the merits of investment diversification and portfolio construction more compelling. A new optimization technique, a greedy algorithm, is proposed to optimize the weights of assets in a portfolio. The main benefits of using this algorithm are to: a) increase the efficiency of the portfolio optimization process, b) implement large-scale optimizations, and c) improve the resulting optimal weights. In addition, the technique uses a novel approach to construct a time-varying covariance matrix, applying a modified integrated dynamic conditional correlation GARCH (IDCC-GARCH) model to capture the dynamics of the conditional covariance matrices. The stochastic nature of expected security returns is incorporated through Monte Carlo simulation: instead of treating expected returns as deterministic values, they are drawn from simulated distributions calibrated to historical data. Each security's time series is fitted to the probability distribution that best matches its characteristics according to the Anderson-Darling goodness-of-fit criterion. Both simulated and actual data sets are used to generalize the results. Using the S&P 500 securities as the base, 2000 simulated data sets are created via Monte Carlo simulation; in addition, the Russell 1000 securities are used to generate 50 sample data sets. The results indicate an improvement in risk-return performance. With Value-at-Risk (VaR) as the evaluation criterion and the Crystal Ball portfolio optimizer, a commercially available product, as the benchmark, the new greedy technique clearly outperforms the alternatives on samples of the S&P 500 and the Russell 1000 securities.
The resulting improvements in performance are consistent among five securities selection methods (maximum, minimum, random, absolute minimum, and absolute maximum) and three covariance structures (unconditional, orthogonal GARCH, and integrated dynamic conditional GARCH).
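The abstract does not spell out the exact greedy procedure, but the basic idea of greedy weight allocation can be sketched as follows: repeatedly hand a small weight increment to whichever asset most improves the portfolio objective. The objective (a mean-variance utility), the step size, and all the numbers below are illustrative assumptions, not the thesis's actual specification.

```python
# Hypothetical sketch of a greedy weight-allocation scheme: at each step,
# assign a small weight increment to the asset that most improves a
# mean-variance objective. Data, step size and objective are illustrative.

def portfolio_objective(w, mu, cov, risk_aversion=3.0):
    """Mean-variance utility: expected return minus a risk penalty."""
    ret = sum(wi * mi for wi, mi in zip(w, mu))
    var = sum(w[i] * w[j] * cov[i][j]
              for i in range(len(w)) for j in range(len(w)))
    return ret - 0.5 * risk_aversion * var

def greedy_weights(mu, cov, step=0.01):
    n = len(mu)
    w = [0.0] * n
    for _ in range(round(1.0 / step)):        # allocate 100% in `step` chunks
        best_i, best_val = None, float("-inf")
        for i in range(n):                    # try the increment on each asset
            trial = w[:]
            trial[i] += step
            val = portfolio_objective(trial, mu, cov)
            if val > best_val:
                best_i, best_val = i, val
        w[best_i] += step
    return w

mu = [0.08, 0.10, 0.07]                       # illustrative expected returns
cov = [[0.04, 0.01, 0.00],                    # illustrative covariance matrix
       [0.01, 0.09, 0.02],
       [0.00, 0.02, 0.03]]
w = greedy_weights(mu, cov)
print([round(x, 2) for x in w], round(sum(w), 2))
```

Each increment requires only a handful of objective evaluations, which is why such schemes scale to large asset universes without solving a full quadratic program.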
3. Examination of long-run performance of momentum portfolios: Implications for the sources and profitability of momentum (Li, Yao, 20 September 2019)
This dissertation investigates the long-term performance of momentum portfolios. Its results show striking asymmetries for winners and losers and imply potentially different causes for the winner and loser components of momentum.
After separately examining winners and losers relative to their respective benchmark portfolios with no momentum, we find winner momentum is smaller in magnitude, persists only for six months, and its higher return fully reverses. This is consistent with the notion that winner momentum is an overreaction to positive news and potentially destabilizing. Loser momentum is larger in magnitude, lasts for about one year, and its lower return does not reverse in the long run. This is consistent with the notion that loser momentum is an underreaction to negative news and suggests investors hold on to losers for too long.
The lack of reversal for losers departs from prior studies, whose findings are driven by the use of monthly rebalanced portfolios. Rebalancing cumulates an upward bias caused by noise-induced price volatility, and this bias disproportionately affects losers. The greater upward bias in losers creates the illusion that the winner-minus-loser return reverses. More appropriate approaches, such as the buy-and-hold portfolio, document significantly less reversal.
Existing theories that potentially conform to the overreaction of winners and underreaction of losers include overconfidence (Daniel, Hirshleifer, and Subrahmanyam, 1998), representativeness and conservatism (Barberis, Shleifer, and Vishny, 1998), interaction between agents holding asymmetric information (Hong and Stein, 1999), and investors' asymmetric response to fund performance (Vayanos and Woolley, 2013). / Doctor of Philosophy / The method employed to study a phenomenon can have an immense impact on our understanding. In the specific context of momentum - a strategy of buying stocks with good past performance and selling stocks with bad past performance - we show that methodology choices as simple as how a portfolio is formed and what it is benchmarked against can produce significantly different results than previously documented. The documented pattern that momentum reverses over the long run is confounded by the use of rebalanced portfolios and by benchmarking winner and loser stocks against each other. Rebalancing embeds an upward bias into the return, with the bias increasing in the amount of noise in the price. Losers, having lower prices and smaller market values, suffer more from this upward bias. Thus, the reversal of the winner-over-loser return is due more to the inflated loser return from the bias than to an underlying economic phenomenon. We confirm this by using a host of other portfolio formation methods that are known to mitigate the bias. With each of the other methods, the loser return and the reversal are significantly reduced. We also suggest comparing winners and losers to a neutral benchmark with no momentum, rather than to each other, to study the long-term pattern of momentum. This exercise turns out to be fruitful in that we find asymmetric behavior for winners and losers. Winner momentum fully reverses while loser momentum persists.
The significance of the new results is that they affect our understanding of what drives momentum in the first place. The reversal of winners implies that winner momentum is an overreaction to positive news and potentially destabilizing. The persistence of losers indicates that loser momentum is an underreaction to negative news and implies investors hold on to losers for too long.
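The noise-induced upward bias the dissertation invokes can be illustrated with a toy simulation (the noise level, horizon, and constant true price are arbitrary assumptions, and this is only the single-asset mechanism, not the dissertation's full portfolio argument): when an observed price merely wiggles around a constant true value, the average single-period simple return is positive by Jensen's inequality, even though the buy-and-hold return stays near zero.

```python
import random

# Toy illustration of the upward bias in averaged single-period returns,
# the mechanism behind the rebalancing bias: observed prices wiggle
# around a constant true price of 100, so buy-and-hold earns ~0, yet the
# mean one-period simple return is positive. Noise level is an assumption.
rng = random.Random(42)
T = 200_000
prices = [100.0 * (1.0 + rng.uniform(-0.05, 0.05)) for _ in range(T)]

single = [prices[t + 1] / prices[t] - 1.0 for t in range(T - 1)]
mean_single = sum(single) / len(single)   # positively biased by price noise
buy_hold = prices[-1] / prices[0] - 1.0   # essentially noise, near zero

print(f"mean single-period return: {mean_single:.5f}")
print(f"buy-and-hold total return: {buy_hold:.5f}")
```

Low-priced, noisy stocks (the losers) have larger proportional noise, so compounding their averaged single-period returns overstates their performance the most, which is the asymmetry the dissertation exploits.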
4. Robustní investiční portfolia / Robust Investment Portfolios (Konfršt, Zdeněk, January 2014)
This master's thesis pursues the construction of stable, robust and growth portfolios in active portfolio management. These portfolios provide limited downside risk, short-lived drawdowns and substantial growth prospects. We argue that the construction of such portfolios rests on security selection as well as on portfolio theory (the Mean-Variance Model, MVM). Equity-based portfolios were constructed and tested on real market data from the 1995-2014 period. The tested portfolios outperformed the S&P 500 both outside and within the risk-reward ratio domain. Robust portfolios build on the MVM but are less sensitive to errors in parameter estimation. We experimented with several robust approaches, and the results confirmed that the robust portfolios were less sensitive to outliers, less volatile and more stable (robust). The bottom-up process of security selection is time consuming and labor intensive; we therefore proposed an alternative approach to selecting stocks using so-called "cluster analysis". Generally, cluster analysis classifies similar objects (companies) into similar groups. Technical and fundamental parameters of the companies provided the needed classification parameters. The classification was run over companies from the German DAX...
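The mean-variance machinery the thesis builds on has a simple closed form in the two-asset case, which makes the construction concrete. A sketch with illustrative numbers (the variances and covariance are assumptions, not the thesis's data):

```python
# Two-asset minimum-variance weights, the simplest instance of the
# mean-variance model (MVM):
#   w1 = (var2 - cov12) / (var1 + var2 - 2 * cov12),  w2 = 1 - w1.
# The variances and covariance below are illustrative.

def min_variance_two_assets(var1, var2, cov12):
    w1 = (var2 - cov12) / (var1 + var2 - 2.0 * cov12)
    return w1, 1.0 - w1

w1, w2 = min_variance_two_assets(var1=0.04, var2=0.09, cov12=0.012)
port_var = 0.04 * w1 ** 2 + 0.09 * w2 ** 2 + 2 * 0.012 * w1 * w2
print(round(w1, 3), round(w2, 3), round(port_var, 4))
```

Note that the resulting portfolio variance is below either asset's own variance, which is the diversification benefit the thesis's robust variants try to preserve while reducing sensitivity to estimation error in the inputs.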
5. Portfolio Construction Using Hierarchical Clustering (Fučík, Vojtěch, January 2017)
The main objective of this thesis is to present and, above all, to interconnect the existing methodology of correlation-matrix filtering, graph algorithms applied to minimum spanning trees, hierarchical clustering, and principal component analysis in order to create quantitative investment strategies. Instead of the traditional use of stock-return time series, residuals from factor models are used. These residuals are the key input for all of the algorithms used to compute the centrality probability of a given stock. The centrality probability is an unconventional probability measure in which a value close to 1 indicates a high probability that the given stock is central in the given economic network. Based on this probability measure, several investment strategies are built and then backtested on the main US stock indices. It cannot be generalized that peripheral strategies consistently outperform central strategies. While profits are stable and the potential average when the classical Markowitz optimization process is used, both types of constructed strategies (central and peripheral) share a high profit potential, which is, however, paid for with high volatility.
6. A Study of Hierarchical Risk Parity in Portfolio Construction (Palit, Debjani, 05 1900)
Portfolio optimization is a process in which capital is allocated among the portfolio assets such that the return on investment is maximized while the risk is minimized. Portfolio construction and optimization is a complex process and has long been an active research area in finance. For portfolios with highly correlated assets, the performance of traditional risk-based asset allocation methods, such as the mean-variance (MV) method, is limited because they require an inversion of the portfolio's covariance matrix to distribute weight among the assets. Alternatively, a hierarchical clustering-based machine learning method can provide a possible solution to these limitations, because it uses hierarchical relationships between the covariances of the assets in a portfolio to distribute weight, and no inversion of the covariance matrix is required. A comparison of the performance, and an analysis of the differences in weight distribution, of two optimization strategies, the traditional MV method and the hierarchical risk parity (HRP) method, a machine learning method, has been performed on real historical price data. In addition, a simple non-optimization technique, the equal-weight (EW) method, has been compared to the two optimization methods. This research supports the idea that HRP is a feasible method for constructing portfolios with correlated assets, because the performance of HRP is comparable to that of both the traditional optimization method and the non-optimization method.
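HRP's inversion-free allocation step can be sketched on its own: given an asset ordering (assumed here to come from a prior hierarchical clustering step), capital is split by recursive bisection, with each cluster's two halves funded in inverse proportion to their variances. This sketch simplifies by treating sub-portfolio covariances as diagonal; the variances and ordering are illustrative.

```python
# Sketch of HRP's recursive bisection (after clustering has ordered the
# assets): capital is split between the two halves of each cluster in
# inverse proportion to the halves' variances, so no covariance-matrix
# inversion is needed. Diagonal covariance is assumed for simplicity.

def inverse_variance_weights(variances):
    inv = [1.0 / v for v in variances]
    s = sum(inv)
    return [x / s for x in inv]

def cluster_variance(variances):
    """Variance of an inverse-variance-weighted sub-portfolio."""
    w = inverse_variance_weights(variances)
    return sum(wi * wi * vi for wi, vi in zip(w, variances))

def recursive_bisection(variances):
    weights = [1.0] * len(variances)

    def split(lo, hi):                  # operate on the ordered slice [lo, hi)
        if hi - lo <= 1:
            return
        mid = (lo + hi) // 2
        v_left = cluster_variance(variances[lo:mid])
        v_right = cluster_variance(variances[mid:hi])
        alpha = 1.0 - v_left / (v_left + v_right)  # calmer half gets more
        for i in range(lo, mid):
            weights[i] *= alpha
        for i in range(mid, hi):
            weights[i] *= 1.0 - alpha
        split(lo, mid)
        split(mid, hi)

    split(0, len(variances))
    return weights

variances = [0.04, 0.09, 0.01, 0.16]    # assets already in cluster order
w = recursive_bisection(variances)
print([round(x, 3) for x in w], round(sum(w), 3))
```

Because only variances of sub-portfolios are needed, the procedure stays numerically stable even when the full covariance matrix is nearly singular, which is exactly the regime where MV's matrix inversion breaks down.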
7. Is Fine art a viable alternative investment? (Thomas, Vincent, January 2012)
This paper studies the art market as an investment opportunity. We set aside the artistic characteristics of the market (history of art, aesthetics, technique...) and focus only on its business and economic aspects, treating artworks as tradable goods. Our goal is to determine whether or not the art market is a suitable investment vehicle, offering an interesting avenue for investment diversification. The paper takes a closer look at the recent financial crisis period, trying to understand the mechanism that binds the financial industry to the art industry. This is the key to introducing an investment portfolio that includes art as an asset class. Focusing on the performance of such a portfolio, we give some further recommendations on how to reach a better-than-expected performance.
8. A Multi-Level Extension of the Hierarchical PCA Framework with Applications to Portfolio Construction with Futures Contracts (Bjelle, Kajsa, January 2023)
With an increasingly globalised market and a growing asset universe, estimating the market covariance matrix becomes ever more challenging. In recent years, there has been extensive development of methods aimed at mitigating these issues. This thesis takes as its starting point the recently developed Hierarchical Principal Component Analysis, in which a priori known information is taken into account when modelling the market correlation matrix. While showing promising results, the current framework only allows for fairly simple hierarchies with a depth of one. In this thesis, we introduce a generalisation of the framework that allows for an arbitrary hierarchical depth. We also evaluate the method in a risk-based portfolio allocation setting with futures contracts. Furthermore, we introduce a shrinkage method called Hierarchical Shrinkage, which uses the hierarchical structure to further regularise the matrix. The proposed models are evaluated with respect to how well-conditioned they are, how well they predict eigenportfolio risk, and how they perform when used to form the Minimum Variance Portfolio. We show that the proposed models result in sparse, easy-to-interpret eigenvector structures, improved risk prediction, lower condition numbers and longer holding periods, while achieving Sharpe ratios on par with our benchmarks.
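The abstract does not define Hierarchical Shrinkage, so the sketch below shows only the basic linear-shrinkage building block such methods extend: pulling a correlation matrix toward the identity, which damps extreme off-diagonal entries and lowers the condition number. The matrix and the shrinkage intensity are illustrative assumptions.

```python
# Basic linear shrinkage of a correlation matrix toward the identity:
#   R_shrunk = (1 - alpha) * R + alpha * I.
# This is only the generic building block, not the thesis's Hierarchical
# Shrinkage; the matrix and intensity alpha are illustrative.

def shrink_correlation(R, alpha):
    n = len(R)
    return [[(1.0 - alpha) * R[i][j] + (alpha if i == j else 0.0)
             for j in range(n)] for i in range(n)]

R = [[1.00, 0.95, 0.90],
     [0.95, 1.00, 0.92],
     [0.90, 0.92, 1.00]]
R_shrunk = shrink_correlation(R, alpha=0.2)
print(R_shrunk[0])   # diagonal stays 1.0, off-diagonals move toward 0
```

A hierarchical variant would apply different intensities at different levels of the cluster tree, so that within-cluster correlations are shrunk less than between-cluster ones.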
9. Dynamic portfolio construction and portfolio risk measurement (Mazibas, Murat, January 2011)
The research presented in this thesis addresses different aspects of dynamic portfolio construction and portfolio risk measurement. It brings together research on dynamic portfolio optimization, replicating portfolio construction, dynamic portfolio risk measurement and volatility forecasting. The overall aim of this research is threefold. First, to examine the portfolio construction and risk measurement performance of a broad set of volatility forecast and portfolio optimization models. Second, in an effort to improve their forecast accuracy and portfolio construction performance, to propose new models or new formulations of the available models. Third, in order to enhance the replication of hedge fund returns, to introduce a replication approach that has the potential to be used in numerous applications in investment management. To achieve these aims, Chapter 2 addresses risk measurement in dynamic portfolio construction. In this chapter, further evidence on the use of multivariate conditional volatility models in hedge fund risk measurement and portfolio allocation is provided, using monthly returns of hedge fund strategy indices for the period 1990 to 2009. Building on Giamouridis and Vrontos (2007), a broad set of multivariate GARCH models, as well as the simpler exponentially weighted moving average (EWMA) estimator of RiskMetrics (1996), are considered. It is found that, while multivariate GARCH models provide some improvement in portfolio performance over static models, they are generally dominated by the EWMA model. In particular, in addition to providing better risk-adjusted performance, the EWMA model leads to dynamic allocation strategies that have substantially lower turnover and could therefore be expected to involve lower transaction costs. Moreover, it is shown that these results are robust across the low-volatility and high-volatility sub-periods.
Chapter 3 addresses optimization in dynamic portfolio construction. In this chapter, the advantages of introducing alternative optimization frameworks over the mean-variance framework in constructing hedge fund portfolios for a fund of funds are examined. Using monthly return data of hedge fund strategy indices for the period 1990 to 2011, the standard mean-variance approach is compared with approaches based on CVaR, CDaR and Omega, for both conservative and aggressive hedge fund investors. In order to estimate portfolio CVaR, CDaR and Omega, a semi-parametric approach is proposed, in which the marginal density of each hedge fund index is first modelled using extreme value theory and the joint density of hedge fund index returns is constructed using a copula-based approach. Hedge fund returns are then simulated from this joint density in order to compute CVaR, CDaR and Omega. The semi-parametric approach is compared with the standard non-parametric approach, in which the quantiles of the marginal density of portfolio returns are estimated empirically and used to compute CVaR, CDaR and Omega. Two main findings are reported. The first is that CVaR-, CDaR- and Omega-based optimization offers a significant improvement in risk-adjusted portfolio performance over mean-variance optimization. The second is that, for all three risk measures, semi-parametric estimation of the optimal portfolio offers a very significant improvement over non-parametric estimation. The results are robust to the choice of target return and the estimation period. Chapter 4 seeks improvements in portfolio risk measurement by addressing volatility forecasting. In this chapter, two new univariate Markov regime switching models based on intraday range are introduced. A regime switching conditional volatility model is combined with a robust measure of volatility based on intraday range, in a framework for volatility forecasting.
This chapter proposes a one-factor and a two-factor model that combine useful properties of the range, regime switching, nonlinear filtration, and GARCH frameworks. Incremental improvements in volatility forecasting performance are sought by employing regime switching in a conditional volatility setting with enhanced information content on true volatility. Weekly S&P 500 index data for 1982-2010 are used. The models are evaluated using a number of volatility proxies that approximate true integrated volatility. The forecast performance of the proposed models is compared to renowned return-based and range-based models, namely the EWMA of RiskMetrics, the hybrid EWMA of Harris and Yilmaz (2009), the GARCH of Bollerslev (1986), the CARR of Chou (2005), the FIGARCH of Baillie et al. (1996) and the MRSGARCH of Klaassen (2002). It is found that the proposed models produce more accurate out-of-sample forecasts, contain more information about true volatility and exhibit similar or better performance when used for value-at-risk comparison. Chapter 5 seeks improvements in risk measurement for better dynamic portfolio construction. This chapter proposes multivariate versions of the one- and two-factor MRSACR models introduced in the fourth chapter. In these models, useful properties of regime switching models, nonlinear filtration and the range-based estimator are combined in a multivariate setting, based on static and dynamic correlation estimates. In comparing the out-of-sample forecast performance of these models, eminent return- and range-based volatility models are employed as benchmarks. A hedge fund portfolio construction exercise is conducted in order to investigate the out-of-sample portfolio performance of the proposed models. The out-of-sample performance of each model is also tested using a number of statistical tests.
In particular, a broad range of statistical tests and loss functions are utilized in evaluating the forecast performance of the variance-covariance matrix of each portfolio. It is found that, in terms of statistical test results, the proposed models offer significant improvements in forecasting the true volatility process, and, in terms of the risk and return criteria employed, the proposed models perform better than the benchmark models. The proposed models construct hedge fund portfolios with higher risk-adjusted returns and lower tail risks, and offer superior risk-return tradeoffs and better active management ratios. However, in most cases these improvements come at the expense of higher portfolio turnover and rebalancing expenses. Chapter 6 addresses dynamic portfolio construction for better hedge fund return replication and proposes a new approach. In this chapter, a method for hedge fund replication is proposed that uses a factor-based model supplemented with a series of risk and return constraints that implicitly target all the moments of the hedge fund return distribution. The approach is used to replicate the monthly returns of ten broad hedge fund strategy indices, using long-only positions in ten equity, bond, foreign exchange, and commodity indices, all of which can be traded using liquid, investible instruments such as futures, options and exchange-traded funds. In out-of-sample tests, the proposed approach provides an improvement over the pure factor-based model, offering a closer match to both the return performance and the risk characteristics of the hedge fund strategy indices.
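The EWMA estimator that serves as this thesis's recurring benchmark follows the standard RiskMetrics recursion, sigma2_t = lam * sigma2_{t-1} + (1 - lam) * r2_{t-1}. A minimal sketch (the return series and the seeding choice are illustrative; RiskMetrics recommends lam = 0.94 for daily data and 0.97 for monthly):

```python
# RiskMetrics-style EWMA variance recursion:
#   sigma2_t = lam * sigma2_{t-1} + (1 - lam) * r_{t-1}^2.
# Returns and the seed variance are illustrative assumptions.

def ewma_variance(returns, lam=0.94, seed_var=None):
    sigma2 = seed_var if seed_var is not None else returns[0] ** 2
    path = [sigma2]
    for r in returns[:-1]:
        sigma2 = lam * sigma2 + (1.0 - lam) * r * r
        path.append(sigma2)
    return path                 # path[t] is the variance forecast for period t

returns = [0.01, -0.02, 0.015, -0.005, 0.03]
vols = [v ** 0.5 for v in ewma_variance(returns)]
print([round(v, 4) for v in vols])
```

The recursion needs only the previous forecast and the latest squared return, which is why, despite its simplicity, it is hard for heavily parameterized GARCH models to beat in turnover-sensitive allocation settings like those of Chapter 2.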
10. Hierarchical Clustering in Risk-Based Portfolio Construction (Nanakorn, Natasha; Palmgren, Elin, January 2021)
Following the global financial crisis, risk-based and heuristic portfolio construction methods have received much attention from both academics and practitioners, since these methods do not rely on estimates of expected returns and are therefore assumed to be more stable than Markowitz's traditional mean-variance portfolio. In 2016, López de Prado presented Hierarchical Risk Parity (HRP), a new approach to portfolio construction that combines hierarchical clustering of assets with a heuristic risk-based allocation strategy in order to increase stability and improve out-of-sample performance. Using Monte Carlo simulations, López de Prado was able to demonstrate promising results. This thesis evaluates HRP using walk-forward analysis and historical data from equity index and bond futures, against more realistic benchmark methods and using additional performance measures relevant to practitioners. The main conclusion is that applying hierarchical clustering to risk-based portfolio construction does indeed improve the out-of-sample return and Sharpe ratio. However, the resulting portfolio is also associated with remarkably high turnover, which may indicate numerical instability and sensitivity to estimation errors. We also identify an undesirable property of López de Prado's original HRP approach and consequently develop alternative approaches to HRP. Compared to López de Prado's original approach, these alternatives increase the Sharpe ratio by ~10% and reduce turnover by 60-65%. It should be noted, however, that the turnover is still rather high compared to more mainstream portfolios, indicating that these alternative approaches to HRP remain somewhat unstable and sensitive to estimation errors.
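The turnover that both HRP evaluations flag can be quantified with the standard metric: at each rebalance, sum the absolute changes in portfolio weights (a value of 2.0 means the entire portfolio was sold and replaced). A sketch with illustrative weight paths; the thesis's exact turnover definition may differ:

```python
# Standard turnover metric: at each rebalance, sum the absolute weight
# changes, then average over rebalances. The weight paths are illustrative.

def average_turnover(weight_history):
    turnovers = []
    for prev, curr in zip(weight_history, weight_history[1:]):
        turnovers.append(sum(abs(c - p) for p, c in zip(prev, curr)))
    return sum(turnovers) / len(turnovers)

stable_path = [[0.50, 0.50], [0.52, 0.48], [0.49, 0.51]]   # low turnover
jumpy_path = [[0.50, 0.50], [0.90, 0.10], [0.20, 0.80]]    # high turnover
print(round(average_turnover(stable_path), 3))
print(round(average_turnover(jumpy_path), 3))
```

Scaled by a per-trade cost estimate, this metric converts the "remarkably high turnover" finding directly into a drag on net returns, which is why the thesis treats it as evidence of estimation-error sensitivity rather than a cosmetic issue.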