Many mathematical programming models of investment portfolio selection assume that the best portfolio at any given level of risk is the one with the highest expected return. The expected return is defined as a linear combination of the expected returns of the individual investments in the portfolio, and risk is defined in terms of the variance of return. This study uses Monte Carlo simulation to establish that even when the estimates of future returns on potential investments are unbiased, the procedure used in the standard models overestimates the steady-state return on the portfolio. Under reasonable assumptions about the parameters of the return estimates, this bias is sizeable: the steady-state predicted return often overestimates the steady-state actual return by more than ten percentage points. It is also shown that when the variances of the alternative investments are not all equal, a constraint on the portfolio variance reduces the magnitude of the bias. In many reasonable cases, constraining the portfolio variance reduces the bias by more than it reduces the predicted portfolio return, causing the steady-state actual return to rise. This implies that return cannot automatically be assumed to be a monotonic function of risk.
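The mechanism behind the bias the abstract describes can be illustrated with a minimal Monte Carlo sketch: even when each asset's return estimate is unbiased, selecting the asset with the highest *estimated* return systematically overstates the return that will actually be realized, because the maximum of several unbiased estimates is biased upward. The parameter values below (true return, estimation-error spread, number of assets) are illustrative assumptions, not figures from the study.

```python
import random

# Hypothetical illustration of the selection bias discussed in the abstract.
# Assumed setup: every asset has the same true expected return, and each
# estimate equals the true return plus unbiased Gaussian estimation error.
random.seed(0)

TRUE_RETURN = 0.08    # assumed common true expected return per asset
NOISE_SD = 0.05       # assumed std. dev. of the estimation error
N_ASSETS = 20         # assumed number of candidate investments
N_TRIALS = 5000       # number of Monte Carlo repetitions

predicted = 0.0       # average return *predicted* for the chosen asset
actual = 0.0          # average return actually *realized* by that asset

for _ in range(N_TRIALS):
    # Unbiased estimates: each is centered on the true return.
    estimates = [random.gauss(TRUE_RETURN, NOISE_SD) for _ in range(N_ASSETS)]
    # The standard procedure picks the asset with the highest estimate...
    predicted += max(estimates)
    # ...but the chosen asset's realized return is still the true return.
    actual += TRUE_RETURN

predicted /= N_TRIALS
actual /= N_TRIALS
print(f"predicted return: {predicted:.3f}, actual return: {actual:.3f}")
```

Because the selection step conditions on the largest of twenty noisy estimates, the predicted figure lands well above the actual one, reproducing in miniature the overestimation the dissertation quantifies for full portfolios.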
Identifier | oai:union.ndltd.org:unt.edu/info:ark/67531/metadc500860
Date | 12 1900
Creators | Valentine, Jerome Lynn |
Contributors | Griffith, Reynolds, Cochran, Kendall P., Hays, Henry, Williams, Fredrik P. |
Publisher | North Texas State University |
Source Sets | University of North Texas |
Language | English |
Detected Language | English |
Type | Thesis or Dissertation |
Format | xvi, 178 leaves: ill., Text |
Rights | Public. Copyright is held by the author, Valentine, Jerome Lynn, unless otherwise noted. All rights reserved.