1 |
Bankruptcy : a proportional hazard approach. Betton, Sandra Ann, January 1987 (has links)
The recent dramatic increase in the corporate bankruptcy rate, coupled with a similar rate of increase in the bank failure rate, has re-awakened investor, lender and government interest in the area of bankruptcy prediction. Bankruptcy prediction models are of particular value to a firm's current and future creditors who often do not have the benefit of an actively traded market in the firm's securities from which to make inferences about the debtor's viability. The models commonly used by many experts in an endeavour to predict the possibility of disaster are outlined in this paper.
The proportional hazard model, pioneered by Cox [1972], assumes that the hazard function (the risk of failure, given that failure has not already occurred) is the product of a function of explanatory variables and estimated coefficients with an arbitrary, unknown function of time. The Cox proportional hazard model is usually applied in medical studies, but has recently been applied to the bank failure question [Lane, Looney & Wansley, 1986]. The model performed well in the narrowly defined, highly regulated banking industry. The principal advantage of this approach is that the model incorporates both the observed survival times and any censoring of the data, thereby using more of the available information in the analysis. Unlike many bankruptcy prediction models, such as logit- and probit-based regression models, the Cox model estimates the probability distribution of survival times. The proportional hazard model would, therefore, appear to offer a useful addition to the more traditional bankruptcy prediction models mentioned above.
This paper evaluates the applicability of the Cox proportional hazard model in the more diverse industrial environment. In order to test the model, a sample of 109 firms was selected from the Compustat Industrial and Research Industrial data tapes. Forty-one of these firms filed petitions under the various bankruptcy acts applicable between 1972 and 1985 and were matched to 67 firms which had not filed petitions for bankruptcy during the same period. In view of the dramatic changes in the bankruptcy regulatory environment caused by the Bankruptcy Reform Act of 1978, the legal framework of the bankruptcy process was also examined.
The performance of the estimated Cox model was then evaluated by comparing its classification and descriptive capabilities to those of an estimated discriminant analysis based model. The results of this study indicate that while the classification capability of the Cox model was less than that of discriminant analysis, the model provides additional information beyond that available from the discriminant analysis. / Business, Sauder School of / Graduate
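The proportionality at the heart of the Cox model can be shown in a few lines: the hazard is an unknown baseline h0(t) scaled by exp(β·x), so the hazard ratio of two firms depends on their covariates but not on time. The sketch below is illustrative only; the covariates and coefficients are assumed values, not estimates from this thesis.

```python
import math

def partial_hazard(beta, x):
    """exp(beta . x): the firm-specific multiplier on the baseline hazard h0(t)."""
    return math.exp(sum(b * xi for b, xi in zip(beta, x)))

# Assumed coefficients for (liquidity ratio, leverage) and two hypothetical firms.
beta = [-1.2, 0.8]
firm_a = [0.5, 0.9]   # low liquidity, high leverage
firm_b = [1.5, 0.3]   # healthier firm

# The hazard ratio cancels the unknown baseline h0(t), so it is constant in t:
# this is the defining "proportional hazards" assumption.
ratio = partial_hazard(beta, firm_a) / partial_hazard(beta, firm_b)
```

Because the baseline cancels, the ratio equals exp(β·(x_a − x_b)) regardless of the (unestimated) time function, which is what lets the model use censored observations.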
|
2 |
Weather neutral models for short-term electricity demand forecasting. Nyulu, Thandekile, January 2013 (has links)
Energy demand forecasting, and specifically electricity demand forecasting, is a fundamental activity in both industry and research. Forecasting techniques assist all electricity market participants in making accurate planning, selling and purchasing decisions and strategies. Generation and distribution of electricity require appropriate, precise and accurate forecasting methods, and accurate forecasting models assist producers, researchers and economists in making sound future decisions. Several research papers investigate this fundamental problem and attempt various statistical techniques. Although weather and economic effects have significant influences on electricity demand, in this study they are purposely excluded from the investigation. This research considers calendar-related effects such as months of the year, weekdays and holidays (that is, public holidays, the day before a public holiday, the day after a public holiday, school holidays, university holidays, Easter holidays and major religious holidays) and includes university exams, general election days, the day after elections, and municipal elections in the analysis. Regression analysis, categorical regression and auto-regression are used to illustrate the relationships between the response variable and the explanatory variables. The main objective of the investigation was to build forecasting models based on this calendar data only and to observe how accurate the models can be without taking weather and economic effects into account, hence weather-neutral models. Weather and economic factors must themselves be forecast, and those forecasts carry error, whereas calendar events are known in advance (error-free). Moreover, collecting data on weather and economic factors is costly and time-consuming, while obtaining calendar data is relatively easy.
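A calendar-only model of this kind can be sketched as a dummy-variable regression. The toy data below are simulated; the day-of-week and holiday effects are assumed numbers for illustration, not the thesis's estimates.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
dow = rng.integers(0, 7, size=n)        # day of week, 0..6 (0 = Monday, say)
holiday = rng.integers(0, 2, size=n)    # 1 on a public holiday

# Assumed "true" calendar effects used to simulate demand (arbitrary units).
dow_effect = np.array([30.0, 32, 31, 31, 33, 20, 18])
demand = 100 + dow_effect[dow] - 15 * holiday + rng.normal(0, 1, n)

# Design matrix: intercept + 6 day-of-week dummies (day 0 as base) + holiday dummy.
X = np.column_stack([np.ones(n)]
                    + [(dow == d).astype(float) for d in range(1, 7)]
                    + [holiday.astype(float)])
coef, *_ = np.linalg.lstsq(X, demand, rcond=None)
# coef[-1] recovers the holiday effect; no weather or economic inputs are used.
```

The point of the sketch is that every regressor is known in advance with certainty, which is exactly the appeal of weather-neutral models described above.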
|
3 |
A limited area primitive equation weather prediction model for Hong Kong. 陳鋈鋆, Chan, Yuk-kwan, January 1984 (has links)
published_or_final_version / Mathematics / Master / Master of Philosophy
|
4 |
The aliased and the de-aliased spectral models of the shallow water equationsUnknown Date (has links)
"The most widely used spectral models with the transform method are de-aliased spectral models, in which a de-aliasing technique based on the 3/2-rule is applied in the discrete Fourier transform. From the viewpoint of the Walsh-Hadamard transform, multiplying the values of the variables on the gridpoints does not yield aliasing terms. In the shallow water equations, we compare the aliased spectral model with the de-aliased spectral model using the initial conditions of the Rossby-Haurwitz wave and the FGGE data. The aliased spectral models are more accurate and more efficient than the de-aliased spectral models. For the same wavenumber truncation, the computational cost of the aliased spectral model is only 60 percent of that of the de-aliased spectral model. We have not yet observed nonlinear computational instability induced by the aliasing terms in long time integrations of the aliased spectral models. Thus, in spectral models with the transform method, the necessity of using the 3/2-rule in the discrete Fourier transform may be viewed with suspicion"--Abstract. / Typescript. / "Spring Semester, 1991." / "Submitted to the Department of Meteorology in partial fulfillment of the requirements for the degree of Master of Science." / Advisor: Richard L. Pfeffer, Professor Directing Thesis. / Includes bibliographical references (leaves 92-95).
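The aliasing mechanism and the 3/2-rule being compared can be illustrated in one dimension with a discrete Fourier transform. This sketch only demonstrates the mechanics, not the shallow-water model itself: squaring cos(3x) produces wavenumber 6, which on an 8-point grid folds back onto wavenumber 2, while padding to 3N/2 = 12 points before multiplying removes that spurious mode.

```python
import numpy as np

N = 8
x = 2 * np.pi * np.arange(N) / N
u = np.cos(3 * x)                        # field with a single wavenumber, k = 3

# Aliased product: multiply on the N-point grid. cos(3x)^2 = 1/2 + cos(6x)/2,
# and wavenumber 6 exceeds the Nyquist mode N/2 = 4, folding onto k = 6 - 8 = -2.
w_aliased = np.fft.fft(u * u) / N

# De-aliased product via the 3/2-rule: zero-pad the spectrum to M = 3N/2 points,
# multiply on the fine grid, transform back, and truncate to the original modes.
M = 3 * N // 2
uh = np.fft.fft(u) / N
uh_pad = np.zeros(M, complex)
uh_pad[:N // 2] = uh[:N // 2]
uh_pad[-(N // 2):] = uh[-(N // 2):]
u_fine = np.fft.ifft(uh_pad) * M
w_hat_fine = np.fft.fft(u_fine * u_fine) / M
# Retain only the modes representable on the original grid; k = 6 is discarded.
w_dealiased = np.concatenate([w_hat_fine[:N // 2], w_hat_fine[-(N // 2):]])
```

The aliased spectrum shows spurious energy at k = 2; the de-aliased one keeps only the correct mean term. The abstract's point is that, in practice, this contamination did not degrade (and paying 3/2 the gridpoints did not improve) the shallow-water integrations.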
|
5 |
A simulation approach to evaluate combining forecasts methods. January 1994 (has links)
by Ho Kwong-shing Lawrence. / Thesis (M.B.A.)--Chinese University of Hong Kong, 1994. / Includes bibliographical references (leaves 43-44). / ABSTRACT --- p.ii / TABLE OF CONTENTS --- p.iii / ACKNOWLEDGEMENT --- p.iv / CHAPTER / Chapter I. --- INTRODUCTION AND LITERATURE REVIEW --- p.1 / Chapter II. --- COMBINING SALES FORECASTS --- p.7 / Chapter III. --- EXPERIMENTAL DESIGN --- p.14 / Chapter IV. --- SIMULATION RESULTS --- p.19 / Chapter V. --- SUMMARY AND CONCLUSION --- p.27 / APPENDIX --- p.31 / BIBLIOGRAPHY --- p.43
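Two standard combination schemes of the kind such simulation studies compare can be sketched briefly: a simple average, and inverse-MSE weights estimated from past errors. The error histories and forecast values below are assumed toy numbers, not the thesis's experimental design.

```python
def equal_weight(forecasts):
    """Simple average of the individual forecasts."""
    return sum(forecasts) / len(forecasts)

def inverse_mse_weights(past_errors):
    """Weights proportional to 1/MSE; past_errors[i] is forecaster i's error history."""
    mses = [sum(e * e for e in errs) / len(errs) for errs in past_errors]
    inv = [1.0 / m for m in mses]
    s = sum(inv)
    return [w / s for w in inv]

# Forecaster 1 has MSE 1, forecaster 2 has MSE 4, so the weights are 0.8 and 0.2.
weights = inverse_mse_weights([[1.0, -1.0], [2.0, -2.0]])
combined = sum(w * f for w, f in zip(weights, [10.0, 14.0]))
```

A simulation study of the kind described would generate many error histories under controlled conditions and compare the out-of-sample accuracy of such schemes.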
|
6 |
Using bootstrap in capture-recapture model. January 2001 (has links)
Yung Wun Na. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2001. / Includes bibliographical references (leaves 60-62). / Abstracts in English and Chinese. / Chapter 1 --- Introduction --- p.1 / Chapter 2 --- Statistical Modeling --- p.4 / Chapter 2.1 --- Capture Recapture Model --- p.4 / Chapter 2.1.1 --- Petersen Estimate --- p.5 / Chapter 2.1.2 --- Chapman Estimate --- p.8 / Chapter 2.2 --- The Bootstrap Method --- p.9 / Chapter 2.2.1 --- The Bootstrap Percentile Method --- p.10 / Chapter 2.3 --- The Double Bootstrap Method --- p.12 / Chapter 2.3.1 --- The Robbins-Monro Method --- p.12 / Chapter 2.3.2 --- Confidence Interval generated by the Robbins-Monro Method --- p.13 / Chapter 2.3.3 --- Three Different Approaches --- p.16 / Chapter 3 --- Empirical Study --- p.19 / Chapter 3.1 --- Introduction --- p.19 / Chapter 3.2 --- Double Bootstrap Method --- p.20 / Chapter 3.2.1 --- Petersen Estimate --- p.20 / Chapter 3.2.2 --- Chapman Estimate --- p.27 / Chapter 3.2.3 --- Comparison of Petersen and Chapman Estimates --- p.31 / Chapter 3.3 --- Conclusion --- p.33 / Chapter 4 --- Simulation Study --- p.35 / Chapter 4.1 --- Introduction --- p.35 / Chapter 4.2 --- Simulation Results of Double Bootstrap Method --- p.36 / Chapter 5 --- Conclusion and Discussion --- p.52 / References --- p.60
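The two estimators, together with a bootstrap percentile interval of the kind the thesis studies, can be sketched as follows. The capture counts and the binomial resampling scheme are illustrative assumptions, not the thesis's data or its exact resampling design.

```python
import random

def petersen(n1, n2, m2):
    """Petersen estimate of population size N (requires m2 > 0)."""
    return n1 * n2 / m2

def chapman(n1, n2, m2):
    """Chapman's bias-corrected variant of the Petersen estimate."""
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

def bootstrap_ci(n1, n2, m2, B=2000, level=0.95, seed=1):
    """Percentile CI for Chapman's estimate, resampling m2 ~ Binomial(n2, m2/n2)."""
    rng = random.Random(seed)
    p = m2 / n2
    stats = sorted(
        chapman(n1, n2, sum(rng.random() < p for _ in range(n2)))
        for _ in range(B)
    )
    lo = stats[int((1 - level) / 2 * B)]
    hi = stats[int((1 + level) / 2 * B) - 1]
    return lo, hi

n1, n2, m2 = 200, 150, 30        # assumed capture, recapture, and marked counts
```

With these counts the Petersen estimate is 1000 animals; the percentile interval brackets the Chapman point estimate.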
|
7 |
Simultaneous prediction intervals for multiplicative Holt-Winters forecasting procedure. January 2006 (has links)
Wong Yiu Kei. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2006. / Includes bibliographical references (leaves 68-70). / Abstracts in English and Chinese. / Chapter 1 --- Introduction --- p.1 / Chapter 1.1 --- The Importance of Multiple Forecasting and Examples --- p.2 / Chapter 1.2 --- Previous Literature on Prediction Interval and Simultaneous Prediction Interval --- p.3 / Chapter 1.3 --- Objectives --- p.7 / Chapter 2 --- The Holt-Winters forecasting procedure --- p.8 / Chapter 2.1 --- Exponential Smoothing --- p.8 / Chapter 2.2 --- Holt-Winters Forecasting Procedure --- p.10 / Chapter 2.2.1 --- Additive Holt-Winters Model --- p.14 / Chapter 2.2.2 --- Multiplicative Holt-Winters Model --- p.16 / Chapter 2.3 --- Some Practical Issues --- p.18 / Chapter 2.3.1 --- Choosing Starting Values --- p.19 / Chapter 2.3.2 --- Choosing the Smoothing Parameters --- p.20 / Chapter 3 --- Constructing Simultaneous Prediction Intervals Method --- p.23 / Chapter 3.1 --- Bonferroni Procedure --- p.24 / Chapter 3.2 --- The 'Exact' Procedure --- p.25 / Chapter 3.3 --- Summary --- p.25 / Chapter 3.4 --- Covariance of forecast errors --- p.26 / Chapter 3.4.1 --- Yar and Chatfield's approach --- p.26 / Chapter 3.4.2 --- "Koehler, Snyder and Ord Approach" --- p.28 / Chapter 3.5 --- Simulation Study --- p.31 / Chapter 4 --- An Illustrative Example --- p.37 / Chapter 5 --- Simulation --- p.56 / Chapter 5.1 --- Conclusion --- p.62 / Appendix --- p.64 / References --- p.68
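The Bonferroni construction listed in the contents is simple to state: for k forecast lead times and joint coverage 1 − α, each marginal interval is computed at level 1 − α/k. A minimal sketch under normal forecast errors follows; the point forecasts and error standard deviations are assumed toy values, not Holt-Winters output.

```python
from statistics import NormalDist

def bonferroni_intervals(forecasts, sds, alpha=0.05):
    """Simultaneous prediction intervals: each of the k intervals at level 1 - alpha/k."""
    k = len(forecasts)
    z = NormalDist().inv_cdf(1 - alpha / (2 * k))
    return [(f - z * s, f + z * s) for f, s in zip(forecasts, sds)]

# Three lead times with assumed forecasts and forecast-error standard deviations.
intervals = bonferroni_intervals([100.0, 103.0, 107.0], [2.0, 2.5, 3.0])
```

Each interval is wider than its marginal 95% counterpart (z ≈ 2.39 instead of 1.96), which is the price paid for joint coverage; the "exact" procedure mentioned above sharpens this by using the covariance of the forecast errors.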
|
8 |
Margin variations in support vector regression for the stock market prediction. January 2003 (has links)
Yang, Haiqin. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2003. / Includes bibliographical references (leaves 98-109). / Abstracts in English and Chinese. / Abstract --- p.ii / Acknowledgement --- p.v / Chapter 1 --- Introduction --- p.1 / Chapter 1.1 --- Time Series Prediction and Its Problems --- p.1 / Chapter 1.2 --- Major Contributions --- p.2 / Chapter 1.3 --- Thesis Organization --- p.3 / Chapter 1.4 --- Notation --- p.4 / Chapter 2 --- Literature Review --- p.5 / Chapter 2.1 --- Framework --- p.6 / Chapter 2.1.1 --- Data Processing --- p.8 / Chapter 2.1.2 --- Model Building --- p.10 / Chapter 2.1.3 --- Forecasting Procedure --- p.12 / Chapter 2.2 --- Model Descriptions --- p.13 / Chapter 2.2.1 --- Linear Models --- p.15 / Chapter 2.2.2 --- Non-linear Models --- p.17 / Chapter 2.2.3 --- ARMA Models --- p.21 / Chapter 2.2.4 --- Support Vector Machines --- p.23 / Chapter 3 --- Support Vector Regression --- p.27 / Chapter 3.1 --- Regression Problem --- p.27 / Chapter 3.2 --- Loss Function --- p.29 / Chapter 3.3 --- Kernel Function --- p.34 / Chapter 3.4 --- Relation to Other Models --- p.36 / Chapter 3.4.1 --- Relation to Support Vector Classification --- p.36 / Chapter 3.4.2 --- Relation to Ridge Regression --- p.38 / Chapter 3.4.3 --- Relation to Radial Basis Function --- p.40 / Chapter 3.5 --- Implemented Algorithms --- p.40 / Chapter 4 --- Margins in Support Vector Regression --- p.46 / Chapter 4.1 --- Problem --- p.47 / Chapter 4.2 --- General ε-insensitive Loss Function --- p.48 / Chapter 4.3 --- Accuracy Metrics and Risk Measures --- p.52 / Chapter 5 --- Margin Variation --- p.55 / Chapter 5.1 --- Non-fixed Margin Cases --- p.55 / Chapter 5.1.1 --- Momentum --- p.55 / Chapter 5.1.2 --- GARCH --- p.57 / Chapter 5.2 --- Experiments --- p.58 / Chapter 5.2.1 --- Momentum --- p.58 / Chapter 5.2.2 --- GARCH --- p.65 / Chapter 5.3 --- Discussions --- p.72 / Chapter 6 --- Relation between Downside Risk and Asymmetrical Margin Settings --- p.77 / Chapter 6.1 --- Mathematical Derivation --- p.77 / Chapter 6.2 --- Algorithm --- p.81 / Chapter 6.3 --- Experiments --- p.83 / Chapter 6.4 --- Discussions --- p.86 / Chapter 7 --- Conclusion --- p.92 / Chapter A --- Basic Results for Solving SVR --- p.94 / Chapter A.1 --- Dual Theory --- p.94 / Chapter A.2 --- Standard Method to Solve SVR --- p.96 / Bibliography --- p.98
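The margin-variation idea in the contents rests on a generalised ε-insensitive loss in which the upward and downward margin widths may differ, connecting asymmetric margins to downside risk. A minimal sketch of such a loss follows; the margin values are assumed for illustration, and this is not the thesis's SVR training code.

```python
def eps_insensitive(residual, eps_up=0.5, eps_down=1.0):
    """Asymmetric epsilon-insensitive loss: zero inside [-eps_down, eps_up],
    linear outside. eps_up == eps_down recovers the standard symmetric loss."""
    if residual > eps_up:
        return residual - eps_up
    if residual < -eps_down:
        return -eps_down - residual
    return 0.0
```

With a wider down-margin than up-margin, under-predictions are penalised sooner than over-predictions, which is one way an SVR can be tilted toward controlling downside risk in a stock-prediction setting.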
|
9 |
The consolidation of forecasts with regression models. Venter, Daniel Jacobus Lodewyk, January 2014 (has links)
The primary objective of this study was to develop a dashboard for the consolidation of multiple forecasts utilising a range of multiple linear regression models. The term dashboard describes, in a single word, the characteristics of the forecast-consolidation application that was developed to provide the required functionality via a graphical user interface structured as a series of interlinked screens. Microsoft Excel© was used as the platform to develop the dashboard, named ConFoRM (an acronym for Consolidate Forecasts with Regression Models). The major steps of the consolidation process incorporated in ConFoRM are: 1. Input historical data. 2. Select appropriate analysis and holdout samples. 3. Specify the regression models to be considered as candidates for the final model to be used for the consolidation of forecasts. 4. Perform regression analysis and holdout analysis for each of the models specified in step 3. 5. Perform post-holdout testing to assess, on out-of-sample data, the performance of the model with the best holdout validation results. 6. Consolidate forecasts. Two data transformations are available: the removal of growth and time-period effects from the time series, and a translation of the time series obtained by subtracting ȳi, the mean of all the forecasts for data record i, from the variable being predicted and its related forecasts for each data record i. The pre-defined ordinary least squares linear regression models (LRMs) available are: a. a set of k simple LRMs, one for each of the k forecasts; b. a multiple LRM that includes all the forecasts; c. a multiple LRM that includes all the forecasts and as many of the first-order interactions between the input forecasts as allowed by the sample size and the maximum number of predictors provided by the dashboard, with the interactions included in the model being those with the highest individual correlation with the variable being predicted; d. a multiple LRM that includes as many of the forecasts and first-order interactions between the input forecasts as allowed by the sample size and the maximum number of predictors provided by the dashboard, with the forecasts and interactions included in the model being those with the highest individual correlation with the variable being predicted; e. a simple LRM with the predictor variable being the mean of the forecasts; f. a set of simple LRMs with the predictor variable in each case being the weighted mean of the forecasts, with different formulas for the weights. Also available is an ad hoc, user-specified model in terms of the forecasts and the predictor variables generated by the dashboard for the pre-defined models. The regression analysis provides for both forward-entry and backward-removal regression. Weighted least squares (WLS) regression can optionally be performed, with weights based on the age of the forecasts and smaller weights assigned to older forecasts.
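The age-weighted WLS option can be sketched in a few lines. This is a hand-rolled illustration under assumed data (two synthetic input forecasts whose average tracks the actuals), not ConFoRM's Excel implementation; the geometric decay rate for the age-based weights is also an assumption.

```python
import numpy as np

def age_weighted_ls(X, y, age, decay=0.9):
    """Solve min sum_i w_i (y_i - x_i . b)^2 with w_i = decay**age_i,
    by rescaling rows with sqrt(w_i) and using ordinary least squares."""
    w = decay ** np.asarray(age, float)
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return coef

# Toy consolidation problem: actuals are roughly the average of two forecasts.
rng = np.random.default_rng(3)
f1, f2 = rng.normal(100, 5, 60), rng.normal(100, 5, 60)
y = 0.5 * f1 + 0.5 * f2 + rng.normal(0, 0.1, 60)
X = np.column_stack([np.ones(60), f1, f2])
age = np.arange(60)[::-1]          # the oldest record carries the largest age
coef = age_weighted_ls(X, y, age)  # recovers weights near (0, 0.5, 0.5)
```

The same design matrix serves the multiple-LRM case (model b above); the weighting only changes how much each historical record influences the fit.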
|
10 |
Housing demand : an empirical intertemporal model. Schwann, Gregory Michael, January 1987 (has links)
I develop an empirical model of housing demand which is based as closely as possible on a theoretical intertemporal model of consumer demand. In the empirical model, intertemporal behavior by households is incorporated in two ways. First, a household's expected length of occupancy in a dwelling is a parameter in the model; thus, households are able to choose when to move. Second, a household's decision to move and its choice of dwelling are based on the same intertemporal utility function. The parameters of the utility function are estimated using a switching regression model in which the decision to move and the choice of housing quantity are jointly determined. The model has four other features: (1) a characteristics approach to housing demand is taken, (2) the transaction costs of changing dwellings are incorporated in the model, (3) sample data on household mortgages are employed in computing the user cost of owned dwellings, and (4) demographic variables are incorporated systematically into the household utility function.
Rosen's two-step procedure is used to estimate the model. Cragg's technique for estimating regressions in the presence of heteroscedasticity of unknown form is used to estimate the hedonic regressions in step one of the procedure. In the second step, the switching regression model is estimated by maximum likelihood.
A micro data set of 2,513 Canadian households is used in the estimations.
The stage-one hedonic regressions indicate that urban housing markets are not in long-run equilibrium, that the errors of the hedonic regressions are heteroscedastic, and that simple functional forms for hedonic regressions may perform as well as more complex forms. The stage-two estimates establish that a tight link between the theoretical and empirical models of housing demand produces a better model. My results show that conventional static models of housing demand are misspecified. They indicate that households have vastly different planned lengths of dwelling occupancy. They also indicate that housing demand is determined to a great extent by demographic factors. / Arts, Faculty of / Vancouver School of Economics / Graduate
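A stage-one hedonic regression under heteroscedasticity of unknown form can be sketched as follows. This is not Cragg's estimator or the thesis's specification; it is a minimal illustration using White's heteroscedasticity-consistent covariance on simulated dwelling data, with all variable names and numbers assumed.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500
rooms = rng.integers(1, 8, n).astype(float)
area = rng.uniform(40, 200, n)
# Error variance grows with dwelling size: heteroscedastic by construction.
price = 200 + 80 * rooms + 3 * area + rng.normal(0, 0.5 * area, n)

X = np.column_stack([np.ones(n), rooms, area])
XtX_inv = np.linalg.inv(X.T @ X)
beta = XtX_inv @ X.T @ price           # OLS point estimates remain consistent
resid = price - X @ beta
# White's sandwich estimator: (X'X)^-1 X' diag(e_i^2) X (X'X)^-1
meat = X.T @ (X * (resid ** 2)[:, None])
cov_hc = XtX_inv @ meat @ XtX_inv
se_hc = np.sqrt(np.diag(cov_hc))       # heteroscedasticity-consistent std errors
```

The implicit characteristic prices (here, the rooms and area coefficients) are what a stage-two demand model would consume; the robust standard errors guard inference against the error-variance pattern the abstract reports.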
|