121 |
The use of attribute analysis to improve volume forecasting / Levine, Gregory David, 12 September 2012 (has links)
M.Comm. / The objectives of this research can be summarised as follows: to integrate the customer research technique of attribute analysis with the volume planning process; to determine whether attribute analysis is a useful and reliable tool for determining purchase behaviour in the small sedan segment of the motor industry; to determine a customer's perception of total product value through attribute analysis; to analyse the influence of price in the purchase decision; to determine whether a market share theorem that weights the perceived value of a vehicle relative to its competitors' perceived value is a reliable predictor of market penetration; and to identify the determinant attributes in the decision-making process of purchasing a small sedan. The research component of this dissertation is limited to the small sedan passenger segment of the South African car market. While it is recognised that the small sedan segment is only one sub-sector of the vehicle market, the study should provide a fair reflection of how the approach could be applied in other sectors.
|
122 |
Analysis of real estate bubbles around the world (Análise de bolhas imobiliárias ao redor do mundo) / Barbosa, Guilherme Valentim, 10 August 2018 (has links)
This study aims to empirically analyze the existence of real estate bubbles around the world and to identify when these episodes of explosive behavior in real estate prices occurred. The results were obtained through a recursive unit root testing methodology, the SADF and GSADF tests proposed by Phillips and co-authors. Real estate price data were collected for 28 countries, together with their respective consumer price indexes. The results indicate the existence of explosive behavior in about 90% of the analyzed series.
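The SADF statistic described above is, at its core, the supremum of ADF t-statistics over forward-expanding windows. A minimal numpy sketch of that idea follows; it omits lag augmentation and the simulated critical values that the full Phillips et al. procedure includes, and all series are simulated, not the study's data:

```python
import numpy as np

def adf_tstat(y):
    """t-statistic on rho in dy_t = a + rho * y_{t-1} + e_t (no lag terms)."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    sigma2 = resid @ resid / (len(dy) - 2)
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

def sadf(y, r0=0.3):
    """Sup ADF: the largest ADF t-stat over forward-expanding windows y[:k]."""
    k0 = int(len(y) * r0)
    return max(adf_tstat(y[:k]) for k in range(k0, len(y) + 1))

rng = np.random.default_rng(0)
random_walk = np.cumsum(rng.normal(size=200))   # no bubble
explosive = np.empty(200)                       # mildly explosive AR(1)
explosive[0] = 1.0
for t in range(1, 200):
    explosive[t] = 1.03 * explosive[t - 1] + rng.normal()

s_rw, s_ex = sadf(random_walk), sadf(explosive)  # s_ex should far exceed s_rw
```

The GSADF variant additionally varies the window's start date, which improves detection of multiple bubble episodes.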
|
123 |
Do Predictions of Professional Business Economists Conform to the Rational Expectations Hypothesis?: Tests on a Set of Survey Data / Dabbs, Russell Edward, 08 1900 (has links)
A set of forecast survey data is analyzed in this paper for properties consistent with the Rational Expectations Hypothesis. Standard statistical tests for "rational expectations" are employed utilizing consensus forecasts generated by an interest rate newsletter. Four selected variables (Fed Funds rate, M1 rate of growth, rate of change in CPI, and real GNP growth rate) are analyzed over multiple time horizons. Results tend to reject "rational expectations" for most variables and time horizons. Forecasts are more likely to meet "rationality" criteria the shorter the forecast horizon, with the notable exception of forecasts of real GNP growth.
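A standard "rationality" (unbiasedness) check of this kind regresses realized values on forecasts and asks whether the intercept is 0 and the slope 1 (a Mincer-Zarnowitz regression). A hedged numpy sketch on simulated data; the newsletter's actual forecasts are not reproduced here:

```python
import numpy as np

def mincer_zarnowitz(actual, forecast):
    """OLS of realized values on forecasts: actual = a + b*forecast + e.
    Unbiasedness requires a = 0 and b = 1."""
    X = np.column_stack([np.ones(len(forecast)), forecast])
    coef, *_ = np.linalg.lstsq(X, actual, rcond=None)
    resid = actual - X @ coef
    sigma2 = resid @ resid / (len(actual) - 2)
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
    return coef, se

# Simulated consensus forecasts of, say, a Fed Funds rate path (illustrative)
rng = np.random.default_rng(1)
forecast = rng.normal(5.0, 2.0, 120)
actual = forecast + rng.normal(0, 0.3, 120)   # unbiased: error orthogonal to forecast

coef, se = mincer_zarnowitz(actual, forecast)  # expect coef close to (0, 1)
```

A joint Wald or F-test of both restrictions, rather than coefficient-by-coefficient t-tests, is the usual next step.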
|
124 |
Predicting Failure in the Savings and Loan Industry: a Comparison of RAP and GAAP Accounting / Kenny, Sara York, 12 1900 (has links)
The financial crisis facing the United States savings and loan industry has been steadily escalating over the last decade. During this time, accounting treatments concerning various thrift institution transactions have also attracted a great deal of attention. The specialized accounting treatments used in the thrift industry, known as regulatory accounting practices (RAP), have been blamed as one of the culprits hindering the regulators' ability to detect serious financial problems within many institutions. Accordingly, RAP was phased out, and all federally insured savings and loan associations began preparing their financial statements in accordance with generally accepted accounting principles (GAAP) as of January 1, 1989. The purpose of this dissertation is to compare the relative predictive values of the two historical cost based accounting conventions (RAP and GAAP) available to the savings and loan industry during the 1980s. For purposes of this dissertation, predictive value is defined as the usefulness in assessing future financial health and viability. The sample consisted of all the institutions reporting to the Federal Home Loan Bank of Dallas between 1984 and 1989. Year-end thrift financial report data, obtained from Sheshunoff Information Services, Inc. (Austin, Texas), were used to calculate several financial ratios. The Federal Home Loan Bank of Dallas provided a comprehensive listing of all institutions that failed between January 1, 1985 and March 31, 1989. The null hypothesis tested in this study was: no significant differences existed between the predictive values of RAP and GAAP financial statements. Using a dichotomous dependent variable (failed/not failed) and independent variables from prior research, several logistic models were developed to test the null hypothesis. All models developed failed to reject the null hypothesis.
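A failure-prediction model of the kind described (dichotomous failed/not-failed outcome, financial-ratio regressors) can be sketched as a logistic regression. The ratios, coefficients, and data below are illustrative inventions, not the dissertation's variables:

```python
import numpy as np

def fit_logit(X, y, lr=0.1, steps=3000):
    """Logistic regression fitted by gradient ascent on the log-likelihood."""
    X = np.column_stack([np.ones(len(X)), X])      # intercept column
    b = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ b))
        b += lr * X.T @ (y - p) / len(y)
    return b

def predict(b, X):
    X = np.column_stack([np.ones(len(X)), X])
    return (1.0 / (1.0 + np.exp(-X @ b)) > 0.5).astype(int)

# Invented financial ratios for illustration only
rng = np.random.default_rng(2)
n = 400
capital = rng.normal(0.06, 0.02, n)     # capital / total assets
bad_loans = rng.normal(0.03, 0.015, n)  # nonperforming loans / total assets
true_logit = -1.5 - 80.0 * (capital - 0.06) + 120.0 * (bad_loans - 0.03)
failed = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(int)

X = np.column_stack([capital, bad_loans])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize for stable fitting
b = fit_logit(Xs, failed)
accuracy = (predict(b, Xs) == failed).mean()
# b[1] < 0: more capital lowers failure odds; b[2] > 0: bad loans raise them
```

Comparing RAP-based against GAAP-based ratios would then amount to fitting the same specification on each accounting convention's ratios and comparing classification performance.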
|
125 |
An analysis of economic complexity and selected macroeconomic indicators in selected SSA and BRICS countries: panel data analysis / Molele, Sehludi Brian, January 2022 (has links)
Thesis (Ph.D. (Economics)) -- University of Limpopo, 2022 / This study investigated the relationship between economic complexity and three macroeconomic variables in a comparative setting between selected Sub-Saharan African (SSA) and BRICS countries. Economic complexity as a development index reveals how sophisticated a country is, as shown by its export structure through the Product Complexity Index (PCI) and Economic Complexity Index (ECI). The three macroeconomic variables are gross domestic product per capita (GDP per capita), the current account and fixed investment (gross fixed capital formation) for the period 1994 to 2018. The first three study objectives were to investigate whether there exists a short- and long-run relationship through a Panel Autoregressive Distributed Lag (PARDL) model. The fourth objective was to test for causality through standard Granger causality, and the fifth to forecast the macroeconomic variables utilising the complementary Impulse Response Function (IRF) and variance decomposition techniques. The last two objectives were to draw a comparative analysis from the findings, and to relate the product complexities to the economic landscape in the selected SSA and BRICS countries. Reporting on the ECI-GDP per capita nexus, the PARDL estimates revealed a positive and significant association between ECI and GDP per capita in both the selected SSA and BRICS countries in the long run. There was no Granger causal effect between ECI and GDP per capita for either set of countries. The concern was in relation to forecasting GDP per capita due to a shock in ECI. The selected SSA GDP per capita response to a shock in ECI was neutral when adopting the IRF technique, and the variance decomposition also revealed small estimates in both the short and long run, below 1%.
In the BRICS economies, there was a meaningful positive reaction to a shock in ECI when deploying the IRF technique, while the variance decomposition showed a 3% response in the long run.
On the current account-ECI relationship, the PARDL estimates showed a positive and significant long-run impact of ECI on the current account in both groups, while short-run results were insignificant. Granger causality could not detect any causal effect between ECI and the current account in the selected SSA countries, while in the BRICS countries there was a unidirectional causal effect from ECI to the current account. When forecasting the current account, the selected SSA countries reacted negatively to a shock in
ECI, as seen through the IRF, and the variance decomposition also revealed a small reaction in any period. In the BRICS case, the current account's response to a shock in ECI was positive and explosive when applying the IRF technique, and the variance decomposition revealed that a higher share of the change in the current account was explained by a shock in ECI. On the ECI-fixed investment relationship, the PARDL estimates showed a long-run positive and significant effect between ECI and fixed investment in both groups. However, the Granger causality results revealed no causality in the selected SSA countries, while in the BRICS countries there was a unidirectional causal effect from ECI to fixed investment. The IRF technique revealed a negative fixed investment reaction to a shock in ECI, and the variance decomposition results revealed a small reaction in fixed investment in the selected SSA countries. In the BRICS case, there was a positive and explosive fixed investment response emanating from a shock in ECI, and the variance decomposition showed that fixed investment in BRICS was explained by innovative shocks in ECI in the long run.
On the last two objectives, the selected SSA countries are comparatively disadvantaged, as they are concentrated in negative ECI as seen in the descriptive statistics, reflecting that they are still much less developed and less industrialised than the BRICS nations. The SSA region needs to learn from the leading BRICS countries by creating a conducive environment for the development of innovation that improves the domestic value chain and produces knowledge-based products for the export market. The selected SSA countries should form part of economic integrations with more developed countries that offer mutual benefits, such as South Africa, to fast-track the development of their states. There is a need to modernise agriculture and the agro-industries, and the region should harness the full potential of its agricultural sector. This would create a larger global market share and perhaps improve the current account outlook through trade in more efficient agro-processed products. Africa needs to scale up investment on many fronts, from government to private investment, to improve infrastructure, more so given the scale of the continent's needs.
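The Granger-causality step used throughout this abstract boils down to an F-test of whether lags of one series improve a regression of another on its own lags. A numpy-only sketch on simulated data; the variable names are illustrative stand-ins, not the study's series:

```python
import numpy as np

def granger_f(y, x, lags=2):
    """F-statistic: do lags of x help predict y beyond y's own lags?"""
    n = len(y)
    Y = y[lags:]
    own = np.column_stack([np.ones(n - lags)] +
                          [y[lags - j:n - j] for j in range(1, lags + 1)])
    full = np.column_stack([own] +
                           [x[lags - j:n - j] for j in range(1, lags + 1)])
    ssr = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    ssr_r, ssr_u = ssr(own), ssr(full)
    dof = n - lags - full.shape[1]
    return ((ssr_r - ssr_u) / lags) / (ssr_u / dof)

# Simulated example: x leads y by one period
rng = np.random.default_rng(3)
n = 300
x = rng.normal(size=n)          # e.g. an ECI-like driver (illustrative)
y = np.zeros(n)                 # e.g. a GDP-growth-like response (illustrative)
for t in range(1, n):
    y[t] = 0.3 * y[t - 1] + 0.8 * x[t - 1] + rng.normal()

f_forward = granger_f(y, x)     # should be large: x Granger-causes y
f_reverse = granger_f(x, y)     # should be small: y does not cause x
```

The panel setting of the study additionally pools this logic across countries, but the single-series F-test is the building block.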
|
126 |
Forecasting annual tax revenue of the South African taxes using time series Holt-Winters and ARIMA/SARIMA Models / Makananisa, Mangalani P., 10 1900 (has links)
This study uses aspects of time series methodology to model and forecast major taxes such as Personal Income Tax (PIT), Corporate Income Tax (CIT), Value Added Tax (VAT) and Total Tax Revenue (TTAXR) in the South African Revenue Service (SARS).
The monthly data used for modeling tax revenues of the major taxes were drawn from January 1995 to March 2010 (in-sample data) for PIT, VAT and TTAXR. Due to higher volatility and emerging negative values, the CIT monthly data were converted to quarterly data from the first quarter of 1995 to the first quarter of 2010. The competing ARIMA/SARIMA and Holt-Winters models were derived, and the resulting model of this study was used to forecast PIT, CIT, VAT and TTAXR for SARS fiscal years 2010/11, 2011/12 and 2012/13. The results show that both the SARIMA and Holt-Winters models perform well in modeling and forecasting PIT and VAT; however, the Holt-Winters model outperformed the SARIMA model in modeling and forecasting the more volatile CIT and TTAXR. It is recommended that these methods be used in forecasting future payments, as they forecast tax revenues precisely, with minimal errors and fewer model revisions being necessary. / Statistics / M.Sc. (Statistics)
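The additive Holt-Winters recursion referred to above maintains level, trend and seasonal components with one smoothing update per observation. A minimal numpy sketch on a simulated monthly revenue-like series; the smoothing parameters and data are illustrative, not those estimated in the study:

```python
import numpy as np

def holt_winters_additive(y, m, alpha=0.3, beta=0.05, gamma=0.2, h=12):
    """Additive Holt-Winters: recursive level/trend/seasonal updates,
    then an h-step-ahead forecast. m is the seasonal period."""
    level = y[:m].mean()
    trend = (y[m:2 * m].mean() - y[:m].mean()) / m
    season = list(y[:m] - level)
    for t in range(m, len(y)):
        prev_level = level
        level = alpha * (y[t] - season[t - m]) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season.append(gamma * (y[t] - level) + (1 - gamma) * season[t - m])
    n = len(y)
    return np.array([level + (k + 1) * trend + season[n - m + k % m]
                     for k in range(h)])

# Simulated monthly series: linear trend + annual seasonality + noise
rng = np.random.default_rng(4)
t = np.arange(180)
y = 100 + 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, 180)

fc = holt_winters_additive(y, m=12, h=12)          # 12-month-ahead forecast
future = np.arange(180, 192)
truth = 100 + 0.5 * future + 10 * np.sin(2 * np.pi * future / 12)
mae = np.abs(fc - truth).mean()                    # should be small here
```

A SARIMA comparison of the kind the study runs would fit a seasonal ARIMA to the same in-sample window and compare out-of-sample errors such as this MAE.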
|
128 |
Forecasts of imports by the developed capitalist countries from the underdeveloped world (Prévisions des importations des pays développés capitalistes en provenance du monde sous-développé) / Kestens, Paul, January 1968 (has links)
Doctorate in social, political and economic sciences / Unpublished
|
129 |
Essays on the economics of risk and uncertainty / Berger, Loïc, 22 June 2012 (has links)
In the first chapter of this thesis, I use the smooth ambiguity model developed by Klibanoff, Marinacci, and Mukerji (2005) to define the concepts of ambiguity and uncertainty premia in a way analogous to what Pratt (1964) did in the risk theory literature. I show that these concepts may be useful to quantify the effect ambiguity has on the welfare of economic agents. I also define several other concepts, such as the unambiguous probability equivalent and the ambiguous utility premium, provide local approximations of these different premia, and show the link that exists between them when comparing different degrees of ambiguity aversion, not only in the small but also in the large.

In the second chapter, I analyze the effect of ambiguity on self-insurance and self-protection, tools used to deal with the uncertainty of facing a monetary loss when market insurance is not available (in the self-insurance model, the decision maker can exert effort to reduce the size of the loss occurring in the bad state of the world, while in the self-protection, or prevention, model the effort reduces the probability of being in the bad state).

In the context of a two-period model, I first examine the links between risk aversion, prudence and self-insurance/self-protection activities under risk. Contrary to the results obtained in the static one-period model, I show that the impacts of prudence and of risk aversion go in the same direction and generate a higher level of prevention in the more usual situations. I also show that the results concerning self-insurance in a single-period framework may be easily extended to a two-period context. I then consider two-period self-insurance and self-protection models in the presence of ambiguity and analyze the effect of ambiguity aversion. I show that in most common situations, ambiguity prudence is a sufficient condition to observe an increase in the level of effort.
I propose an interpretation of the model in the context of climate change, so that self-insurance and self-protection are seen respectively as adaptation and mitigation efforts a policy-maker should provide to deal with an uncertain catastrophic event, and I interpret the results obtained as an expression of the Precautionary Principle.

In the third chapter, I introduce the economic theory developed to deal with ambiguity into the context of medical decision-making. I show that, under diagnostic uncertainty, an increase in ambiguity aversion always leads a physician whose goal is to act in the best interest of his patient to choose a higher level of treatment. In the context of a dichotomous choice (treatment versus no treatment), this result implies that taking into account the attitude agents generally manifest towards ambiguity may induce a physician to change his decision by opting for treatment more often. I further show that under therapeutic uncertainty the opposite happens, i.e. an ambiguity-averse physician may eventually choose not to treat a patient who would have been treated under ambiguity neutrality. / Doctorate in Economics and Management Sciences / Unpublished
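The ambiguity premium of the first chapter can be illustrated numerically: in the smooth ambiguity model, a concave second-order function over model-by-model expected utilities pushes the certainty equivalent below its ambiguity-neutral counterpart. A toy sketch with invented parameters (log utility, two candidate loss probabilities, an exponential ambiguity attitude), not the chapter's calibration:

```python
import numpy as np

# Toy smooth-ambiguity setup: wealth 100, possible loss 50.
# Two candidate models disagree about the loss probability (model ambiguity).
w, loss = 100.0, 50.0
p_models = np.array([0.1, 0.4])           # ambiguous loss probabilities
u = np.log                                # risk aversion (log utility)
phi = lambda x: -np.exp(-5.0 * x)         # concave phi: ambiguity aversion
phi_inv = lambda z: -np.log(-z) / 5.0

# Expected utility of the gamble under each candidate model
eu = p_models * u(w - loss) + (1 - p_models) * u(w)

ce_neutral = np.exp(eu.mean())                 # ambiguity-neutral certainty equivalent
ce_averse = np.exp(phi_inv(phi(eu).mean()))    # smooth-ambiguity certainty equivalent
ambiguity_premium = ce_neutral - ce_averse     # strictly positive since phi is concave
```

Because the candidate models yield different expected utilities and phi is strictly concave, the ambiguity-averse certainty equivalent is strictly below the neutral one, which is the wedge the chapter's premia formalize.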
|
130 |
Essays in dynamic macroeconometrics / Bańbura, Marta, 26 June 2009 (has links)
The thesis contains four essays covering topics in the field of macroeconomic forecasting.

The first two chapters consider factor models in the context of real-time forecasting with many indicators. Using a large number of predictors offers an opportunity to exploit a rich information set and is also considered to be a more robust approach in the presence of instabilities. On the other hand, it poses the challenge of how to extract the relevant information in a parsimonious way. Recent research shows that factor models provide an answer to this problem. The fundamental assumption underlying those models is that most of the co-movement of the variables in a given dataset can be summarized by only a few latent variables, the factors. This assumption seems to be warranted in the case of macroeconomic and financial data. Important theoretical foundations for large factor models were laid by Forni, Hallin, Lippi and Reichlin (2000) and Stock and Watson (2002). Since then, different versions of factor models have been applied for forecasting, structural analysis or the construction of economic activity indicators. Recently, Giannone, Reichlin and Small (2008) have used a factor model to produce projections of U.S. GDP in the presence of a real-time data flow. They propose a framework that can cope with large datasets characterised by staggered and nonsynchronous data releases (sometimes referred to as a "ragged edge"). This is relevant as, in practice, important indicators like GDP are released with a substantial delay and, in the meantime, more timely variables can be used to assess the current state of the economy.

The first chapter of the thesis, entitled "A look into the factor model black box: publication lags and the role of hard and soft data in forecasting GDP", is based on joint work with Gerhard Rünstler and applies the framework of Giannone, Reichlin and Small (2008) to the case of the euro area.
In particular, we are interested in the role of "soft" and "hard" data in the GDP forecast and how it is related to their timeliness. The soft data include surveys and financial indicators and reflect market expectations; they are usually promptly available. In contrast, the hard indicators on real activity directly measure certain components of GDP (e.g. industrial production) and are published with a significant delay. We propose several measures in order to assess the role of individual series, or groups of series, in the forecast while taking into account their respective publication lags. We find that surveys and financial data contain important information beyond the monthly real activity measures for the GDP forecasts, once their timeliness is properly accounted for.

The second chapter, entitled "Maximum likelihood estimation of large factor model on datasets with arbitrary pattern of missing data", is based on joint work with Michele Modugno. It proposes a methodology for the estimation of factor models on large cross-sections with a general pattern of missing data. In contrast to Giannone, Reichlin and Small (2008), we can handle datasets that are not only characterised by a "ragged edge", but can include e.g. mixed-frequency or short-history indicators. The latter is particularly relevant for the euro area or other young economies, for which many series have been compiled only recently. We adopt the maximum likelihood approach which, apart from its flexibility with regard to the pattern of missing data, is also more efficient and allows imposing restrictions on the parameters. Applied to small factor models by e.g. Geweke (1977), Sargent and Sims (1977) and Watson and Engle (1983), it has been shown by Doz, Giannone and Reichlin (2006) to be consistent, robust and computationally feasible also in the case of large cross-sections.
To circumvent the computational complexity of a direct likelihood maximisation in the case of a large cross-section, Doz, Giannone and Reichlin (2006) propose to use the iterative Expectation-Maximisation (EM) algorithm (used for the small model by Watson and Engle, 1983). Our contribution is to modify the EM steps for the case of missing data and to show how to augment the model in order to account for the serial correlation of the idiosyncratic component. In addition, we derive the link between the unexpected part of a data release and the forecast revision, and illustrate how this can be used to understand the sources of the latter in the case of simultaneous releases. We use this methodology for short-term forecasting and backdating of euro area GDP on the basis of a large panel of monthly and quarterly data. In particular, we are able to examine the effect on the forecast of quarterly variables and short-history monthly series like the Purchasing Managers' surveys.

The third chapter, entitled "Large Bayesian VARs", is based on joint work with Domenico Giannone and Lucrezia Reichlin. It proposes an alternative approach to factor models for dealing with the curse of dimensionality, namely Bayesian shrinkage. We study Vector Autoregressions (VARs), which have the advantage over factor models that they allow structural analysis in a natural way. We consider systems including more than 100 variables; this is the first application in the literature to estimate a VAR of this size. Apart from the forecast considerations, as argued above, the size of the information set can also be relevant for structural analysis, see e.g. Bernanke, Boivin and Eliasz (2005), Giannone and Reichlin (2006) or Christiano, Eichenbaum and Evans (1999) for a discussion. In addition, many problems may require the study of the dynamics of many variables: many countries, sectors or regions.
While we use standard priors as proposed by Litterman (1986), an important novelty of the work is that we set the overall tightness of the prior in relation to the model size. In this we follow the recommendation of De Mol, Giannone and Reichlin (2008), who study the case of Bayesian regressions. They show that with increasing model size one should shrink more to avoid overfitting, but when data are collinear one is still able to extract the relevant sample information. We apply this principle in the case of VARs. We compare the large model with smaller systems in terms of forecasting performance and structural analysis of the effect of a monetary policy shock. The results show that a standard Bayesian VAR model is an appropriate tool for large panels of data once the degree of shrinkage is set in relation to the model size.

The fourth chapter, entitled "Forecasting euro area inflation with wavelets: extracting information from real activity and money at different scales", proposes a framework for exploiting relationships between variables at different frequency bands in the context of forecasting. This work is motivated by the ongoing debate on whether money provides a reliable signal for future price developments. The empirical evidence on the leading role of money for inflation in an out-of-sample forecast framework is not very strong, see e.g. Lenza (2006) or Fisher, Lenza, Pill and Reichlin (2008). At the same time, e.g. Gerlach (2003) or Assenmacher-Wesche and Gerlach (2007, 2008) argue that money and output could affect prices at different frequencies; however, their analysis is performed in-sample. In this chapter, it is investigated empirically which frequency bands, and for which variables, are the most relevant for the out-of-sample forecast of inflation when information from prices, money and real activity is considered. To extract different frequency components from a series, a wavelet transform is applied.
It provides a simple and intuitive framework for band-pass filtering and allows a decomposition of series into different frequency bands. Its application to multivariate out-of-sample forecasting is novel in the literature. The results indicate that, indeed, different scales of money, prices and GDP can be relevant for the inflation forecast. / Doctorate in Economics and Management Sciences / Unpublished
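The core factor-model assumption of the first two chapters, that a few latent factors summarize the co-movement of a large panel, can be illustrated with the principal-components estimator in the spirit of Stock and Watson. A simulated one-factor sketch; the maximum-likelihood/EM machinery of the second chapter is not reproduced here:

```python
import numpy as np

# Simulate a panel of N indicators driven by one persistent common factor
rng = np.random.default_rng(5)
T, N = 240, 100
factor = np.zeros(T)
for t in range(1, T):
    factor[t] = 0.8 * factor[t - 1] + rng.normal()   # AR(1) common factor
loadings = rng.normal(1.0, 0.3, N)
panel = np.outer(factor, loadings) + rng.normal(0.0, 1.0, (T, N))

# Principal-components estimate of the factor from the standardized panel
Z = (panel - panel.mean(axis=0)) / panel.std(axis=0)
eigval, eigvec = np.linalg.eigh(Z.T @ Z / T)
f_hat = Z @ eigvec[:, -1]      # first PC; identified only up to sign and scale

corr = np.corrcoef(f_hat, factor)[0, 1]   # |corr| should be close to 1
```

With a realistic "ragged edge" or mixed frequencies, the missing entries are what motivates the EM-based likelihood approach the second chapter develops.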
|