421

Macroeconomic forecasting using model averaging

Verra, Christina January 2009
Recently there has been growing interest in forecasting techniques for large data sets, since economists in business and management must deal with ever greater volumes of information. This thesis addresses the problem of forecasting a large data set using different model averaging approaches. In particular, Bayesian and frequentist model averaging methods are considered, including Bayesian model averaging (BMA), information-theoretic model averaging (ITMA) and predictive likelihood model averaging (PLMA). The predictive performance of each scheme is compared with the most promising existing alternatives, namely the benchmark AR model and the equal-weighted model averaging (AV) scheme. An empirical application to inflation forecasting for five countries, using large data sets within the model averaging framework, is presented. The average ARX model, with weights constructed differently under each model averaging scheme, is compared with both the benchmark AR and the AV model. To compare forecast accuracy, several performance indicators are reported: the Root Mean Square Error (RMSE), the Mean Absolute Error (MAE), Theil's Inequality Coefficient (U), the Mean Square Forecast Error (MSFE) and the Relative Mean Square Forecast Error (RMSFE). Next, within the Granger causality framework, the Diebold and Mariano (DM) test and the Clark and McCracken (CM) test are used to assess whether the data-rich models represented by the three model averaging schemes deliver a statistically significant improvement over the benchmark forecasts. Critical values at 5% and at 10% are calculated from a bootstrap approximation of the finite-sample distribution of the DM and CM test statistics. The main finding is that although the information-theoretic model averaging scheme is the more powerful approach, the other two model averaging techniques can be regarded as useful alternatives.
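The accuracy measures and the Diebold–Mariano comparison named in the abstract can be sketched as follows. This is a minimal illustration on invented error series, using the simple sample-variance form of the DM statistic rather than a long-run variance estimator:

```python
import numpy as np

def rmse(actual, forecast):
    # Root Mean Square Error
    return np.sqrt(np.mean((actual - forecast) ** 2))

def mae(actual, forecast):
    # Mean Absolute Error
    return np.mean(np.abs(actual - forecast))

def theil_u(actual, forecast):
    # Theil's inequality coefficient: 0 for a perfect forecast, bounded by 1
    num = np.sqrt(np.mean((actual - forecast) ** 2))
    den = np.sqrt(np.mean(forecast ** 2)) + np.sqrt(np.mean(actual ** 2))
    return num / den

def dm_stat(e1, e2):
    # Diebold-Mariano statistic under squared-error loss; large positive
    # values indicate model 2 (errors e2) is more accurate than model 1
    d = e1 ** 2 - e2 ** 2          # loss differential
    n = len(d)
    return np.mean(d) / np.sqrt(np.var(d, ddof=1) / n)

# invented data: an averaged forecast that tracks the actuals more closely
# than a benchmark does
actual  = np.array([2.1, 1.9, 2.5, 2.2])
f_bench = np.array([2.8, 1.2, 3.1, 1.5])
f_avg   = np.array([2.3, 1.8, 2.7, 2.0])

print(rmse(actual, f_avg), mae(actual, f_avg), theil_u(actual, f_avg))
print(dm_stat(actual - f_bench, actual - f_avg))
```

In practice the DM statistic would be compared with bootstrap critical values, as the thesis does, since asymptotic critical values can be unreliable when the competing models are nested.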
422

The role of the state in economic development : a case study of the GCC countries

Jamhour, Ali January 2012
The aim of this thesis is to assess the role of the state in economic development in the GCC countries. Three aspects of this subject are investigated: an indirect role of the state, analyzed through the effect of financial development on economic growth, and direct roles, examined through the impact of defence spending and of public infrastructure on the development process. First, the role of financial development is analyzed using three alternative causality tests. The results suggest the existence of a long-run relationship between economic development and the state of financial development in most GCC countries, and further suggest that the financial sector can be a leading sector for some of the GCC countries. Second, the impact of defence spending on economic development is examined by employing VAR/ECM models. The emerging results suggest that defence expenditure appears to retard economic development in countries with relatively heavy defence expenditure (Saudi Arabia and the UAE), whilst a positive effect is suggested for GCC members with relatively low defence spending (Bahrain and Oman). Third, for one member of the GCC (Saudi Arabia), the role of public infrastructure in economic development is also analyzed using VAR/ECM. The results indicate that the high public capital expenditure in Saudi Arabia has an insignificant effect on economic development in the long run.
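The causality analysis above rests on tests of this general shape. As an illustration (not the specific alternative tests or VAR/ECM specifications used in the thesis), a bivariate Granger-causality F-test can be sketched on simulated data:

```python
import numpy as np

def granger_f(y, x, p=2):
    """F-statistic for H0: lags of x do not help predict y (Granger
    non-causality). Compares a restricted AR(p) of y with an unrestricted
    regression that adds p lags of x."""
    n = len(y)
    rows = n - p
    Y = y[p:]
    # restricted design matrix: constant plus p lags of y
    Xr = np.column_stack([np.ones(rows)] + [y[p - j:n - j] for j in range(1, p + 1)])
    # unrestricted: additionally p lags of x
    Xu = np.column_stack([Xr] + [x[p - j:n - j] for j in range(1, p + 1)])
    ssr = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    ssr_r, ssr_u = ssr(Xr), ssr(Xu)
    df = rows - Xu.shape[1]
    return ((ssr_r - ssr_u) / p) / (ssr_u / df)

# simulated example in which x clearly Granger-causes y
rng = np.random.default_rng(0)
x = rng.standard_normal(300)
y = np.zeros(300)
for t in range(2, 300):
    y[t] = 0.4 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

print(granger_f(y, x))  # large F: lags of x strongly improve the fit
```

The F-statistic is compared with an F(p, df) critical value; in the simulated example the causal direction yields a far larger statistic than the reverse regression.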
423

Optimal plan design and dynamic asset allocation of defined contribution pension plans : lessons from behavioural finance and non-expected utility theories

Zhang, Yumeng January 2009
The question of the optimal asset allocation strategy for defined contribution (DC) pension plans is addressed. A primary motivation for this study is provided by the recent literature on behavioural finance and intertemporal life-cycle investment theory. In this thesis two alternative utility forms are considered: loss aversion and Epstein-Zin recursive utility. We develop a dynamic-programming-based numerical model with uninsurable stochastic labour income and borrowing constraints. In the loss aversion case, members are assumed to be loss averse with a target replacement ratio at retirement and a series of suitably defined interim targets prior to retirement. We also extend the intertemporal life-cycle saving and investment theory to the dynamic asset allocation problem of DC pension schemes. A new approach to modelling contribution and investment decisions, focused on the member's desired pattern of consumption over the lifetime (based on Epstein-Zin utility preferences), is proposed. The thesis draws on empirical evidence of salary scales and loss aversion parameters from UK households, with labour income progression estimated from the New Earnings Survey and loss aversion parameters estimated on the basis of face-to-face interviews with 966 randomly selected UK residents.
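Loss-averse preferences of the kind described above are commonly specified with a Kahneman–Tversky-style value function, kinked at the target. A minimal sketch, with the textbook parameter values rather than those estimated in the thesis:

```python
def loss_averse_value(wealth, target, curvature=0.88, loss_aversion=2.25):
    """Prospect-theory value of an outcome measured relative to a target.
    Gains are valued concavely; losses are amplified by the loss-aversion
    coefficient, so falling short of the target hurts more than an
    equal-sized surplus helps. Parameter values are illustrative."""
    gap = wealth - target
    if gap >= 0:
        return gap ** curvature
    return -loss_aversion * (-gap) ** curvature

# a shortfall below the target replacement ratio is penalised more
# heavily than an equal surplus is rewarded
print(loss_averse_value(1.1, 1.0))   # small gain
print(loss_averse_value(0.9, 1.0))   # equal-sized loss, larger magnitude
```

The kink at the target is what drives the distinctive asset allocation results: near the target, the member behaves very conservatively to avoid crossing into the loss region.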
424

Non parametric estimation of high-frequency volatility and correlation dynamics

Mattiussi, Vanessa January 2010
This thesis addresses the problem of quantitatively evaluating the temporal dynamics that characterize financial time series. In particular, we perform an accurate analysis of the Fourier estimator, a recently proposed nonparametric methodology for measuring ex-post volatility and cross-volatilities as functions of time when financial assets are observed at different high-frequency levels over the day. The estimator has the peculiar feature of employing the observed data in their original form, thereby exploiting all the available information in the sample. We first show how to considerably improve the numerical performance of the Fourier method, making possible the analysis of large sets of data, as is usually the case with high-frequency series. Secondly, we use Monte Carlo simulation methods to study the behavior of three driving parameters in the estimation procedure when the effects of both irregular sampling and microstructure noise are taken into account. The estimator is shown to be particularly sensitive to one of these quantities, which is in turn used to control the contribution of the above effects. Integrated financial correlation is also analyzed within two distinct comparative studies that involve other multivariate measures. The analysis is then extended to consider the entire evolution of the underlying correlation process. Finally, we propose a new class of nonparametric spot volatility estimators, which is shown to include the Fourier method as a particular case. The full limit theory of the class under infill asymptotics in the pure diffusive setting is derived. Empirical evidence in support of our conclusions is also provided.
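A minimal sketch of a Malliavin–Mancino-style Fourier estimator of integrated variance for a single asset, with time rescaled to [0, 2π]. The number of Fourier modes N is the kind of driving parameter whose tuning the thesis studies; the data here are simulated, and this bare-bones version ignores microstructure-noise corrections:

```python
import numpy as np

def fourier_integrated_variance(times, log_prices, N=50):
    """Estimate integrated variance on [0, 2*pi] from (possibly irregularly
    spaced) observations by averaging squared Fourier coefficients of the
    price increments over modes k = -N..N. Uses the raw increments directly,
    with no prior regularisation of the data."""
    dp = np.diff(log_prices)       # price increments
    t = times[:-1]
    ks = np.arange(-N, N + 1)
    # Fourier coefficients of dp: c_k = (1/2pi) * sum_j e^{-i k t_j} dp_j
    c = (np.exp(-1j * np.outer(ks, t)) @ dp) / (2 * np.pi)
    # each (2pi)^2 |c_k|^2 has expectation equal to the integrated variance,
    # so their average estimates it
    return (2 * np.pi) ** 2 * np.mean(np.abs(c) ** 2)

# sanity check on a simulated Brownian path with unit spot volatility,
# so the true integrated variance over [0, 2*pi] is 2*pi
rng = np.random.default_rng(1)
n = 5000
t = np.linspace(0.0, 2 * np.pi, n + 1)
p = np.concatenate([[0.0], np.cumsum(rng.standard_normal(n) * np.sqrt(t[1]))])
print(fourier_integrated_variance(t, p))  # should be near 2*pi (estimator is noisy)
```

Because the estimator uses all observations in their original form, no synchronisation or interpolation of irregularly sampled data is required, which is the feature the abstract highlights.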
425

Essays on generalised empirical likelihood

Parente, Paulo Miguel Dias Costa January 2007
The aim of this thesis is to investigate Generalised Empirical Likelihood (GEL) and related information-theoretic methods for estimation and inference on the parameters of models that satisfy unconditional and conditional moment restrictions. Three topics in this field are studied. Firstly, the first-order asymptotic theory of the GEL class of estimators for the parameters of non-smooth moment restrictions is analysed, and test statistics in this framework are introduced. It is shown that, in random samples, all the estimators in the GEL class have the same asymptotic distribution as the Generalised Method of Moments (GMM) estimator in this set-up under the same assumptions. The test statistics proposed are particularly useful for inference in quantile regression models, as they do not require estimation of the asymptotic covariance matrix of the estimator. Secondly, definitions of exogeneity in models defined by conditional moment restrictions are introduced, and test statistics for this hypothesis, based on the GMM and GEL estimators, are proposed. These tests rely on the equivalence between a finite number of conditional moment restrictions and a countably infinite number of unconditional restrictions. These definitions are important when the researcher is interested in estimating the parameters of functions that satisfy conditional moment restrictions, as in the case of mean regression. Lastly, encompassing tests are introduced to compare a model defined by conditional moment restrictions with a parametric model. Researchers usually use parametric models as they are easier to apply, even though the assumptions of the semiparametric model may be more reasonable. Test statistics that check whether models defined by conditional moment restrictions can be explained by parametric models are introduced, and test statistics that examine whether the former explain the latter are also proposed.
A by-product of this analysis is the formal derivation of the asymptotic distribution of the conditional empirical likelihood estimator under misspecification.
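For intuition, empirical likelihood in its simplest setting — a single unconditional moment restriction E[X − μ] = 0 — can be sketched as follows. This toy version profiles out the Lagrange multiplier by bisection; it is an illustration of the basic idea, not the GEL machinery of the thesis:

```python
import numpy as np

def el_ratio_stat(x, mu, tol=1e-10):
    """Empirical likelihood ratio statistic for H0: E[X] = mu.
    Solves sum_i d_i / (1 + lam * d_i) = 0 with d_i = x_i - mu for the
    multiplier lam, then returns 2 * sum_i log(1 + lam * d_i), which is
    asymptotically chi-squared(1) under H0."""
    d = x - mu
    if d.max() <= 0 or d.min() >= 0:
        raise ValueError("mu must lie strictly inside the sample range")
    # implied weights 1/(n*(1 + lam*d_i)) stay positive for lam in (lo, hi)
    lo = -1.0 / d.max() + 1e-9
    hi = -1.0 / d.min() - 1e-9
    g = lambda lam: np.sum(d / (1.0 + lam * d))   # decreasing in lam
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    return 2.0 * np.sum(np.log1p(lam * d))

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
print(el_ratio_stat(x, 3.0))  # at the sample mean the statistic is ~0
print(el_ratio_stat(x, 4.0))  # away from it, the statistic is positive
```

The appeal of this construction, which carries over to the GEL tests the thesis studies, is that no explicit estimate of the asymptotic covariance matrix is needed: the statistic is internally studentised.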
426

Essays in time series analysis : modelling stochastic volatility and forecast evaluation

Malik, Sheheryar January 2009
No description available.
427

Competition in the history of economic thought

Dennis, Kenneth G. January 1975
The word, competition, entered economic discourse slowly and naturally, over many decades and even centuries, as a term of common usage. From that point of view, I trace the development of the word, as it underwent various conceptual modifications en route to the more technical concept of perfect competition. The growing divergence between the ordinary meaning and the technical concept of competition represents one of the many gauges of the progress of economics towards becoming a rigorous science, but that divergence has also brought with it a number of problems. Thus, the purpose of my thesis is two-fold: it is both a scholarly and a didactic one. As for scholarship, I set forth a textually accurate account of how the word competition has figured in the early rise and the more mature consolidation of economics as a special discipline, in which the abstract idea of a "perfect" type of market competition was employed by economists as an heuristic fiction, to bring together all the separate elements of what Edgeworth has so nicely termed the Economic Calculus. As for didactics, my thesis also serves as an exercise in identifying several root problems stemming from the concept of competition, both in its common and technical forms. Suggestions are made as to how some of the perplexities of present-day economic theory, based on the notion of perfect competition, can be resolved.
Key Themes:- In Chapter I, four major themes are announced and subjected to semantic analysis. Running throughout the remainder of the history, they refer to the dualities, ambiguities and qualities of vagueness and imprecision which pertain to the common-sense or ordinary notion of competitive conduct, understood as the striving of two or more persons against one another for the same object. Upon careful study, competition is readily seen to be both equilibrating and disequilibrating in its tendencies, comprising both innovative and adaptive patterns of behaviour, and is both freeing and constraining in its effects, entailing as it does not only deliberate and conscious striving (or the exercise of free will) but also the clash of opposing interests, or contention between two or more persons (Themes I and II). As a consequence of the foregoing dualities, the word itself holds out emotive connotations that are both positive and negative (Theme III), giving the word a character of ambivalence which is inherent from its very root meaning. Thus, in its role as a principle of economic thought, competition has elicited widely, and sometimes wildly, varying responses from those who have contemplated its workings and have passed judgement upon its status. Finally, because of its very abstraction or generality of meaning, the common-sense idea of competition is imprecise and open-ended, insofar as its general meaning specifies nothing about (a) the nature of the objectives of competitive pursuit, (b) the setting in which competitive striving takes place, (c) the participants and their grouping into competitive "units," and (d) the strategies and patterns of conduct followed - Theme IV.
Critical Episodes:- The historical narrative proceeds, for the most part, in a chronological sequence, and is organized around a series of decisive moments, when crucial turning-points are reached and passed, separating distinct traditions of thought, or else marking the subtle transition from one mode of reasoning to another. After a brief preliminary survey of the medieval and early modern scholastic literature, the main historical narrative begins essentially with an account of the "classical" mercantilist literature of the first half of the 17th century, with its emphasis upon national rivalries and the so-called balance-of-trade doctrine. With these preparatory steps taken, the remainder of the historical content of the thesis can be summarized in terms of the following critical episodes:- 1. 1670s-1700. The beginnings of the breakdown of the classical mercantilist doctrine are accompanied by a subtle shift away from the emphasis upon national rivalry towards a more individualized and sectionalized understanding of competitive interdependence, in which the negative overtones of (national) conflict in market exchange are gradually replaced by a more positive attitude towards competition as a stimulus to economic progress and efficiency. 2. 1750s-1776. The final breakdown of the mercantile logic, as the dominant influence over economic thought, occurs with the sudden and dramatic rise and momentary ascendancy of the physiocratic school, a decisive confrontation taking place between Forbonnais, latter-day exponent of "liberal mercantilism," and Quesnay, Baudeau and others of the new school (1766-68). As a principle of harmony in the physiocratic doctrine, market competition is depicted as an exchange of equivalents between individuals, and hence a limited but beneficial force in economic affairs. Adam Smith, sharing the physiocrats' liberal attitude and individualized concept of competition, adapts and improves their economic calculus, giving to competition a more positive and directive role, by showing how it regulates and thus facilitates market exchange, which in turn allows for a more productive employment of resources. 3. 1815-1848. Building upon Smith's foundations of economic liberalism, the classical economists make important conceptual and analytical advances in regard to the economic calculus, such as in enunciating the law of diminishing returns, but do rather little to alter Smith's treatment of competition. A crisis in classical liberalism is soon reached (c. 1825), in facing the newly emergent tradition of socialist thought, whereby there ensues a fierce clash of opinion concerning such things as the freeing versus constraining character of competition. As a result, the behavioural process of competition tends to become associated with a particular set of economic institutions (property, contract, markets, and market exchange). The debate points to the need to distinguish between primary and secondary income distribution (paralleling that of "class" and "individual") and the need to clarify the nature of competitive grouping. 4. 1866-71. After much delay, there occurs a transition from the classical modes of verbal reasoning towards the neoclassical styles and methods of mathematical reasoning, a transition which was hastened by Thornton's sharp attack upon competition as a law-like principle (1866) and by Jevons's response to the growing need to improve the classical value theory. However, the substance of classical doctrine is retained in neoclassical theory, as is shown by the instrumental role played by "perfect" competition, as an heuristic fiction used in the building of the neoclassical calculus. This source of continuity is qualified, though, to the extent that the classical principle of competition is supported less and less by a direct intuitive appeal to empirical evidence and more and more by the resort to abstraction and the logical rigour of mathematical theory. 5. 1889-91. After a decade or more of rapid progress and genuine improvement and refinement of the mathematical technique of analysis, neoclassical theory reaches a state of crisis, brief in duration but far-reaching in its implications, when Edgeworth's static approach to equilibrium analysis departs from the Walrasian dynamics of tatonnement, faintly signalling the onset of a new outlook as to the nature and purpose of abstract economics.
428

The educational and labour market expectations of adolescents and young adults

Jerrim, John January 2011
Understanding why some suitably qualified young adults go on to enter higher education while others do not has been the subject of extensive research by social scientists from a range of disciplines. Economists suggest that young adults' willingness to invest in a tertiary qualification depends upon what they believe the costs and benefits of this investment will be. Sociologists, on the other hand, stress that an early expectation of completing university is a key driver of later participation in higher education. Children's subjective beliefs about the future (their "expectations") are a consistent theme within these distinctively different approaches. Researchers from both disciplines might argue that children's low or mistaken expectations (of future income, financial returns, or their ability to complete university) might lead them into making inappropriate educational choices. For instance, young adults who do not have a proper understanding of the graduate labour market may mistakenly invest (or not invest) in tertiary education. Alternatively, some academically talented children may not enter university because they do not see it as a realistic possibility, or feel that it is 'not for the likes of them'. I take an interdisciplinary approach in this thesis to tackle both of these issues. Specifically, I investigate whether young adults have realistic expectations about their future in the labour market, and whether disadvantaged children who score high marks on a maths assessment at age 15 believe they can complete university.
429

Aggregate and disaggregated fluctuations

Mennuni, Alessandro January 2011
In the usual version of the neoclassical growth model used to identify neutral shocks (N-shocks) and investment shocks (I-shocks), a linear transformation frontier between consumption and investment goods is assumed. This thesis extends the original framework, allowing for curvature in the transformation frontier, and studies how this affects the relative price of investment goods and hence the identification of investment shocks. A concave frontier allows a substantial improvement in the prediction of the saving rate. Furthermore, a concave frontier induces short-run aggregate effects of relative demand shifts, thereby fostering the propagation of the shocks under consideration, which overall account for 86% of aggregate fluctuations. When I identify shocks with curvature, the N-shock appears to be stationary while the I-shock is a unit root. This leads the N-shock to play a major role: 91% of the fluctuations explained are due to the N-shock.
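The mechanism — curvature in the transformation frontier makes the relative price of investment respond to demand shifts — can be illustrated with a simple CES-style frontier. The functional form and parameters here are illustrative, not the paper's calibration:

```python
def investment_relative_price(c, i, rho):
    """Relative price of investment goods when output is split along the
    transformation frontier (c**rho + i**rho)**(1/rho) = y, with rho >= 1.
    Competitive pricing equates the relative price to the marginal rate of
    transformation, which for this frontier is (i/c)**(rho - 1)."""
    return (i / c) ** (rho - 1)

# linear frontier (rho = 1): relative price fixed at 1, so demand shifts
# between consumption and investment have no price effect
print(investment_relative_price(2.0, 1.0, 1.0))   # 1.0
# concave frontier (rho = 2): shifting demand toward investment goods
# raises their relative price, and vice versa
print(investment_relative_price(2.0, 1.0, 2.0))   # 0.5
print(investment_relative_price(1.0, 2.0, 2.0))   # 2.0
```

With a linear frontier the relative price of investment is constant, so all its observed movement must be attributed to the I-shock; with curvature, part of that movement reflects demand shifts, which changes the identification.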
430

Some aspects of empirical likelihood

Wang, Xing January 2008
Chapter 1 is a non-technical introduction to the thesis. In Chapter 2, Basics of Large Deviation Theory, we illustrate the basic idea of large deviation theory and briefly review the history of its development. As a preparation, some of the important theorems which we will employ in the following chapters are also introduced. In Chapter 3, Asymptotic Optimality of Empirical Likelihood Tests with Weakly Dependent Data, we extend the result of Kitamura (2001) to stationary mixing data. The key step in proving large deviation optimality is that the empirical measure of independently and identically distributed data obeys the large deviation principle (LDP) with rate function equal to the relative entropy, but in general the large deviation behaviour of the empirical measure of dependent data is complicated. In this chapter we add an S-mixing condition to the stationary process and show that the rate function of the LDP of an S-mixing process is indeed equal to the relative entropy; asymptotic optimality then follows from the large deviation inequality. In Chapter 4, Large Deviations of Empirical Likelihood with Nuisance Parameters, we discuss the asymptotic efficiency of empirical likelihood in the presence of nuisance parameters combined with augmented moment conditions. We show that in the presence of nuisance parameters, the asymptotic efficiency of the empirical likelihood estimator of the parameter of interest will increase by adding more moment conditions, in the sense of the positive semidefiniteness of the difference of information matrices. As a by-product, we point out a necessary condition for the asymptotic efficiency to be increased when more moment conditions are added. We also derive asymptotic lower bounds on the minimax risk functions for the estimator of the parameter of interest, and we show that the empirical likelihood estimator can achieve this bound.
In Chapter 5, Empirical Likelihood Estimation of Auction Models via Simulated Moment Conditions, we apply empirical likelihood estimation to the simplest first-price sealed bid auction model with independent private values. Through estimation of the parameter in the distribution function of bidders' private values, we consider a potential problem in EL inference when the moment condition is not in an explicit form and is hard to compute, or is even not continuous in the parameter of interest. We deal with this issue following the method of simulated moments through importance sampling. We demonstrate the convergence of the empirical likelihood estimator from the simulated moment condition, and find that the asymptotic variance is larger than usual, being inflated by the simulation.
