261

A study of the social and economic thought of J.A. Hobson

Lee, Alan John Frank January 1970 (has links)
No description available.
262

The development of American institutional economics

Rutherford, Malcolm January 1979 (has links)
Institutional economics is a particularly ill-defined concept, and a great deal of disagreement surrounds its meaning. Both the nature and development of institutional economics have been the subject of dispute for some sixty years; even those who claim to be institutionalists do not always agree on these issues. This thesis is an examination of the development and nature of American institutionalism. It proceeds through a detailed study of the intellectual currents in nineteenth-century America which gave rise to the movement, and of the work of those writers generally accepted as institutionalists. Most attention is given to T. Veblen, W.H. Hamilton, W.C. Mitchell, J.R. Commons, R.G. Tugwell, and C.E. Ayres. It is argued that institutionalism grew out of the impact of evolutionism and historicism on American thought. These factors resulted in the development of the “new school” of German-influenced scholars, the work of Thorstein Veblen, and the rise of pragmatism. Institutionalism is a combination of Veblenism, pragmatism, and the ideas of new school writers such as R.T. Ely and H.C. Adams. The examination of the work of the major institutionalists reveals that while they do share a core of very general methodological and economic views, there are a number of points of significant variation. It is also noticeable that the economic theories that institutionalism contains are not rigorously developed and contain many weaknesses. The thesis contends that institutionalism can best be seen as a broad movement containing within itself a number of distinguishable “wings,” “groups,” or “traditions.” Its failure to develop a greater degree of coherence and more satisfactory theoretical ideas is attributed to the problems inherent in the epistemological and methodological positions adopted by its members.
263

Preferences as a determinant of the optimal level of decentralisation in health care resource allocation : theoretical insights and an empirical application

Quintal, Carlota Maria Miranda January 2006 (has links)
From an economic point of view, decentralisation is expected to increase social welfare through better matching of service delivery to preferences. Preferences have been a central piece of the economic rationales for decentralisation, but only indirectly. Thus, at the theoretical level, the main question addressed in this dissertation is: might preferences in themselves influence the impact of decentralisation on allocative efficiency, in the context of health care resource allocation? Regardless of which model (public choice theory or principal-agent theory) is used to explain the positive outcome mentioned above, the benefits generated by decentralisation depend on the assumption of variation in preferences across jurisdictions. However, there is little empirical evidence regarding this matter. Consequently, at the empirical level, the main question addressed in the current work is: does geographic variation in preferences, in the context of health care resource allocation, exist? To answer this question we developed and administered the same questionnaire (eliciting preferences) to two independent samples drawn from two Portuguese municipalities. Within our framework, central and local decision-makers are seen as alternative agents acting on behalf of local populations. Given the different capabilities possessed by the agents, decentralisation of resource allocation generates trade-offs between objectives. Depending on the trade-offs that local populations are willing to make, they will be better off with one or the other agent. We therefore conclude that the specific preferences held by individuals might in themselves determine whether or not decentralisation is optimal, when compared to centralisation.
Concerning the empirical work, the principal conclusion is that the results do not corroborate the hypothesis of geographic variation in preferences, meaning that the theoretical discussion about the impact of decentralisation on allocative efficiency might have to be revisited. The empirical results further suggest that the geographical dimension of (in)equality in treatment matters to people and that a maximum opportunity cost of equality, in terms of health gain foregone, is likely to exist.
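One standard way to test the hypothesis of geographic variation in stated preferences between two independent samples is a Pearson chi-square test on a contingency table of responses. The sketch below implements the statistic from scratch; the response counts and the single three-category question are made-up illustrations, not the thesis's questionnaire or data.

```python
def chi2_statistic(table):
    """Pearson chi-square statistic for a contingency table whose rows are
    the two municipal samples and whose columns are answer categories."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Made-up response counts to one preference question, three categories:
table = [[30, 50, 20],   # municipality A
         [28, 52, 20]]   # municipality B
stat = chi2_statistic(table)
# df = (2 - 1) * (3 - 1) = 2; the 5% critical value is about 5.99, so a
# statistic this small would give no evidence of geographic variation,
# in line with the thesis's empirical conclusion.
```

With similar response distributions in the two rows, the statistic stays well below the critical value, which is the pattern the thesis reports for its two municipal samples.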
264

Essays on non-linear aggregation in macroeconomics

Moretti, Gianluca January 2006 (has links)
In this PhD thesis I investigate the implications of heterogeneity and aggregation in macroeconomic models. The importance of aggregation lies in the fact that when heterogeneity is allowed, we cannot expect macro models to have the same characteristics as the underlying micro models. In particular, a direct consequence of aggregation is that the dynamic properties of the micro model do not in general hold for the macro model. Despite this problem, modern macroeconomics tends to model aggregate data alone, through the construction of models where the individual consumer or firm is related to aggregate data under the guise of a 'representative agent'. In this thesis, I present a heterogeneous real business cycle model in which I allow for cross-sectional heterogeneity in the dynamics of firm productivities. I show that heterogeneity allows the model to generate very persistent dynamics that impressively mimic those of actual data. This is because the dynamics of the model are now the result of the interactions between heterogeneous firms. Another problem that often arises with heterogeneity is that, through aggregation, the dynamics that describe the co-movements between two variables can be more persistent and complex than the dynamics observed for individual behaviour. Standard co-integration techniques are not able to deal with such persistent co-movements since they cannot distinguish between persistent deviations from the equilibrium and spurious relations. Therefore, many intuitive economic relations are often empirically rejected. To this end, I introduce in the thesis a methodology which can test robustly for co-integration between two variables which deviate persistently from their long-run equilibrium. I test for co-integration in the Uncovered Interest Parity and the Purchasing Power Parity with my approach and, unlike the standard approaches, it does not reject the hypothesis that they hold in the long run.
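The core aggregation point, that a sum of heterogeneous AR(1) processes is more persistent than the typical individual process, can be illustrated with a small closed-form calculation. For independent AR(1) processes with unit innovation variance, the aggregate's lag-k autocorrelation is a variance-weighted average of the individual beta**k terms, which upweights the most persistent units. The persistence coefficients below are illustrative assumptions, not the thesis's estimates.

```python
def aggregate_autocorr(betas, lag):
    """Lag-k autocorrelation of a sum of independent AR(1) processes with
    unit innovation variance: autocovariances beta**lag / (1 - beta**2)
    summed across processes, divided by the summed variances."""
    var = [1.0 / (1.0 - b * b) for b in betas]
    cov = [v * b ** lag for v, b in zip(var, betas)]
    return sum(cov) / sum(var)

betas = [0.1, 0.5, 0.95]                      # heterogeneous micro persistence
micro_mean = sum(b ** 5 for b in betas) / 3   # average individual lag-5 autocorr
macro = aggregate_autocorr(betas, 5)          # aggregate lag-5 autocorr
# The aggregate autocorrelation exceeds the average individual one because
# the high-persistence process dominates the aggregate's variance.
```

The same mechanism is why, as the abstract notes, aggregate co-movements can be far more persistent than any individual relationship, defeating standard co-integration tests.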
265

Essays in applied and psychological game theory : co-operation, corruption, and political economy

Balafoutas, Loukas January 2009 (has links)
The first chapter of the thesis applies game theory in order to examine the question of income redistribution from a new angle. In particular, it considers a mechanism of patron-client relationships which enables the rich class to limit the extent of redistributive taxation. In effect, the aim of patronage is to “buy” the votes of some poor citizens and lower the demand for redistribution. Income tax rates are further shown to depend negatively on government corruption in the form of fund capture, provided that a democratic regime is in place and the government cares about re-election. This link is tested empirically using cross country data and the evidence is consistent with the predictions of the model. The second and third chapters shift the focus of attention towards the process of decision making in games and the role of emotions in this process. Corruption is considered in detail as the outcome of a co-operation game between two players, with a third player (or “third party”) having a stake in the outcome of the game but no opportunity to take any direct action. This situation is analysed using psychological game theory. Players’ utility functions are extended to include beliefs and the emotions that these generate. In the theoretical model that makes up the second chapter the emotion of interest is guilt and it is conditioned on the beliefs of the third party. The two players are then less likely to collude if they believe that the third party expects a favourable outcome for herself. The model solves for the conditions under which collusion emerges in equilibrium. The main assumption of the model (i.e. the role of belief in decision making) as well as some of its predictions is then tested using an economic experiment in the third and final chapter of the thesis. 
The experimental findings strongly support the impact of beliefs on the incidence of collusion: the perceived expectations of the third party about the outcome of the game appear to be the most significant factor that determines the outcome itself.
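The guilt mechanism described above can be sketched as a simple decision rule in the spirit of guilt aversion in psychological games (Battigalli and Dufwenberg): a player weighs the material gain from colluding against a guilt cost that grows with the third party's believed expectations. All names, functional forms, and numbers below are hypothetical, not the thesis's model.

```python
def collude(bribe, honest_payoff, guilt_sensitivity, believed_expectation):
    """Guilt-aversion sketch: collude only if the bribe, net of a guilt
    cost proportional to how strongly the third party is believed to
    expect a favourable outcome, exceeds the honest payoff."""
    guilt = guilt_sensitivity * believed_expectation
    return bribe - guilt > honest_payoff

# Holding the bribe fixed, stronger believed third-party expectations
# make collusion less attractive:
decisions = [collude(10.0, 5.0, 1.5, e) for e in (0.0, 2.0, 4.0)]
```

The comparative static mirrors the chapter's finding: the more the players believe the third party expects a favourable outcome for herself, the less likely they are to collude.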
266

Developing retail performance measurement and financial distress prediction systems by using credit scoring techniques

Hu, Yu-Chiang January 2007 (has links)
The current research develops a theoretical framework based on the Resource-Advantage Theory of Competition (Hunt, 2000) for the selection of appropriate variables. Using a review of the literature as well as interviews and a survey, 170 potential retail performance variables were identified as candidates for inclusion in the model. To produce a relatively simple model and avoid over-fitting, a limited number of key variables or principal components were selected to predict default. Five credit-scoring techniques, namely Naïve Bayes, Logistic Regression, Recursive Partitioning, Artificial Neural Network and Sequential Minimal Optimization (SMO), were employed on a sample of 195 healthy and 51 distressed businesses from the USA over five periods: 1994-1998, 1995-1999, 1996-2000, 1997-2001 and 1998-2002. Analyses provide sufficient evidence that the five credit-scoring methodologies have sound classification ability in the year before financial distress. However, it is difficult to conclude that any one modelling technique uniformly has the highest classification ability, since model performance varied across time scales. The analysis also showed that external environmental influences do impact on default assessment for all five credit-scoring techniques, but these influences are weak. These findings indicate that the developed models are theoretically sound. There is, however, a need to compare their performance to other approaches. First, rankings from the study were compared with those from a standard rating system, the well-established Moody’s Credit Rating. It is assumed that the higher the degree of similarity between the two sets of rankings, the greater the credibility of the prediction model. The results indicated that the logistic regression model and the SMO model were the most comparable with Moody’s. Secondly, the model’s performance was assessed by applying it to different geographical areas.
The USA model was therefore applied to European and Japanese markets. Results indicated that all market models displayed similar discriminating ability one year prior to financial distress. However, the USA model performed relatively better than the European and Japanese models five years before financial distress. This implied that a financial distress model has potentially better prediction ability when based on a single market. It was decided to explore the performance of a generic global model, since model construction is time-consuming and costly. A composite model was constructed by combining data from the USA, European and Japanese markets. This composite model had sound prediction performance even up to five years before financial distress, as the accuracy rate was above 85% and the AUROC value was above 0.72. Compared with the original USA model, the composite model has similar prediction performance in terms of the accuracy rate. However, the composite model presented a worse prediction utility based on the AUROC value.
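The AUROC criterion used here to compare models can be computed directly from risk scores via the rank (Mann-Whitney) formulation, without tracing out the ROC curve. The sketch below uses toy scores and labels, not the study's data.

```python
def auroc(scores, labels):
    """AUROC via the Mann-Whitney rank formulation: the probability that a
    randomly chosen distressed firm (label 1) receives a higher risk score
    than a randomly chosen healthy firm (label 0), counting ties as half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 * (p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical risk scores: distressed firms (1) mostly score higher
labels = [1, 1, 1, 0, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.7, 0.3, 0.2, 0.1]
auc = auroc(scores, labels)  # 11 of 12 distressed/healthy pairs correctly ordered
```

An AUROC of 0.5 corresponds to random ranking and 1.0 to perfect separation, which is why a composite-model value above 0.72 five years out counts as sound, if imperfect, discrimination.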
267

Three essays on bounded rationality and individual learning in repeated games

Whitehead, Duncan January 2009 (has links)
In the first chapter, we revisit the El Farol bar problem developed by Brian W. Arthur (1994) to investigate how one might best model bounded rationality in economics. We begin by modelling the El Farol bar problem as a market entry game and describing its Nash equilibria. Then, assuming agents are boundedly rational in accordance with a reinforcement learning model, we analyse long-run behaviour in the repeated game. In a single population of individuals playing the El Farol game, reinforcement learning predicts that the population is eventually subdivided into two distinct groups: those who invariably go to the bar and those who almost never do. We demonstrate that reinforcement learning predicts sorting in the El Farol bar problem. The second chapter considers the long-run behaviour of agents learning in finite population games with random matching. In particular we study finite population games composed of anti-coordination pair games. We find the set of conditions on the payoff matrix of the two-player game that ensures the existence of strict pure strategy equilibria in the finite population game. Furthermore, we suggest that if the population is sufficiently large and the two-player pair games meet certain criteria, then the long-run behaviour of individuals, learning in accordance with the Erev and Roth (1998) reinforcement model, asymptotically converges to pure strategy profiles of the population game. The third chapter investigates some of the theoretical predictions of learning theory in anti-coordination finite population games with random matching through laboratory experiments in economics. Previous experimental work on anti-coordination games has focused on aggregate behaviour and provides evidence that outcomes mimic the mixed strategy equilibrium.
Here we show that in finite population anti-coordination games, reinforcement learning predicts sorting; that is, in the long run, agents play pure strategy equilibria where subsets of the population permanently play each available action.
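A minimal simulation conveys the flavour of reinforcement learning in an El Farol-style market entry game: each agent holds propensities over the two actions, chooses in proportion to them, and reinforces the chosen action by its realised payoff, as in Erev-Roth style models. The capacity, payoffs, and update rule below are illustrative assumptions, not the thesis's specification.

```python
import random

def simulate_el_farol(n_agents=100, capacity=60, rounds=2000, seed=7):
    """Reinforcement learning in a market entry game: action 0 is 'go to
    the bar', action 1 is 'stay home'. Going pays well only when total
    attendance is at or below capacity; staying pays a modest sure amount.
    Returns each agent's long-run probability of going."""
    rng = random.Random(seed)
    prop = [[1.0, 1.0] for _ in range(n_agents)]  # [go, stay] propensities
    for _ in range(rounds):
        choices = []
        for p in prop:
            total = p[0] + p[1]
            choices.append(0 if rng.random() < p[0] / total else 1)
        attendance = choices.count(0)
        for i, c in enumerate(choices):
            if c == 0:
                payoff = 1.0 if attendance <= capacity else 0.1
            else:
                payoff = 0.5
            prop[i][c] += payoff  # reinforce the chosen action
    return [p[0] / (p[0] + p[1]) for p in prop]

go_probs = simulate_el_farol()
```

Inspecting `go_probs` after a long run shows individual probabilities drifting away from the symmetric mixed strategy, the mechanism behind the sorting prediction, even though aggregate attendance hovers near capacity.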
268

Adam Smith's sociological economics

Reisman, David A. January 1972 (has links)
The study argues that Adam Smith sought to use economic forces to bring about social change. Consumer goods yield utility not in themselves but as symbols in the process of social interaction; and thus it would be meaningless to advocate economic growth to increase the supply of these symbols, particularly if economic growth altered the social structure and changed the nature of the symbols needed. Nor can growth be attributed to instinct and man's character, since character was the result of economic change as well as the cause of it. So were norms and values; having dismissed revealed religion, natural law, and an absolute standard of ethics, Smith had no choice but to approximate morality to aesthetics and science, explaining all three in terms of the propriety of habitual associations. Moreover, this perception was to be sensory, not rational or intellectual. The mind is the prisoner of the body, and the body of the situation. The only way to make such circular causation into meaningful teleology is to introduce an outside factor. To Smith this was institutional change. Economic growth was welcomed as it would liberalise the state, reduce the temporal power of aristocracy and clergy, combat "superstition" with science, encourage learning and humanitarianism, and raise the living standards of the masses by improving their bargaining power. All of these goals could have been attained by political revolution; but Smith, fearing violence and favouring social continuity, preferred the compromise that economic and social revolution represented.
269

Measuring capabilities : an empirical investigation of the Sen-Nussbaum approach to well-being

Hunter, William Mitchell Graham January 2010 (has links)
This thesis argues that Amartya Sen's capabilities approach is preferable for the measurement of welfare, by addressing three questions: Can the capability approach be operationalised? What is the relationship between capabilities and satisfaction with life? How do capabilities respond to changes over time? Chapter 1 provides a discussion of a widely used economic evaluation model of welfare, focussing on some of its key problems, and concludes with a discussion of Sen's alternative capabilities approach. Chapter 2 discusses the three key relationships that Sen uses in evaluating wellbeing and discusses the identification of capabilities based on the account developed by Martha Nussbaum.
270

Stochastic investment models for actuarial use in the UK

Sahin, Sule January 2010 (has links)
The objective of this thesis is to construct a stochastic term structure model for actuarial use in the UK. The starting point of this study is the Wilkie investment model (1995). We review the Wilkie model by updating the data and re-estimating the parameters. Then, we focus on the interest rate part of the model and construct a model for the entire term structure. We model the UK nominal spot rates, real spot rates and implied inflation spot rates considering the linkage between their term structures and some macroeconomic variables, in particular, realised inflation and the output gap. We fit a descriptive yield curve model proposed by Cairns (1998) to fill the missing values in the yield curve data provided by the Bank of England, changing the fixed parameters (exponential rates) in the model to find the best set of parameters for each data set. Once the Cairns model is fitted to the UK yield curves, we apply principal component analysis (PCA) to the fitted values to reduce the dimension of the data by extracting uncorrelated variables. We find three principal components which correspond roughly to ‘level’, ‘slope’ and ‘curvature’ for each yield curve. We explore the bi-directional relations between these principal components and the macroeconomic variables to construct ‘yield-only’ and ‘yield-macro’ models. We also compare the ‘yield-macro’ model with the Wilkie model.
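The level/slope/curvature finding can be reproduced on synthetic data: build curves as random combinations of three factor shapes plus noise, then run PCA via an SVD of the centred data matrix. The maturities, factor shapes, loadings, and noise level below are invented stand-ins for the Bank of England yield curve data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Three classic term-structure factor shapes over 30 maturities (assumed):
maturities = np.linspace(1, 25, 30)
level = np.ones_like(maturities)
slope = (maturities - maturities.mean()) / maturities.std()
curvature = slope ** 2 - (slope ** 2).mean()

# 500 synthetic curves: random factor loadings plus small noise
loadings = rng.normal(size=(500, 3)) * [1.0, 0.5, 0.2]
curves = loadings @ np.vstack([level, slope, curvature]) \
         + 0.01 * rng.normal(size=(500, 30))

# PCA via SVD of the centred data matrix
centred = curves - curves.mean(axis=0)
_, s, _ = np.linalg.svd(centred, full_matrices=False)
explained = s ** 2 / (s ** 2).sum()  # variance share of each component
```

Because the data were built from three factors, the first three components absorb essentially all the variance, mirroring the dimension reduction the thesis applies before linking components to macroeconomic variables.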
