101

Applications of spline functions in economics

Poirier, Dale J. January 1900 (has links)
Thesis (Ph. D.)--University of Wisconsin--Madison, 1973. / Typescript. Vita. eContent provider-neutral record in process. Description based on print version record. Includes bibliography.
102

Robustness and information processing constraints in economic models

Lewis, Kurt Frederick. January 2007 (has links)
Thesis (Ph. D.)--University of Iowa, 2007. / Supervisor: Charles H. Whiteman. Includes bibliographical references (leaves 134-137).
103

Railroads, Their Regulation, and Its Effect on Efficiency and Competition

McKenzie, Taylor 06 September 2017 (has links)
Railroads have been subject to federal regulation since 1887. Due to the development of competing modes of transportation and changes in the types of products being shipped, regulation began to impede the efficiency and viability of firms, leading to partial deregulation of the industry in 1980. Partial deregulation allowed railroads to reduce costs, notably through mergers and line abandonment, which were aggressively pursued following deregulation and led to dramatic efficiency gains. However, concerns remain over increased consolidation, lack of competition in the industry, and the ability of firms to continue to realize efficiency gains. This dissertation investigates more recent developments in the rail industry with an eye towards regulation's effect and role. I begin with a study of the markups of price over marginal cost and elasticities of scale in the rail industry. Scale elasticities provide information on where firms are operating on their average cost curves, and markups provide a more theoretically appealing method of examining pricing behavior than the revenue-to-variable-cost measure currently used by regulators. I extend previously developed methods to identify markups and scales for each firm and in each year. I find prices well in excess of marginal cost, and evidence that firms are operating near minimum efficient scale, indicating efficiency gains from deregulation may be fully realized. I then present a study that examines productivity changes in the rail industry and the role of technological change. I extend stochastic frontier frameworks to allow productivity and the state of technology to evolve flexibly through time and vary across firms. I find that firms turn towards technological innovation to realize productivity gains when other channels previously offered by deregulation are not available. I finish with a study of allocative errors in the rail industry. I again extend a stochastic frontier model to include differences in production across firms and allow allocative errors to be correlated with competitive pressures. I find that incorporating flexibility into the description of firm production is crucial for obtaining unbiased estimates of allocative errors, that overcapitalization is prevalent in the rate-regulated rail industry, and that additional competition does not appear to reduce inefficiency. This dissertation includes unpublished co-authored material.
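The two measures named in this abstract have standard textbook definitions; as a sketch (the dissertation's own specification is not reproduced here), the markup and the scale elasticity for firm i in year t can be written as:

```latex
\mu_{it} = \frac{P_{it}}{MC_{it}},
\qquad
\varepsilon^{S}_{it} = \left( \frac{\partial \ln C_{it}}{\partial \ln y_{it}} \right)^{-1}
```

Here \mu_{it} is the markup of price over marginal cost and \varepsilon^{S}_{it} is the scale elasticity, the inverse of the cost elasticity of output; a value near one indicates operation near minimum efficient scale. The regulatory screen mentioned above is instead the simple revenue-to-variable-cost ratio R_{it}/VC_{it}.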
104

The economic benefits of visitor spending for local communities in Great Britain : an examination of the development, application and main findings of proportional multiplier analysis

Vaughan, David Roger January 1988 (has links)
No description available.
105

A post-Keynesian macroeconomic theory for equity markets in stock-flow consistent frameworks

Lopez Bernardo, Javier January 2015 (has links)
This thesis presents a theoretical framework for understanding the long-term behaviour of equity markets. The framework is informed by post-Keynesian theory. It highlights the importance of effective demand for equity valuation - alongside other post-Keynesian features such as a realistic institutional setup, the (in)efficiency of financial markets in pricing assets and the importance of income and wealth distribution for macroeconomic theory. In contrast to mainstream approaches dominated and constrained only by the logic of rational agents, a Stock-Flow Consistent (SFC) methodology is followed here. The strict accounting rules of SFC models guarantee that all assets, flows and price revaluations that happen in an economic system are booked accordingly, with no accounting 'black holes' in the logical structure. The SFC approach also permits an outcome in which the market value of assets differs from their book value, a crucial distinction that should be at the core of any theory for equity returns. This thesis makes a contribution to the post-Keynesian literature on the Cambridge corporate growth models. It is shown that this literature can be used as a starting point for developing a theory of equity markets with a more realistic institutional setup. The main features of the post-Keynesian theory for equity markets developed here can be summarised as follows. First, aggregate demand determines the return on shares and their valuation in the market. Second, Tobin's q is inversely related to the growth rate of the economy in the long run and inversely related to the marginal propensities to consume. Third, Tobin's q can be different from 1 even in the long run. And fourth, wealth holders' consumption decisions are a major driver of the equity yield in the long run, a feature very similar in spirit to the Levy-Kalecki profit equation, but now applied to financial markets. I conclude that post-Keynesian theory can offer an alternative to mainstream finance and fill a gap in current financial macroeconomic theory.
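For readers unfamiliar with the notation, Tobin's q referred to above is the ratio of the market valuation of firms to the replacement (book) value of their capital; a minimal statement:

```latex
q = \frac{\text{market value of firms' outstanding equity and debt}}{\text{replacement (book) value of their capital stock}}
```

The thesis's second and third results say that, in the model's long-run steady state, this ratio falls as the growth rate or the propensities to consume rise, and need not converge to one.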
106

CPU product line lifecycles: econometric duration analysis using parametric and non-parametric estimators

Fisher, Mischa 30 April 2018 (has links)
This thesis provides a comprehensive history of the statistical background and uses of survival analysis, and then applies econometric duration analysis to examine the lifecycles of product lines within the microprocessor industry. Using data from Stanford University's CPU DB, covering Intel and AMD processors introduced between 1971 and 2014, the duration analysis uses both parametric and non-parametric estimators to construct survival and hazard functions for estimated product line lifetimes within microprocessor product families. The well-known and widely applied non-parametric Kaplan-Meier estimator is applied both to the entire sample as a whole and to a segmented estimate that considers the product line lifecycles of Intel and AMD separately, yielding a median survival time of 456 days. The parametric duration analysis uses both the semi-parametric Cox proportional hazards model and the fully parametric accelerated failure time model across the Weibull, Exponential and Log-Logistic distributions. These models find a modest association between higher clock speed and transistor count and a shorter expected time in the marketplace for microprocessors, while the number of cores and other attributes have no predictive power over expected survival times. The negative effect of a processor's transistor count and clock speed on expected duration likely captures the co-trending of growth in transistor count with a larger marketplace and broader product categories. / Graduate
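A minimal sketch of the kind of duration analysis described, using the Python lifelines library; the file name, column names, and covariate list are assumptions for illustration, not the thesis's actual code or data layout.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter, WeibullAFTFitter

# Hypothetical extract from the CPU DB: one row per product line, with days on
# the market, an indicator for withdrawal, and candidate covariates.
df = pd.read_csv("cpu_lifetimes.csv")  # assumed file and layout

# Non-parametric Kaplan-Meier survival curve for the pooled sample.
kmf = KaplanMeierFitter()
kmf.fit(durations=df["days_on_market"], event_observed=df["withdrawn"])
print("Median survival time:", kmf.median_survival_time_)

covariates = ["clock_mhz", "transistors_m", "cores"]

# Semi-parametric Cox proportional hazards model.
cph = CoxPHFitter()
cph.fit(df[["days_on_market", "withdrawn"] + covariates],
        duration_col="days_on_market", event_col="withdrawn")
cph.print_summary()

# Fully parametric accelerated failure time model (Weibull); the Log-Logistic
# variant follows the same pattern with LogLogisticAFTFitter.
aft = WeibullAFTFitter()
aft.fit(df[["days_on_market", "withdrawn"] + covariates],
        duration_col="days_on_market", event_col="withdrawn")
aft.print_summary()
```

Segmenting by manufacturer, as the thesis does, amounts to running the same fits separately on the Intel and AMD subsets of the data.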
107

Good's causality for time series: a regime-switching framework

Mlambo, Farai Fredric January 2014 (has links)
Causal analysis plays a significant role in the applied sciences such as statistics, econometrics, and technometrics. In particular, probability-raising models have attracted significant research interest. Most of the discussions in this area are philosophical in nature. Today, the econometric causality theory developed by C. W. J. Granger is popular in practical, time series causal applications. While this type of causality technique has many strong features, it has serious limitations: in particular, the processes studied should be stationary and causal relationships are restricted to be linear. However, regime-switching processes cannot be classified as linear and stationary. I. J. Good proposed a probabilistic, event-type explication of causality that circumvents some of the limitations of Granger's methodology. This work uses the probability-raising causality ideology, as postulated by Good, to propose a causal analysis methodology applicable in a stochastic, non-stationary domain. A Good's causality test is proposed by transforming the originally specified probabilistic causality theory from random events to a stochastic, regime-switching framework. Methodological validation is performed via causality simulations for a Markov regime-switching model. The proposed test can be used to detect, probabilistically, whether one stochastic process is causal to the observed behaviour of another. In particular, the regime-switching causality explication proposed herein is pivotal to the results articulated. This research also examines the power of the proposed test using simulations and outlines some steps that one may take in using the test in a practical setting.
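A minimal sketch of the kind of simulation validation described: generate a two-regime Markov-switching pair of processes in which the cause's regime raises the probability of the effect entering its own high regime in the next period, then check the probability-raising condition P(E | C) > P(E | not C). This is an illustrative reconstruction, not the thesis's actual test.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-state Markov chain for the putative cause C: state 1 = "high" regime.
P_cause = np.array([[0.9, 0.1],
                    [0.2, 0.8]])

T = 100_000
c = np.zeros(T, dtype=int)
for t in range(1, T):
    c[t] = rng.choice(2, p=P_cause[c[t - 1]])

# Effect process E: its chance of entering the "high" regime next period is
# raised when the cause is currently in its "high" regime.
p_effect_given = {0: 0.2, 1: 0.6}   # P(E_{t+1}=1 | C_t)
e = np.zeros(T, dtype=int)
for t in range(T - 1):
    e[t + 1] = rng.random() < p_effect_given[c[t]]

# Probability-raising check in the regime-switching setting:
# compare P(E_{t+1}=1 | C_t=1) with P(E_{t+1}=1 | C_t=0).
p_e_given_c = e[1:][c[:-1] == 1].mean()
p_e_given_not_c = e[1:][c[:-1] == 0].mean()
print(f"P(E|C)  = {p_e_given_c:.3f}")
print(f"P(E|~C) = {p_e_given_not_c:.3f}")
print("Probability raising detected:", p_e_given_c > p_e_given_not_c)
```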
108

Consistent testing for lag length in cointegrated relationships

Liu, Limin January 2007 (has links)
In the past few decades the theory of cointegration has been widely used in the empirical analysis of economic data. The reason is that it captures the economic notion of a long-run economic relationship. One of the problems experienced when applying cointegration techniques to econometric modelling is the determination of lag lengths for the modelled variables. Applied studies have resulted in contradictory choices for lag length selection. This study reviews and compares some of the well-known information criteria using simulation techniques for bivariate models.
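A minimal sketch of how such a comparison of information criteria can be run for a bivariate system with Python's statsmodels; the data-generating step is illustrative and is not the study's simulation design.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(1)

# Illustrative bivariate series sharing a common stochastic trend,
# a stand-in for the cointegrated systems studied in the thesis.
T = 500
trend = np.cumsum(rng.normal(size=T))
y1 = trend + rng.normal(scale=0.5, size=T)
y2 = 0.8 * trend + rng.normal(scale=0.5, size=T)
data = pd.DataFrame({"y1": y1, "y2": y2})

# Compare lag lengths chosen by the usual information criteria
# (AIC, BIC, HQ, FPE) for a VAR fitted to the system.
selection = VAR(data).select_order(maxlags=8)
print(selection.summary())
print("Selected lags:", selection.selected_orders)
```

Repeating this over many simulated draws, and tabulating how often each criterion picks the true lag length, gives the kind of comparison the abstract describes.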
109

Essays in Econometrics:

Cooprider, Joseph January 2020 (has links)
Thesis advisor: Arthur Lewbel / In my doctoral research, I developed econometric estimators with strong applications in the analysis of heterogeneous consumer demand. The first chapter develops an estimator for grouped patterns of heterogeneity in an approximately sparse setting. This setting is used to estimate demand shocks, competition sets and own-price elasticities for different groups of consumers. The second chapter, which is joint work with Stefan Hoderlein and Alexander Meister, develops a nonparametric estimator of marginal effects in panel data even if there are only a small number of time periods. This is used to estimate the heterogeneous marginal effects of increasing income on consumption of junk food. The third chapter, which is joint work with Stefan Hoderlein and Solvejg Wewal, develops the first difference-in-differences model for binary choice outcome variables when treatment effects are heterogeneous. We apply this estimator to examine the heterogeneous effects of a soda tax. Chapter 1: ``Approximately Sparse Models and Methods with Grouped Patterns of Heterogeneity with an Application to Consumer Demand" introduces post-Lasso methods to time-varying grouped patterns of heterogeneity in linear panel data models with heterogeneous coefficients. Group membership is left unrestricted and the model is approximately sparse, meaning the conditional expectation of the variables given the covariates can be well approximated by a subset of the variables whose identities may be unknown. I estimate the parameters of the model using a “grouped fixed-effects” estimator that minimizes a post-Lasso least-squares criterion with respect to all possible groupings of the cross-sectional units. I provide conditions under which the estimator is consistent as both dimensions of the panel tend to infinity and provide inference methods. Under reasonable assumptions, applying this estimator to a consumer demand application allows me to partition consumers into groups, deal with price endogeneity without instrumental variables, estimate demand shocks, and identify complements and substitutes for each group. I then use this estimator to estimate demand for soda by identifying different groups' competition sets as well as demand shocks using Homescan data. Chapter 2: In ``A Panel Data Estimator for the Distribution and Quantiles of Marginal Effects in Nonlinear Structural Models with an Application to the Demand for Junk Food", we propose a framework to estimate the distribution of marginal effects in a general class of structural models that allow for arbitrary smooth nonlinearities, high dimensional heterogeneity, and unrestricted correlation between the persistent components of this heterogeneity and all covariates. The main idea is to form a derivative dependent variable using two periods of the panel, and use differences in outcome variables of nearby subpopulations to obtain the distribution of marginal effects. We establish constructive nonparametric identification for the population of ``stayers" (Chamberlain 1982), and show generic non-identification for the ``movers". We propose natural semiparametric sample-counterpart estimators, and establish that they achieve the optimal (minimax) rate. Moreover, we analyze their behavior through a Monte Carlo study, and showcase the importance of allowing for nonlinearities and correlated heterogeneity through an application to demand for junk food.
In this application, we establish profound differences in marginal income effects between poor and wealthy households, which may partially explain health issues faced by the less privileged population. Chapter 3: In ``A Binary Choice Difference-in-Differences Model with Heterogeneous Treatment Effects and an Application on Soda Taxes", we ask how difference-in-differences should be implemented when outcomes are binary and heterogeneous effects are expected. The scope for applications is clearly vast, including labor force participation, product purchase decisions, enrollment in health insurance and much more. However, the assumptions necessary to measure heterogeneous effects in classic difference-in-differences models break down with a binary dependent variable. We propose a model with a nonparametric random coefficient formulation that allows for heterogeneous treatment effects with a binary dependent variable. We provide identification of the average treatment effect on the treated (ATT) along with identification of the joint distribution of the actual and counterfactual latent outcome variables in the treatment group, which allows us to show the heterogeneous treatment effects. We suggest an estimator for the treatment effects and evaluate its finite sample properties with the help of Monte Carlo simulations. We further provide extensions that allow for more flexible empirical applications, such as including covariates. We apply our estimator to analyze the effect of a soft drink tax on consumers' likelihood to consume soda and find heterogeneous effects. The tax reduced the likelihood of consumption for most consumers but not for those who were most likely to be consuming previously. / Thesis (PhD)--Boston College, 2020. / Submitted to: Boston College. Graduate School of Arts and Sciences. / Discipline: Economics.
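A simplified sketch of the grouped fixed-effects mechanics described in Chapter 1, alternating between assigning units to the group whose time effects fit them best and re-estimating each group's time effects by least squares; the post-Lasso variable-selection step and the heterogeneous coefficients are omitted, and all names are illustrative rather than taken from the chapter.

```python
import numpy as np

def grouped_fixed_effects(Y, G, n_iter=50, seed=0):
    """Toy grouped fixed-effects estimator (no covariates, no Lasso step).

    Y : (N, T) array of outcomes for N cross-sectional units over T periods.
    G : number of groups.
    Returns estimated group labels and the (G, T) group-specific time effects.
    """
    rng = np.random.default_rng(seed)
    N, T = Y.shape
    labels = rng.integers(G, size=N)  # random initial grouping
    effects = np.zeros((G, T))
    for _ in range(n_iter):
        # Step 1: group-specific time effects are within-group period means.
        for g in range(G):
            members = labels == g
            effects[g] = Y[members].mean(axis=0) if members.any() else Y[rng.integers(N)]
        # Step 2: reassign each unit to the group minimizing its sum of squared errors.
        sse = ((Y[:, None, :] - effects[None, :, :]) ** 2).sum(axis=2)  # (N, G)
        new_labels = sse.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels, effects

# Tiny illustration with two latent groups following different time paths.
rng = np.random.default_rng(42)
true_groups = rng.integers(2, size=200)
paths = np.array([np.linspace(0.0, 1.0, 20), np.linspace(1.0, 0.0, 20)])
Y = paths[true_groups] + rng.normal(scale=0.3, size=(200, 20))
labels, _ = grouped_fixed_effects(Y, G=2)
print("Units per estimated group:", np.bincount(labels))
```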
110

The assimilation of "best-practice" econometric technology

Ferzandi, Jehanbux D. January 1984 (has links)
No description available.
