11 |
Arbetslöshetsförsäkringens finansiering: Hur påverkas arbetslöshetskassornas medlemsantal av en förhöjd grad av avgiftsfinansiering? (The financing of unemployment insurance: How is membership of the unemployment insurance funds affected by a higher degree of fee financing?) Gajic, Ruzica; Söder, Isabelle. January 2010
Since the turn of the year 2006/2007, membership of the Swedish unemployment insurance funds (a-kassor) has fallen drastically. Over the same period, several reforms have been carried out in the area of unemployment insurance, which among other things have resulted in higher membership fees for most of the funds. The purpose of this thesis is to examine whether any relationship can be found over time between changes in membership numbers and membership fees. To investigate this, one must take into account not only the fees but also other variables tied to the unemployment insurance system: the basic benefit amount, the maximum daily benefit, the replacement rate, and unemployment. We formulate a model for the relationship between membership numbers and these variables and estimate it by the Generalized Method of Moments using data from 2000-2009. In line with theory and previous research, our results show a negative relationship between membership fees and fund membership; this relationship proves to be strong, particularly in the long run. To see more clearly how fee changes affect different types of individuals to different degrees, we also examine whether membership of funds tied to white-collar and blue-collar union federations is differently sensitive to fee changes. In contrast to previous studies, our results show that the funds tied to the white-collar federations (TCO and Saco) are more sensitive to changes than those tied to the blue-collar federation (LO). This gives reason to believe that factors other than the fees and the other variables included in our model affect the membership rate and can explain the difference between the groups.
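To make the estimation strategy concrete, the following is a minimal sketch of two-step linear GMM in Python. All variable names and the instrument choice are assumptions for illustration; the thesis's actual model specification is not reproduced here.

```python
import numpy as np

def gmm_two_step(y, X, Z):
    """Two-step linear GMM for moment conditions E[Z'(y - X b)] = 0.

    y: (n,) outcome, e.g. change in log fund membership
    X: (n, k) regressors, e.g. fee, replacement rate, unemployment
    Z: (n, m) instruments with m >= k, e.g. lagged regressors
    (all names hypothetical, for illustration only)
    """
    n = len(y)
    # Step 1: identity weight matrix gives a consistent first-pass estimate
    W = np.eye(Z.shape[1])
    b1 = np.linalg.solve(X.T @ Z @ W @ Z.T @ X, X.T @ Z @ W @ Z.T @ y)
    # Step 2: re-weight with the inverse covariance of the sample moments
    u = y - X @ b1
    S = (Z * u[:, None]).T @ (Z * u[:, None]) / n
    W2 = np.linalg.inv(S)
    return np.linalg.solve(X.T @ Z @ W2 @ Z.T @ X, X.T @ Z @ W2 @ Z.T @ y)
```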
|
12 |
Do Better Institutions Alleviate the Resource Curse? Evidence from a Dynamic Panel Approach. Malebogo Bakwena. Unknown Date
Contrary to conventional theory, a growing body of evidence suggests that economies with abundant natural resources perform badly in terms of economic growth relative to their resource-poor counterparts: the so-called resource curse hypothesis. However, this general hypothesis is not robust; it clearly fails to account for the differing experiences of resource-abundant economies. For instance, the theory, applied generally, offers no explanation as to why economies like Botswana and Norway have grown exceptionally while Saudi Arabia and Nigeria have stagnated. Prompted by these experiences, the thesis investigates the circumstances under which the curse is more or less likely to exist. In particular, it finds evidence that the major reason for the diverging experiences is differences in the quality of institutions across countries. The thesis tests the hypothesis that the effect of resources on growth is conditional on the type and quality of institutions, building on Boschini, Pettersson, and Roine's (2007) and Mehlum, Moene, and Torvik's (2006b) influential works on the role of institutions in mitigating the resource curse. Advances are made by: (a) using a panel of up to 53 countries with different levels of development, institutional quality, and natural resource abundance over the period 1984-2003; (b) applying a two-step system Generalised Method of Moments (GMM) estimation that accounts for the biases from omitted variables, endogeneity, and unobserved heterogeneity that potentially affect existing cross-country Ordinary Least Squares (OLS) growth results; (c) supplementing results from the commonly used International Country Risk Guide (ICRG) institutional performance indicators with institutional design indicators, that is, highlighting the role of electoral rules and form of government; (d) using an institutional quality measure that relates to financial institutions rather than only economic or political institutions; (e) using a resource abundance indicator that covers non-renewable resources alone, rather than the measures common in the literature that inappropriately include renewable resources. The key hypothesis, that natural resource economies are not destined to be cursed if they have good institutions, is confirmed by the empirical results. Specifically, the results suggest that: (a) adopting a democratic regime is better than a non-democratic one in terms of generating growth from resource abundance; (b) the electoral rules a country adopts matter, i.e. a democratic proportional rather than a democratic majoritarian regime increases the growth benefits of resource abundance; (c) as regards the form of government, a democratic parliamentary rather than a democratic presidential regime generates more economic growth from abundant natural resources; (d) a well-functioning banking sector induces more resource-generated growth and capital accumulation. The lessons for policy makers struggling to overcome the impediments to economic development that can accompany the "curse of resource abundance" are therefore to develop and maintain better institutions and to adopt improved strategies for managing the financial proceeds of such abundance.
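As background to the thesis's two-step system GMM, the sketch below implements only the simplest ancestor of that estimator: Anderson-Hsiao instrumental variables on a first-differenced dynamic panel. The panel layout and variable names are assumptions for illustration; system GMM itself stacks many more moment conditions in levels and differences.

```python
import numpy as np

def anderson_hsiao(y, x):
    """IV estimate of (rho, beta) in dy_it = rho*dy_i,t-1 + beta*dx_it + de_it.

    y, x: (T, N) arrays, years by countries (hypothetical layout).
    First-differencing removes country fixed effects; y_i,t-2 is then a
    valid instrument for the endogenous lagged difference dy_i,t-1.
    """
    dy    = (y[2:] - y[1:-1]).ravel()   # dependent: growth differences
    dyl   = (y[1:-1] - y[:-2]).ravel()  # endogenous lagged difference
    dx    = (x[2:] - x[1:-1]).ravel()   # differenced regressor
    ylag2 = y[:-2].ravel()              # instrument for dyl
    X = np.column_stack([dyl, dx])
    Z = np.column_stack([ylag2, dx])    # dx instruments itself
    # Just-identified IV: b = (Z'X)^{-1} Z'y
    return np.linalg.solve(Z.T @ X, Z.T @ dy)
```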
|
15 |
Correlated GMM Logistic Regression Models with Time-Dependent Covariates and Valid Estimating Equations. January 2012
When analyzing longitudinal data, it is essential to account both for the correlation inherent in the repeated measures of the responses and for the correlation arising from the feedback between the responses at a particular time and the predictors at other times. A generalized method of moments (GMM) approach for estimating the coefficients in longitudinal data models is presented. The appropriate and valid estimating equations associated with the time-dependent covariates are identified, providing substantial gains in efficiency over generalized estimating equations (GEE) with the independent working correlation. Identifying the estimating equations to be used in computation is of utmost importance, and this work provides a technique for doing so through a general method of moments. I develop an approach that makes use of all the valid estimating equations associated with each time-dependent and time-independent covariate. Moreover, my approach does not assume that feedback is always present over time, or present to the same degree. I fit the GMM correlated logistic regression model in SAS with PROC IML. I examine two datasets for illustrative purposes: rehospitalization in a Medicare database, and data on the relationship between body mass index and future morbidity among children in the Philippines. These datasets allow comparison of my results with some earlier methods of analysis. / Dissertation/Thesis / Arizona Medicare Data on Rehospitalization / Philippine Data on Children's Morbidity / M.S. Statistics 2012
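A minimal sketch of the underlying computation, fitting logistic coefficients by minimizing a quadratic form in a set of moment conditions, is given below in Python rather than SAS. Which covariate-time moment conditions enter Z is exactly what the thesis determines; here Z is assumed given, and an identity weight matrix is used for simplicity.

```python
import numpy as np
from scipy.optimize import minimize

def gmm_logistic(y, X, Z):
    """GMM logistic regression sketch.

    y: (n,) binary outcomes; X: (n, k) covariates;
    Z: (n, m) columns encoding the valid estimating equations
    (assumed already identified, m >= k).
    """
    n = len(y)

    def gbar(b):
        p = 1.0 / (1.0 + np.exp(-X @ b))   # logistic mean
        return Z.T @ (y - p) / n           # sample moment conditions

    # Minimize gbar(b)' W gbar(b) with W = I (one-step GMM)
    obj = lambda b: gbar(b) @ gbar(b)
    return minimize(obj, np.zeros(X.shape[1]), method="BFGS").x
```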
|
16 |
Three Essays on Correlated Binary Outcomes: Detection and Appropriate Models. January 2018
Correlation is common in many types of data, including those collected through longitudinal studies or in a hierarchical structure. In the case of clustering, or repeated measurements, there is inherent correlation between observations within the same group, or between observations obtained on the same subject. Longitudinal studies also introduce association between the covariates and the outcomes across time. When multiple outcomes are of interest, association may exist between the various models. These correlations can lead to issues in model fitting and inference if not properly accounted for. This dissertation presents three papers discussing appropriate methods for handling different types of association. The first paper introduces an ANOVA-based measure of intraclass correlation for three-level hierarchical data with binary outcomes, together with its properties. This measure is useful for evaluating when the correlation due to clustering warrants a more complex model; it is used to investigate AIDS knowledge in a clustered study conducted in Bangladesh. The second paper develops the Partitioned generalized method of moments (Partitioned GMM) model for longitudinal studies. This model utilizes valid moment conditions to separately estimate the varying effects of each time-dependent covariate on the outcome over time, using multiple coefficients. The model is fit to data from the National Longitudinal Study of Adolescent to Adult Health (Add Health) to investigate risk factors for childhood obesity. In the third paper, the Partitioned GMM model is extended to jointly estimate regression models for multiple outcomes of interest, taking into account both the correlation between the multivariate outcomes and the correlation due to time dependence in longitudinal studies. The model utilizes an expanded weight matrix and an objective function composed of valid moment conditions to simultaneously estimate optimal regression coefficients. This approach is applied to Add Health data to simultaneously study drivers of outcomes including smoking, social alcohol usage, and obesity in children. / Dissertation/Thesis / Doctoral Dissertation Statistics 2018
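As background to the first paper's theme, here is a sketch of the classic one-way ANOVA intraclass correlation estimator for clustered binary data; the dissertation's three-level measure is more elaborate, so this two-level version is illustrative only.

```python
import numpy as np

def anova_icc(groups):
    """One-way ANOVA intraclass correlation for clustered 0/1 data.

    groups: list of 1-D arrays of binary outcomes, one array per cluster.
    Uses the standard estimator (MSB - MSW) / (MSB + (n0 - 1) * MSW).
    """
    k = len(groups)
    sizes = np.array([len(g) for g in groups], dtype=float)
    N = sizes.sum()
    grand = np.concatenate(groups).mean()
    means = np.array([g.mean() for g in groups])
    msb = np.sum(sizes * (means - grand) ** 2) / (k - 1)
    msw = sum(((g - m) ** 2).sum() for g, m in zip(groups, means)) / (N - k)
    n0 = (N - (sizes ** 2).sum() / N) / (k - 1)   # effective cluster size
    return (msb - msw) / (msb + (n0 - 1) * msw)
```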
|
17 |
Efficient Methods for Prediction and Control in Partially Observable Environments. Hefny, Ahmed. 01 April 2018
State estimation and tracking (also known as filtering) is an integral part of any system performing inference in a partially observable environment, whether it is a robot that is gauging an environment through noisy sensors or a natural language processing system that is trying to model a sequence of characters without full knowledge of the syntactic or semantic state of the text. In this work, we develop a framework for constructing state estimators. The framework consists of a model class, referred to as predictive state models, and a learning algorithm, referred to as two-stage regression. Our framework is based on two key concepts: (1) predictive state: where our belief about the latent state of the environment is represented as a prediction of future observation features and (2) instrumental regression: where features of previous observations are used to remove sampling noise from future observation statistics, allowing for unbiased estimation of system dynamics. These two concepts allow us to develop efficient and tractable learning methods that reduce the unsupervised problem of learning an environment model to a supervised regression problem: first, a regressor is used to remove noise from future observation statistics. Then another regressor uses the denoised observation features to estimate the dynamics of the environment. We show that our proposed framework enjoys a number of theoretical and practical advantages over existing methods, and we demonstrate its efficacy in a prediction setting, where the task is to predict future observations, as well as a control setting, where the task is to optimize a control policy via reinforcement learning.
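A minimal sketch of the two-stage regression idea, with ridge regression standing in for both stages and all feature matrices assumed precomputed:

```python
import numpy as np

def ridge(A, B, lam=1e-3):
    """Solve min_W ||A W - B||^2 + lam ||W||^2 in closed form."""
    d = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(d), A.T @ B)

def two_stage_regression(hist, fut, fut_next):
    """Two-stage regression sketch for predictive state models.

    hist:     (n, dh) features of past observations (the instruments)
    fut:      (n, df) features of future observations
    fut_next: (n, df) the same future features shifted one step ahead
    """
    # Stage 1 (instrumental regression): regress future statistics on
    # history to strip out sampling noise uncorrelated with the past.
    W1  = ridge(hist, fut)
    W1n = ridge(hist, fut_next)
    state  = hist @ W1      # denoised predictive states
    target = hist @ W1n
    # Stage 2: learn the dynamics mapping each state to its successor.
    W2 = ridge(state, target)
    return W1, W2
```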
|
18 |
Parametric deconvolution for a common heteroscedastic case. Rutikanga, Justin Ushize. January 2016
Magister Scientiae - MSc / There exists an extensive statistics literature dealing with non-parametric deconvolution, the estimation of the underlying population probability density when sample values are subject to measurement errors. In parametric deconvolution, on the other hand, the data are known to come from a specific distribution, whose parameters can then be estimated by, e.g., maximum likelihood. In realistic cases the measurement errors may be heteroscedastic and there may be unknown parameters associated with the distribution. A specific realistic case is investigated here, in which the measurement error standard deviation is proportional to the true sample values. In this case it is shown that method-of-moments estimation is particularly simple. Estimation by maximum likelihood, in contrast, is computationally very expensive: a convolution integral must be evaluated numerically for each data point at every evaluation of the likelihood function, and this must be repeated at each iteration towards the solution. Method-of-moments estimation sometimes fails to give physically meaningful estimates; the origin of this problem lies in the large sampling variation of the third moment, and possible remedies are considered. New preliminary work suggests that saddle-point approximations could sometimes be used for the convolution integrals, which allows much larger datasets to be dealt with. Application of the theory is illustrated with simulated and real data.
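To illustrate why the moments are simple here, suppose the observed value is X = T(1 + sZ) with Z standard normal, so that the error standard deviation is proportional to the true value T. Then E[(1+sZ)^2] = 1 + s^2 and E[(1+sZ)^3] = 1 + 3s^2, and three raw moments identify the parameters. The sketch below assumes, purely for illustration, a gamma distribution for T; the thesis's actual parametric family may differ.

```python
import numpy as np
from scipy.optimize import fsolve

def mom_deconvolution(x):
    """Method-of-moments sketch for X = T*(1 + s*Z), Z ~ N(0, 1),
    with T ~ Gamma(k, theta) assumed for illustration only.

    Moment equations:
      E[X]   = k*theta
      E[X^2] = k*(k+1)*theta^2 * (1 + s^2)
      E[X^3] = k*(k+1)*(k+2)*theta^3 * (1 + 3*s^2)
    The third moment's large sampling variation is what can make the
    solve fail to return physically meaningful values.
    """
    m1, m2, m3 = (np.mean(x ** p) for p in (1, 2, 3))

    def eqs(p):
        k, th, s = p
        return (k * th - m1,
                k * (k + 1) * th ** 2 * (1 + s ** 2) - m2,
                k * (k + 1) * (k + 2) * th ** 3 * (1 + 3 * s ** 2) - m3)

    return fsolve(eqs, x0=(2.0, m1 / 2.0, 0.2))
```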
|
19 |
Parameter Estimation for the Beta Distribution. Owen, Claire Elayne Bangerter. 20 November 2008
The beta distribution is useful in modeling continuous random variables that lie between 0 and 1, such as proportions and percentages. The beta distribution takes on many different shapes and may be described by two shape parameters, alpha and beta, that can be difficult to estimate. Maximum likelihood and method of moments estimation are possible, though method of moments is much more straightforward. We examine both of these methods here, and compare them to three more proposed methods of parameter estimation: 1) a method used in the Program Evaluation and Review Technique (PERT), 2) a modification of the two-sided power distribution (TSP), and 3) a quantile estimator based on the first and third quartiles of the beta distribution. We find the quantile estimator performs as well as maximum likelihood and method of moments estimators for most beta distributions. The PERT and TSP estimators do well for a smaller subset of beta distributions, though they never outperform the maximum likelihood, method of moments, or quantile estimators. We apply these estimation techniques to two data sets to see how well they approximate real data from Major League Baseball (batting averages) and the U.S. Department of Energy (radiation exposure). We find the maximum likelihood, method of moments, and quantile estimators perform well with batting averages (sample size 160), and the method of moments and quantile estimators perform well with radiation exposure proportions (sample size 20). Maximum likelihood estimators would likely do fine with such a small sample size were it not for the iterative method needed to solve for alpha and beta, which is quite sensitive to starting values. The PERT and TSP estimators do more poorly in both situations. We conclude that in addition to maximum likelihood and method of moments estimation, our method of quantile estimation is efficient and accurate in estimating parameters of the beta distribution.
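The method-of-moments step described above is short enough to state in full; a sketch using the standard beta moment identities (the quantile, PERT, and TSP estimators are not reproduced here):

```python
import numpy as np

def beta_mom(x):
    """Method-of-moments estimates of the beta shape parameters.

    Matches the sample mean and variance to
      m = a / (a + b),   v = a*b / ((a + b)^2 * (a + b + 1)).
    """
    m, v = np.mean(x), np.var(x, ddof=1)
    c = m * (1 - m) / v - 1       # solves for a + b
    return m * c, (1 - m) * c     # (alpha_hat, beta_hat)
```

Note that the estimator requires v < m(1 - m), which any sample plausibly drawn from a beta distribution will satisfy.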
|
20 |
Parameter Estimation for the Lognormal Distribution. Ginos, Brenda Faith. 13 November 2009
The lognormal distribution is useful in modeling continuous random variables which are greater than zero. Example scenarios in which the lognormal distribution is used include, among many others: in medicine, latent periods of infectious diseases; in environmental science, the distribution of particles, chemicals, and organisms in the environment; in linguistics, the number of letters per word and the number of words per sentence; and in economics, age at marriage, farm size, and income. The lognormal distribution is also useful in modeling data which would be considered normally distributed except for being more or less skewed (Limpert, Stahel, and Abbt 2001). Appropriately estimating the parameters of the lognormal distribution is vital for the study of these and other subjects. Depending on the values of its parameters, the lognormal distribution takes on various shapes, including a bell curve similar to the normal distribution. This paper contains a simulation study of the effectiveness of various estimators for the parameters of the lognormal distribution. A comparison is made between parameter estimators such as Maximum Likelihood estimators, Method of Moments estimators, estimators by Serfling (2002), and estimators by Finney (1941). A simulation is conducted to determine which parameter estimators work better for various parameter combinations and sample sizes of the lognormal distribution. We find that the Maximum Likelihood and Finney estimators perform the best overall, with a preference given to Maximum Likelihood over the Finney estimators because it is far simpler. The Method of Moments estimators seem to perform best when σ is less than or equal to one, and the Serfling estimators are quite accurate in estimating μ, but not σ, in all regions studied. Finally, these parameter estimators are applied to a data set counting the number of words in each sentence of various documents, following which a review of each estimator's performance is conducted. Again, we find that the Maximum Likelihood estimators perform best for the given application, but that Serfling's estimators are preferred when outliers are present.
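For reference, minimal sketches of two of the estimators compared in the study, maximum likelihood and method of moments, using the standard lognormal identities:

```python
import numpy as np

def lognormal_mle(x):
    """MLE: mu and sigma are the mean and sd of log(x)."""
    lx = np.log(x)
    return lx.mean(), lx.std(ddof=0)

def lognormal_mom(x):
    """Method of moments: invert E[X] = exp(mu + s^2/2) and
    Var[X] = (exp(s^2) - 1) * exp(2*mu + s^2)."""
    m, v = x.mean(), x.var(ddof=1)
    s2 = np.log(1 + v / m ** 2)
    return np.log(m) - s2 / 2, np.sqrt(s2)
```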
|