11

UNEMPLOYMENT INSURANCE IN LABOR SEARCH MODEL AND MONEY DEMAND

Tano, Gerard Ghislain 01 May 2012 (has links) (PDF)
Countries with an unemployment insurance (UI) program can conduct labor market policy effectively and observe the flow between unemployment and employment. But should UI simply be handed to anyone without a job? Is the individual response to the program, in terms of the decision to work or to enjoy more leisure, the same across individuals with different leisure preferences? In a labor search model with heterogeneous workers, we show that introducing a UI program widens the wage gap between individuals when the program raises firm productivity. In an empirical investigation of the impact of unemployment benefits on unemployment duration using a job search model, we specify a distribution of unemployment duration, estimate it by maximum likelihood, and find in the National Longitudinal Survey of Youth (NLSY97) three types of individuals, whose leisure type shapes their response to the program: an increase in UI for the highest leisure type leads to a longer duration of unemployment, whereas individuals with the lowest value of leisure do not extend their unemployment spells in response to a UI increase. The response of type 2 individuals is ambiguous: they may experience either a prolonged or a shortened spell without work. A selective increase in unemployment insurance for those with a relatively low value of leisure may therefore decrease the equilibrium rate of unemployment. The second part of the dissertation models money demand and shocks in Cote D'Ivoire over the period 1960-2009. Unlike Drama and Yao (2010), our results suggest that M1 is not in a long-run equilibrium relationship with its determinants, real income and expected inflation, and is therefore unstable.
However, the broad aggregate M2 is cointegrated with its long-run determinants and is therefore the most appropriate definition of money for the Cote D'Ivoire economy. Consequently, M2 can be used as an alternative to the interest rate as a long-run monetary policy instrument.
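The duration estimation described in the abstract can be illustrated with a minimal sketch (this is not the dissertation's actual model; the exponential spell distribution, censoring point, and variable names are illustrative assumptions). With exponentially distributed spells and right censoring, the maximum likelihood estimate of the exit rate is the number of completed spells divided by total time at risk:

```python
import random

def exp_mle(durations, completed):
    """MLE of the exit rate for exponentially distributed unemployment
    spells with right censoring: (# completed spells) / (time at risk)."""
    return sum(completed) / sum(durations)

random.seed(0)
true_rate = 0.5
spells = [random.expovariate(true_rate) for _ in range(10000)]
# right-censor every spell still in progress at 5 periods
obs = [min(t, 5.0) for t in spells]
done = [1 if t <= 5.0 else 0 for t in spells]
rate_hat = exp_mle(obs, done)
print(round(rate_hat, 2))
```

With heterogeneous leisure types, the same likelihood would be extended to a finite mixture over types, in the spirit of the NLSY97 analysis.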
12

Likelihood-based testing and model selection for hazard functions with unknown change-points

Williams, Matthew Richard 03 May 2011 (has links)
The focus of this work is the development of testing procedures for the existence of change-points in parametric hazard models of various types. Hazard functions and the related survival functions are common units of analysis in survival and reliability modeling. We develop a methodology to test the alternative of a two-piece hazard against a simpler one-piece hazard. The location of the change is unknown, and the tests are irregular because the change-point is present only under the alternative hypothesis. Our approach is to consider the profile log-likelihood ratio test statistic as a process in the unknown change-point; we derive its limiting process and find the supremum distribution of that limiting process to obtain critical values for the test statistic. We first reexamine existing work based on Taylor series expansions for abrupt changes in exponential data, and generalize these results to include Weibull data with known shape parameter. We then develop new tests for two-piece continuous hazard functions using local asymptotic normality (LAN). Finally, we generalize our earlier results for abrupt changes to include covariate information using the LAN techniques. While we focus on the cases of no censoring, simple right censoring, and censoring generated by staggered entry, our derivations reveal that the framework should apply to much broader censoring scenarios. / Ph. D.
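As a rough illustration of the testing idea (a sketch under simplifying assumptions: uncensored exponential data, a two-piece constant hazard, and a crude grid in place of the thesis's limiting-process critical values), the profile log-likelihood ratio can be evaluated at each candidate change-point:

```python
import math
import random

def piecewise_loglik(times, tau):
    """Profile log-likelihood of a two-piece constant hazard with
    change-point tau (uncensored data): lambda_j = d_j / R_j."""
    d1 = sum(1 for t in times if t <= tau)
    d2 = len(times) - d1
    r1 = sum(min(t, tau) for t in times)        # time at risk below tau
    r2 = sum(max(t - tau, 0.0) for t in times)  # time at risk above tau
    if d1 == 0 or d2 == 0:
        return float('-inf')
    return d1 * math.log(d1 / r1) - d1 + d2 * math.log(d2 / r2) - d2

def one_piece_loglik(times):
    n, r = len(times), sum(times)
    return n * math.log(n / r) - n

random.seed(1)
# simulate hazard 1.0 before t = 1, hazard 3.0 after
times = []
for _ in range(2000):
    t = random.expovariate(1.0)
    if t > 1.0:
        t = 1.0 + random.expovariate(3.0)
    times.append(t)

grid = [0.2 * k for k in range(1, 15)]
lrs = [2 * (piecewise_loglik(times, tau) - one_piece_loglik(times))
       for tau in grid]
best_tau = grid[lrs.index(max(lrs))]
print(best_tau)
```

The supremum of this likelihood-ratio process over the grid is the statistic whose null distribution the thesis characterizes; here the argmax simply recovers the simulated change-point.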
13

Regras de preços e efeitos de política monetária e fiscal / Price rules and effects of monetary and fiscal policy

Almeida, Iana Ferrão de 26 April 2011 (has links)
This thesis consists of three essays, two analyzing pricing rules and one analyzing fiscal policy; each essay forms a chapter. In the first chapter, we add heterogeneity to a model of endogenous time-dependent pricing rules in order to analyze the real effects of a disinflation policy under imperfect credibility. We first evaluate the costs of disinflation in an economy where credibility is exogenous; we then relax this assumption by allowing agents to update their beliefs about the type of policymaker they face. In both cases, heterogeneity amplifies the real effects of the disinflation policy. We then show that the calibrated model replicates the dynamics of output and inflation during the Volcker disinflation well, and better than the model with homogeneous agents. The second chapter introduces a general specification for the hazard function faced by price setters. 
Different specifications of the hazard function can lead to very different aggregate dynamics, even when price durations are the same across specifications; this result holds for homogeneous as well as heterogeneous economies. The third chapter analyzes the effects of government spending shocks on the dynamics of private consumption in a small-open-economy New Keynesian DSGE (Dynamic Stochastic General Equilibrium) model. We incorporate non-Ricardian consumers into the model and show that the presence of this type of consumer not only fails to prevent the fall in private consumption but intensifies it after a short period. We also analyze the sensitivity of consumption dynamics to different degrees of openness of the economy and to preference and policy parameters.
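The point that the shape of the price-setters' hazard function matters, even holding mean price duration fixed, can be sketched as follows (parameter values are illustrative, not from the thesis): a constant hazard (Calvo pricing) and fixed four-period contracts (Taylor pricing) both imply a mean duration of four periods, but very different dispersion of durations.

```python
import random

def draw_duration(hazard, max_t=1000):
    """Draw a price duration given a per-period hazard h(j): the
    probability that a price set j periods ago is reset this period."""
    for j in range(1, max_t + 1):
        if random.random() < hazard(j):
            return j
    return max_t

random.seed(2)
calvo = lambda j: 0.25                       # constant hazard
taylor = lambda j: 1.0 if j >= 4 else 0.0    # fixed 4-period contracts

n = 20000
d_calvo = [draw_duration(calvo) for _ in range(n)]
d_taylor = [draw_duration(taylor) for _ in range(n)]
mean_c = sum(d_calvo) / n
mean_t = sum(d_taylor) / n
var_c = sum((d - mean_c) ** 2 for d in d_calvo) / n
print(round(mean_c, 1), round(mean_t, 1), var_c > 1.0)
```

Both hazards deliver the same average price duration, yet the cross-sectional distribution of durations (and hence the aggregate price-setting dynamics) differs sharply.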
14

Neparametrické odhady rozdělení doby přežití / Nonparametric estimations in survival analysis

Svoboda, Martin January 2009 (has links)
This work introduces nonparametric models used in time-to-event data analysis, focusing on their application in medicine, where the field is known as survival analysis. The basic techniques, and the problems that can arise in survival analysis, are presented and explained. The main part discusses the Kaplan-Meier estimator of the survival function: the most frequently used method for estimating the survival function of patients who have undergone a specific treatment, and a standard tool in statistical packages. In addition to estimation of the survival function, estimation of the hazard function and the cumulative hazard function is presented; the hazard function gives the intensity with which an individual experiences the event of interest in a short time period. Time-to-event data raise special problems, the most distinctive of which is censoring: the situation in which an individual has not experienced the event of interest by the end of the study. The thesis also includes an empirical part presenting the results of an analysis of patients diagnosed with larynx carcinoma, treated in a hospital in České Budějovice, based on the theory presented in the preceding chapters.
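A minimal implementation of the Kaplan-Meier estimator discussed above (toy data; the 0/1 event flag is the usual censoring convention, not code from the thesis):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the survival function.
    times: observed times; events: 1 = event occurred, 0 = censored."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, out = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = c = 0  # events / censorings at time t
        while i < len(data) and data[i][0] == t:
            if data[i][1]:
                d += 1
            else:
                c += 1
            i += 1
        if d:
            surv *= 1 - d / n_at_risk
            out.append((t, surv))
        n_at_risk -= d + c
    return out

# toy data: 6 patients, two of them censored (event flag 0)
est = kaplan_meier([1, 2, 2, 3, 4, 5], [1, 1, 0, 1, 0, 1])
for t, s in est:
    print(t, round(s, 3))
```

Note that the estimated survival curve steps down only at event times; a censored observation merely shrinks the risk set for subsequent steps.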
15

Modelagem conjunta de dados longitudinais e de sobrevivência para avaliação de desfechos clínicos do parto / Joint modeling of longitudinal and survival data to evaluate clinical outcomes of labor.

Maiorano, Alexandre Cristovao 06 December 2018 (has links)
As most pregnancy-related deaths and morbidities cluster around the time of childbirth, the quality of care during this period is crucial for mothers and their babies. To monitor women at this stage, the partograph has been the central tool in recent decades and, owing to its simplicity, is frequently used in low- and middle-income countries. However, its use is widely questioned for lack of evidence that it contributes to better labor outcomes. To improve the quality of care in these circumstances, the BOLD project has been developed with the aim of reducing the occurrence of adverse outcomes and of producing a modern tool, called SELMA, designed as an alternative to the partograph. Aiming to relate fixed and dynamic characteristics evaluated during labor and to identify which intrapartum elements can serve as triggers for an intervention, thereby preventing an adverse outcome, this thesis proposes the use of survival models with time-dependent covariates. Initially, we consider the joint modeling of longitudinal and survival data using flexible parametric hazard functions. In this setting, we propose the use of five generalizations of the Weibull distribution and of the Nakagami distribution, together with a general procedure, new to joint modeling, for selecting among the usual parametric models via the generalized gamma distribution, and we perform an extensive simulation study to evaluate the maximum likelihood estimates and the proposed discrimination criteria. 
Moreover, the nature of childbirth itself leads to a setting with multiple events, and hence to multi-state models: models for a stochastic process that at any time occupies one of a discrete set of states. These are the most common models for describing the development of longitudinal failure-time data and are often used in medical applications. In this context, we propose the inclusion of a time-dependent covariate in the multi-state model through a modification of the input data, which gave satisfactory results, similar to those expected in clinical practice.
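The counting-process data layout that underlies survival models with time-dependent covariates can be sketched generically (this is a standard construction, not the BOLD/SELMA code; the variable names and the covariate values below are invented for illustration):

```python
def split_at_changes(entry, exit_, event, changes):
    """Split one subject's follow-up into (start, stop, value, event)
    rows at the times a time-dependent covariate changes.
    changes: sorted list of (time, new_value); the covariate holds its
    last value until the next change."""
    rows, start, value = [], entry, None
    for t, v in changes:
        if t <= entry:
            value = v          # value already in force at entry
            continue
        if t >= exit_:
            break
        rows.append((start, t, value, 0))  # no event in interior rows
        start, value = t, v
    rows.append((start, exit_, value, event))
    return rows

# hypothetical subject followed on (0, 10], event (delivery) at t = 10,
# with a labor covariate recorded at t = 0, 4 and 7
rows = split_at_changes(0, 10, 1, [(0, 3), (4, 6), (7, 9)])
print(rows)
```

Each row carries the covariate value in force over its interval, and only the final row carries the event indicator, which is exactly the format that counting-process survival and multi-state fitters consume.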
16

Bootstrap bandwidth selection in kernel hazard rate estimation / S. Jansen van Vuuren

Van Vuuren, Stefan Jansen January 2011 (has links)
The purpose of this study is to thoroughly discuss kernel hazard function estimation, both in the complete-sample case and in the presence of random right censoring. Most of the focus is on the very important task of automatic bandwidth selection. Two existing selectors, least-squares cross-validation as described by Patil (1993a) and Patil (1993b), and the bootstrap bandwidth selector of Gonzalez-Manteiga, Cao and Marron (1996), will be discussed. The bandwidth selector of Hall and Robinson (2009), which uses bootstrap aggregation (or 'bagging'), will be extended to and evaluated in the setting of kernel hazard rate estimation. We will also make a simple proposal for a bootstrap bandwidth selector. The performance of these bandwidth selectors will be compared empirically in a simulation study. The findings and conclusions of this study are reported. / Thesis (M.Sc. (Statistics))--North-West University, Potchefstroom Campus, 2011.
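The two ingredients discussed above can be sketched in a few lines (a simplified illustration, not the Gonzalez-Manteiga, Cao and Marron procedure itself: the kernel estimator smooths Nelson-Aalen increments, and the toy bootstrap selector picks the bandwidth whose resampled estimates stay closest to a pilot estimate on a grid; all tuning values are illustrative):

```python
import math
import random

def kernel_hazard(sample, t, b):
    """Kernel hazard estimate for uncensored data: Gaussian smoothing
    of the Nelson-Aalen increments 1/(n - i) at the order statistics."""
    srt = sorted(sample)
    n = len(srt)
    total = 0.0
    for i, ti in enumerate(srt):
        k = math.exp(-0.5 * ((t - ti) / b) ** 2) / (b * math.sqrt(2 * math.pi))
        total += k / (n - i)
    return total

def bootstrap_bandwidth(sample, grid, pilot_b, n_boot=50, ts=None):
    """Pick the bandwidth minimising a bootstrap estimate of the
    integrated squared error against a pilot estimate."""
    ts = ts or [0.5 + 0.25 * j for j in range(8)]
    pilot = [kernel_hazard(sample, t, pilot_b) for t in ts]
    best_b, best_err = None, float('inf')
    for b in grid:
        err = 0.0
        for _ in range(n_boot):
            boot = [random.choice(sample) for _ in sample]
            err += sum((kernel_hazard(boot, t, b) - p) ** 2
                       for t, p in zip(ts, pilot))
        if err < best_err:
            best_b, best_err = b, err
    return best_b

random.seed(3)
sample = [random.expovariate(1.0) for _ in range(200)]
b_star = bootstrap_bandwidth(sample, [0.1, 0.3, 0.6, 1.0], pilot_b=0.5)
print(b_star)
```

A serious selector would target the mean integrated squared error with a proper smoothed bootstrap and handle censoring, which is exactly where the selectors compared in the dissertation differ.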
18

Temporal gap detection in electric hearing : modelling and experiments

Geldenhuys, Tiaan Andries 23 February 2012 (has links)
To advance the understanding of electric hearing, from both a theoretical and a practical perspective, the present study employs an engineering approach to examine whether a fundamental stochastic link exists between neural stimulation and perception. Using custom-developed psychophysics software, temporal gap-detection experiments were carried out and compared with simulation results from a theoretical model. The results are informative, and the suggested modeling principles may be a step toward a clearer understanding of how the hearing system perceives temporal stimuli. To enable psycho-electric experiments involving cochlear implants, a software framework called the Psychoacoustics Toolbox was developed for Matlab version 6.5; it can present stimuli either acoustically or, for interfacing with cochlear implants, through Cochlear Ltd. hardware. The toolbox facilitates easy setup of experiments based on extensible markup language (XML) templates, and allows for both adaptive-staircase procedures and presentation of a fixed set of stimuli to a participant. Multi-track interleaving of stimuli is also supported, as put forward by Jesteadt (1980), to allow capturing of subjective responses (such as loudness perception). As part of this research, experiments were performed with three subjects and a total of four cochlear implants. For the temporal gap-detection experiments, the rate of electrical stimulation varied over a range from 100 to 2700 pulses per second; both periodic stimulus sequences and stimuli reflecting a dead-time-modified Poisson process were used. Three spatially distinct stimulation sites were also used with each implant to allow comparison among basal, central and apical cochlear responses. A biologically plausible psychophysical model (in contrast with a phenomenological one) was developed for predicting temporal gap-detection thresholds in electric hearing.
The model was applied to both periodic and Poisson stimuli, but can easily be used with other kinds of stimuli. For comparison with experimental results, model predictions were made over the same range of stimulus rates. As a starting point, the model takes the neural stimuli, runs them through a neural filter, and then draws statistical interspike-interval (ISI) distribution data from the generated spikes. From the ISI statistics, psychometric curves can be calculated using the principles of Green and Swets (1966), from which predictions can be made for threshold measurements based on the percentage-correct mark for the specific experimental setup. With the model in place, simulations were executed to compare the model results with experimental measurements. In addition to the simulations, mathematical equations for the periodic types of stimuli were derived, since numerical calculations could then be made with higher computational efficiency for this kind of stimulus. These equations allowed an investigation into the implications of varying the values of different neuron-model parameters. Clear similarities were found between the shapes of gap-threshold curves for experimental and modeled data, and qualitative links were identified between model parameters and features recognized in threshold curves. For periodic stimuli, quantitative predictions of gap thresholds are close to experimental ones, although measured values cover a larger range. The results of experimental measurements using Poisson stimuli are generally somewhat larger than model predictions, although the shapes of the curves show resemblance; a possible explanation is that participants may find decision tasks involving Poisson stimuli, as opposed to periodic stimuli, confusing. Overall, model predictions and experimental results show close correspondence, suggesting that the principles underlying the model are fundamentally correct. / Dissertation (MEng)--University of Pretoria, 2012. / Electrical, Electronic and Computer Engineering / Unrestricted
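The dead-time-modified Poisson stimuli mentioned above can be generated by thinning an ordinary Poisson process (a generic sketch; the rate, dead time and duration are illustrative, not the study's settings). By memorylessness, the mean interspike interval is the dead time plus the reciprocal of the underlying rate:

```python
import random

def poisson_spikes_with_dead_time(rate, dead_time, duration, rng):
    """Simulate spike times from a Poisson process of the given rate,
    discarding any event closer than dead_time to the last accepted
    spike (a dead-time-modified Poisson process)."""
    t, spikes = 0.0, []
    while True:
        t += rng.expovariate(rate)
        if spikes and t - spikes[-1] < dead_time:
            continue   # event falls in the dead time: discarded
        if t > duration:
            break
        spikes.append(t)
    return spikes

rng = random.Random(4)
spikes = poisson_spikes_with_dead_time(rate=200.0, dead_time=0.002,
                                       duration=50.0, rng=rng)
isis = [b - a for a, b in zip(spikes, spikes[1:])]
mean_isi = sum(isis) / len(isis)
print(round(mean_isi * 1000, 2), 'ms')
```

ISI statistics drawn from such trains are exactly the raw material from which the model's psychometric curves would be computed.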
19

Inference for Birnbaum-Saunders, Laplace and Some Related Distributions under Censored Data

Zhu, Xiaojun 06 May 2015 (has links)
The Birnbaum-Saunders (BS) distribution is a positively skewed distribution and a popular model for analyzing lifetime data. In this thesis, we first develop an improved method of estimation for the BS distribution and the corresponding inference. Compared to the maximum likelihood estimators (MLEs) and the modified moment estimators (MMEs), the proposed method yields estimators with smaller bias but the same mean squared errors (MSEs). Next, the existence and uniqueness of the MLEs of the parameters of the BS distribution are discussed for Type-I, Type-II and hybrid censored samples. For the five-parameter bivariate Birnbaum-Saunders (BVBS) distribution, we use the distributional relationship between the bivariate normal and BVBS distributions to propose a simple and efficient method of estimation based on Type-II censored samples. Since regression analysis is commonly used in the analysis of life-test data when covariates are involved, we also consider the regression problem based on the BS and BVBS distributions and develop the associated inferential methods. One may generalize the BS distribution by using a Laplace kernel in place of the normal kernel; the result, referred to as the Laplace BS (LBS) distribution, is one of the generalized Birnbaum-Saunders (GBS) distributions. Since the LBS distribution is closely related to the Laplace distribution, a detailed study of inference for the Laplace distribution is needed before studying the LBS distribution. Several inferential results have been developed in the literature for the Laplace distribution based on complete samples; however, research on Type-II censored samples is scarce, and there is in fact no work on Type-I censoring. For this reason, we first derive the MLEs of the location and scale parameters of the Laplace distribution based on Type-II and Type-I censored samples.
In the case of Type-II censoring, we derive the exact joint and marginal moment generating functions (MGF) of the MLEs. Then, using these expressions, we derive the exact conditional marginal and joint density functions of the MLEs and utilize them to develop exact confidence intervals (CIs) for some life parameters of interest. In the case of Type-I censoring, we first derive explicit expressions for the MLEs of the parameters, and then derive the exact conditional joint and marginal MGFs and use them to derive the exact conditional marginal and joint density functions of the MLEs. These densities are used in turn to develop marginal and joint CIs for some quantities of interest. Finally, we consider the LBS distribution and formally show the different kinds of shapes of the probability density function (PDF) and the hazard function. We then derive the MLEs of the parameters and prove that they always exist and are unique. Next, we propose the MMEs, which can be used as initial values in the numerical computation of the MLEs. We also discuss the interval estimation of parameters. / Thesis / Doctor of Science (PhD)
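For the complete-sample case that precedes the censored analysis, the Laplace MLEs have well-known closed forms, which a short sketch can check by simulation (parameter values are illustrative): the location estimate is the sample median and the scale estimate is the mean absolute deviation about it.

```python
import random
import statistics

def laplace_mle(sample):
    """Closed-form complete-sample MLEs for the Laplace distribution:
    location = sample median, scale = mean |x - median|."""
    mu = statistics.median(sample)
    b = sum(abs(x - mu) for x in sample) / len(sample)
    return mu, b

def rlaplace(mu, b, rng):
    # the difference of two iid Exponential(mean b) draws is Laplace(0, b)
    return mu + rng.expovariate(1 / b) - rng.expovariate(1 / b)

rng = random.Random(5)
sample = [rlaplace(10.0, 2.0, rng) for _ in range(20000)]
mu_hat, b_hat = laplace_mle(sample)
print(round(mu_hat, 1), round(b_hat, 1))
```

Under Type-I or Type-II censoring these closed forms no longer apply directly, which is precisely where the thesis's conditional MGF derivations come in.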
20

工資、工作配合與工作轉換之期間分析的實証研究 / An empirical duration analysis of wages, job matching and job turnover

林建志, Lin, Steve Unknown Date (has links)
This thesis gives a full account of the state processes of the labor market, namely the worker's search process and the job-matching process, covering both their theoretical background and the empirical results. Because information between workers and firms in the labor market is incomplete, the two sides often fail to match at once: workers may quit for other jobs and firms may hire replacements, so wage changes and job turnover are frequently observed phenomena. The study of job turnover here is based on job-matching theory, which in turn rests on the job-search theory of Lippman and McCall (1976); the theoretical discussion therefore treats search theory first and matching theory second. The empirical analysis draws on the September 1996 data of Professors Kao Hsi-chun (高希均) and Lin Chu-chia (林祖嘉), collected between April 1995 and June 1996 on those who graduated from junior colleges and universities in June 1992, and examines the determination and dynamics of wages, job-match duration and quit rates for all graduates and for female and male graduates separately. The data fall into four basic categories: personal background, job-match information, human capital, and job characteristics. Wage equations are estimated by ordinary least squares, under the assumption, common in the empirical literature, that market wages are normally distributed. For match duration we apply the model of Lin (1991) to the determination of quit rates and job-match duration, using four distributional specifications: Weibull, exponential, lognormal and logistic. For quits we use the proportional hazard model of Cox (1972), first applied to job turnover by Lynch (1991). Finally, we provide a further dynamic analysis of wages, match duration and quit rates.
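Cox's (1972) proportional hazard model used for the quit-rate analysis can be sketched through its partial likelihood (toy data with no tied event times; a crude grid search stands in for Newton-type maximization, and all values are invented for illustration):

```python
import math

def cox_partial_loglik(beta, times, events, x):
    """Breslow partial log-likelihood for a single covariate,
    assuming no tied event times."""
    ll = 0.0
    for i, (ti, di) in enumerate(zip(times, events)):
        if not di:
            continue  # censored observations contribute via risk sets only
        risk = [math.exp(beta * x[j]) for j in range(len(times))
                if times[j] >= ti]
        ll += beta * x[i] - math.log(sum(risk))
    return ll

# toy job-spell data: x = 1 marks workers with faster turnover
times  = [2, 3, 5, 7, 8, 11, 12, 15]
events = [1, 1, 1, 1, 1, 1, 0, 1]
x      = [1, 0, 1, 1, 0, 1, 0, 0]
grid = [round(-1.0 + 0.1 * k, 1) for k in range(31)]
beta_hat = max(grid, key=lambda b: cox_partial_loglik(b, times, events, x))
print(beta_hat)
```

A positive estimated coefficient means the covariate raises the quit hazard proportionally at every tenure length, which is the sense in which the model is semiparametric: the baseline hazard never has to be specified.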
