691 |
Homebuyers' search behaviors: an analysis of search duration and channels, individual price perception and expectation, and prior renting and buying experience / 周美伶. Unknown Date (has links)
Housing is highly heterogeneous, expensive, and durable; it is bought infrequently and serves both consumption and investment purposes. These characteristics mean that findings from general marketing research may not transfer to homebuyers' search behavior. Previous studies of housing-market search, moreover, have concentrated on price search and neglected the use of other channels, giving only a partial view of how homebuyers search. Starting from these housing characteristics, this study derives hypotheses from search theory and the related literature and tests them with survival analysis and multinomial logit models, re-examining homebuyers' search behavior.
First, survival analysis is used to examine how these characteristics affect search duration. The purpose of purchase significantly affects search duration; buyers attend to both product and price while searching, and greater emphasis on either lengthens the search, whereas time pressure has no significant effect. Buyers who search through brokers consider more candidate dwellings, and because the service fee does not depend on how many dwellings they are shown, their search lasts longer than that of self-directed searchers. In practice, this implies that in a sluggish market sellers should provide buyers with more external product information, especially housing information directly relevant to the household, and avoid price wars, while brokers should rethink their current service model if they are to help buyers close deals quickly.
Second, housing's dual consumption-investment nature and its character as an experience good mean that differences between an individual's perception of current prices and expectation of future prices affect search duration, yielding boundedly rational decisions. Survival analysis of these differences shows that an anchoring effect does shape housing search: once perceived current prices are accounted for, owner-occupiers who expect prices to rise hold a higher reference point and search for a shorter time, while those who expect prices to fall search longer. That is, the more optimistic a buyer is about current prices while pessimistic about prices one year ahead, the larger the downward revision of the reference point. The main contribution is to establish that housing search is a boundedly rational decision, that views on prices combine current perception with future expectation, and that future price expectations reinforce the anchoring effect.
Third, houses are repurchased infrequently. Related research confirms that experience affects search duration but overlooks the possible influence of renting experience, even though renting search partly resembles buying search; few studies discuss it, let alone analyze the two kinds of experience separately. Using survival analysis, this study examines how buying experience and renting experience affect search for pre-sale and existing houses. For existing houses, relative to searchers with no experience, renting experience is positively related to search duration and buying experience negatively related, while brokerage services raise the effective experience of moderately experienced buyers and shorten their search. For pre-sale houses, only buying experience has a significant (negative) effect; renting experience does not appear to carry over to pre-sale purchases. The main contribution is to show that prior renting experience does carry over to the current housing search, that buying and renting experience work in opposite directions, and that their effects differ between pre-sale and existing houses, so the two should be analyzed, and conclusions applied, separately.
Beyond search duration, the study also examines how different channels, personal and commercial, affect search duration. Buyers mostly choose commercial or mixed sources, and those who also use personal sources search for a shorter time than those relying on commercial sources. Channels are thus not mutually exclusive: buyers use several channels to reduce decision uncertainty, which also implies some distrust of commercial sources. Sellers should therefore maintain their existing marketing channels while strengthening word-of-mouth marketing to improve communication efficiency. / Housing is a durable good, heterogeneous, expensive, and illiquid, and it is both an investment and a consumption product. These characteristics mean that results from general marketing research cannot be applied directly to housing search behavior. This dissertation draws on search theory, behavioral theory, survival analysis, and the multinomial logit model in four related essays.
The first essay examines how houses differ from ordinary consumer goods. Three hypotheses were tested with questionnaires administered to household heads who bought a house in Taichung between 1998 and 2002, and all three were confirmed. First, because a house matters to every family member, the search duration for owner-occupied houses is longer than for investment purchases. Second, product differences as well as price differences affect search duration, while the influences of time pressure and purchase experience are not significant. Last but not least, because searchers who go through brokers see more listings at no additional cost per extra house visited, they search longer than those who search on their own. The practical implication is that sellers should provide more product information to buyers and make every effort to avoid price wars.
The second essay asks how house searchers weigh search costs and benefits, in utility or price terms, if they are rational decision-makers. We develop a hypothesis and a search model with an indefinite horizon and sampling without recall, integrating work on job search, prospect theory, and search behavior. The data come from the Taiwan Housing Demand Survey and cover home-buyers and home-searchers from 2003Q1 to 2003Q4. The results show that buyers are boundedly rational and that the stopping ratio is time-dependent. Buyers tend to search for the minimum price during economic fluctuations. Consumption-motivated buyers with a higher perception of current housing prices hold higher reference points, which shortens their search, and their price expectations strengthen the anchoring effect. Additional search benefits buyers mainly through higher utility rather than price discounts. It therefore seems reasonable to include individual price perception and expectation when analyzing home-buyers' search behavior.
The third essay focuses on the effects of homebuyers' buying and renting experience on their search behavior. The data also come from the Taiwan Housing Demand Survey, covering home-buyers and home-searchers from 2003Q1 to 2004Q1. Both hypotheses are supported: the effect of prior buying experience on search duration differs from that of prior renting experience. Whether buying a pre-sale or an existing house, buyers with renting experience search longer than buyers with buying experience or no experience. In addition, for pre-sale houses only buying experience has a significant relation with search duration.
The final essay has two goals: to explore how homebuyers choose information sources, and to examine the relationship between information sources and search duration. The data come from the Taiwan Housing Demand Survey, using questionnaires from homebuyers sampled in 2005Q3, and survival models are again used to test the hypotheses. The results show that homebuyers incline toward commercial and mixed sources; yet despite the limits of personal sources, using them shortens buyers' search duration. To improve transaction efficiency, firms should therefore make full use of their existing marketing channels while also building good word-of-mouth.
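The abstract names two workhorse tools: a survival model for search duration and a multinomial logit for channel choice. A minimal sketch of that pairing follows, with an entirely hypothetical file and hypothetical column names, since the survey's actual variables are not given here:

```python
# Hedged sketch: survival model for search duration + multinomial logit for channel
# choice, the two methods named in the abstract. All names below are hypothetical.
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter

df = pd.read_csv("housing_search.csv")  # hypothetical survey extract, one row per buyer

# Search duration in weeks; 'purchased' = 1 if the search ended in a purchase (else censored).
cph = CoxPHFitter()
cph.fit(df[["search_weeks", "purchased", "used_broker", "price_importance", "owner_occupier"]],
        duration_col="search_weeks", event_col="purchased")
cph.print_summary()  # e.g., does going through a broker lengthen the expected search?

# Channel choice: 0 = personal, 1 = commercial, 2 = mixed sources.
X = sm.add_constant(df[["age", "income", "prior_buying_experience"]])
channel = sm.MNLogit(df["channel_choice"], X).fit()
print(channel.summary())
```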
|
692 |
The demography of Quebec centenarians: validation of ages at death, measurement of mortality, and the familial component of longevity / Beaudry-Godin, Mélissa. 06 1900 (has links)
The recent explosion in the number of centenarians in low-mortality countries is closely tied to the proliferation of studies on longevity, and more specifically on its determinants and repercussions. While some researchers try to identify genes that may be responsible for extreme longevity, others examine the social, economic, and political impact of population aging and rising life expectancy, or ask whether there is a biological limit to human life. In this thesis, we analyze the demographic situation of Quebec centenarians since the beginning of the 20th century from aggregate data (census data, vital statistics, population estimates). We then assess the quality of Quebec data at advanced ages from a nominative list of deaths of centenarians in the 1870-1894 birth cohorts, with particular attention to mortality trajectories beyond age 100. Finally, we analyze the survival of the siblings and parents of a sample of semi-supercentenarians (aged 105 and over) born between 1890 and 1900 in order to assess the familial component of longevity.
The thesis consists of three articles. In the first, we trace the evolution of the number of centenarians in Quebec since the 1920s. Using demographic indicators such as the centenarian ratio, survival probabilities, and the mean maximal age at death, we highlight the remarkable progress achieved in survival at advanced ages. We also decompose the factors responsible for the growth in the number of centenarians in Quebec; among them, the increase in the probability of surviving from age 80 to 100 stands out as the main determinant.
The second article deals with the validation of the ages at death of centenarians in the 1870-1894 cohorts of French-Canadian origin and Catholic confession, born and deceased in Quebec. At the end of this validation process, we can state that Quebec data at advanced ages are of excellent quality, so the mortality trajectories of centenarians based on the raw data are representative of reality. The evolution of the probabilities of death from age 100 onward shows the deceleration of mortality: for men as for women, the death probabilities plateau at around 45%.
Finally, in the third article, we turn to the familial component of longevity. We compare the survival of the siblings and parents of semi-supercentenarians deceased between 1995 and 2004 with that of their respective birth cohorts. The survival differences between the siblings and parents of the semi-supercentenarians under observation and their "control" generations are statistically significant at the 0.01% level. Moreover, the siblings, fathers, and mothers of semi-supercentenarians are between 1.7 (sisters) and 3 times (mothers) more likely to reach age 90 than the members of their corresponding birth cohorts. These analyses leave no doubt that longevity clusters within certain families. / The recent rise in the number of centenarians in low-mortality countries has led to many studies of longevity, and more specifically of its determinants and repercussions. Some researchers are trying to identify genes that could be responsible for extreme longevity; others are studying the social, economic, and political impact of rising life expectancy and population aging, or questioning the existence of a biological limit to the human life span. In this thesis, we first study the demographic situation of Quebec centenarians using aggregate data (census data, vital statistics, and population estimates). We then evaluate the quality of Quebec data at the oldest ages using the death records of centenarians belonging to the 1870-1894 birth cohorts, with particular interest in mortality trajectories beyond age 100. Finally, we analyze the survival of the siblings and parents of a sample of semi-supercentenarians (105 years and over) in order to assess the familial component of longevity.
The thesis is divided into three articles. In the first article, we study the evolution of the centenarian population in Quebec since the 1920s. With demographic indicators such as the centenarian ratio, survival probabilities, and the maximal age at death, we document the remarkable progress realized in old-age mortality. We also analyze the determinants of the increase in the number of centenarians in Quebec; among the factors identified, the improvement in late-life mortality is the main determinant of this increase.
The second article deals with the validation of the ages at death of French-Canadian centenarians born in Quebec between 1870 and 1894. The validation results confirm that Quebec data at the highest ages at death are of very good quality, so the measure of centenarian mortality based on all death records is representative of the true trends. The evolution of age-specific life-table death rates beyond age 100 confirms mortality deceleration at the highest ages: among both men and women, the death rates plateau at around 45%.
Finally, in the third article, we study the familial predisposition for longevity. We compare the survival probabilities of siblings and parents of semi-supercentenarians deceased between 1995 and 2004 with those of their birth-cohort-matched counterparts. The survival differences between the siblings and parents of semi-supercentenarians and their respective birth cohorts are statistically significant at the 0.01% level. The siblings and parents have a 1.7 to 3 times greater probability of surviving from age 50 to 90 than members of their respective birth cohorts. These findings support the existence of a substantial familial component to longevity.
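The third article's comparison, relatives of semi-supercentenarians against their birth cohorts, maps directly onto standard survival tooling. A minimal sketch under assumed data (the file and column names are hypothetical):

```python
# Hedged sketch: Kaplan-Meier survival of semi-supercentenarians' relatives versus a
# matched birth cohort, with a log-rank test of the difference. Data are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("relatives_vs_cohort.csv")  # columns: age_at_death, died, group
rel = df[df["group"] == "relative"]
coh = df[df["group"] == "cohort"]

kmf = KaplanMeierFitter()
kmf.fit(rel["age_at_death"], event_observed=rel["died"], label="relatives")
print(kmf.predict(90))  # estimated probability of surviving past age 90

result = logrank_test(rel["age_at_death"], coh["age_at_death"],
                      event_observed_A=rel["died"], event_observed_B=coh["died"])
print(result.p_value)  # the article reports differences significant at the 0.01% level
```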
|
693 |
A study of transaction behavior in the real estate brokerage market / 李春長. Unknown Date (has links)
Taiwan's housing brokerage market has grown rapidly in recent years, and many sellers and buyers, weighing transaction costs, trade houses through brokerage firms, mainly to shorten the time to sale or to raise the probability of a deal. This study centers on how the seller's pricing (the reservation price), the motive for selling, and the attributes of the house explain marketing duration and the probability of sale, and it interprets transaction behavior in the real estate brokerage market through search theory and agency theory.
The study comprises six chapters, the first being the introduction. Chapter 2 examines, from the standpoint of search theory, the relationships among the list price, the transaction price, and the marketing duration. We build a housing search model in which, once the seller sets a list price, buyers demand a discount from it and the seller fixes the minimum discount rate he is willing to grant. The theory implies that the larger the seller's minimum discount rate, the longer the expected marketing duration; the longer the marketing duration, the lower the seller's minimum discount rate; the higher the search cost, the smaller the seller's minimum discount rate; and the larger the mean of the distribution of the discount rates buyers demand, the larger the seller's minimum discount rate. Empirically, using data provided by the Hsin Yi Realty brokerage (1990-1993), we estimate a simultaneous-equations system with the marketing duration and the ratio of list price to transaction price as dependent variables. The results support all of the propositions above.
Chapter 3 models the behavior of the seller and the broker jointly. It shows how the seller's search cost and the distribution of the discount rates buyers demand affect the marketing duration and the probability of sale, and it lays a foundation for future refinements of the theoretical model. Survival analysis is used to estimate the marketing duration, both to identify what drives it and to ask whether it exhibits time dependence: does a house that has been on the market longer become harder, or easier, to sell?
Chapter 4 estimates the probability of sale with a logit model. Because the hazard model is a purely econometric specification rather than one derived from a theoretical model, its distributional assumptions limit the connection between the estimated model and the theory; the logit model has no such problem, so it is used for estimation as well.
Chapter 5 uses a combined agency and search model to analyze the relationship between sellers and brokers: how different fee structures, the fixed-percentage commission, the flat fee, and the consignment sale, affect the conflict of interest between the two parties, and whether they conceal serious moral hazard. The analysis explains why the fixed-percentage commission has become the prevailing fee structure in Taiwan's housing brokerage market. The final chapter concludes and outlines directions for future research. / This paper employs search theory to study the relationships between the list price, the transaction price, and marketing duration in the Taiwan real estate market.
Theoretically, the buyer uses a set of criteria, together with the listing price, to develop an offer based on a guessed minimum-discount-rate guideline from the listing price, arriving at a price that will be acceptable to the seller. We describe the impact of pricing strategy (the seller's minimum discount rate) on marketing duration by incorporating the minimum discount rate into a search model. The derived model indicates a positive relationship between the minimum discount rate and marketing duration; an inverse relationship between marketing duration and the minimum discount rate; an inverse relationship between search costs and the minimum discount rate; and a positive relationship between the mean of the buyers' discount-rate distribution and the minimum discount rate.
The study uses data collected over 1990-1993 and provided by Hsin Yi Realty Co., with the marketing duration and the ratio of the listing price to the transaction price as dependent variables. A simultaneous-equations system is developed and used to test the following hypotheses: first, the higher the ratio of the listing price to the transaction price, the longer the marketing duration; second, the longer the marketing duration, the higher the ratio of the listing price to the transaction price; third, the longer the consignment period, the longer the marketing duration; and fourth, if the seller is not in a hurry to sell, the marketing duration lengthens. Our empirical findings support all of these hypotheses.
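Chapter 4's logit step is easy to make concrete. A hedged sketch with hypothetical brokerage records, not the thesis's actual variables:

```python
# Hedged sketch: logit for whether a listing sells within the consignment period.
# Unlike a hazard model, no baseline-duration distribution needs to be assumed,
# which is the flexibility the chapter points to. All column names are hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("listings.csv")
X = sm.add_constant(df[["list_to_sale_ratio", "consignment_days", "floor_area", "house_age"]])
model = sm.Logit(df["sold"], X).fit()
print(model.summary())
print(model.predict(X.iloc[:5]))  # predicted sale probabilities for the first five listings
```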
|
694 |
Studies in health economics: modelling and data analysis of costs and survival / Ekman, Mattias, January 2002 (has links)
Diss. Stockholm: Handelshögsk., 2002.
|
695 |
Relevance of nutrition on survival of patients with Motor Neurone Disease / Stanich, Patricia [UNIFESP]. 25 May 2011 (has links) (PDF)
Previous issue date: 2011-05-25 / Stanich P. Relevance of nutritional aspects to the survival of patients with Motor Neuron Disease. São Paulo; 2001. [Doctoral thesis, Escola Paulista de Medicina, Universidade Federal de São Paulo]. Objectives: to evaluate the effect of nutritional aspects on the survival of patients with Motor Neuron Disease (MND) and to present the predictor variables for the indication of enteral nutritional therapy via percutaneous endoscopic gastrostomy (PEG). Material and methods: a retrospective longitudinal cohort study, from 2000 to 2008, with 128 patients with MND. Clinical, nutritional, and respiratory variables were analyzed, with survival as the dependent variable. Survival was evaluated with Kaplan-Meier curves, and variables significant at the 20% level (p < 0.20) were selected for the Cox proportional regression model. Results: one hundred and eleven patients underwent gastrostomy, 59 with the spinal form (ALS) and 52 with the bulbar form (PBP). Malnutrition was present in 32% of the population before PEG, most frequently in patients with ALS. Survival after PEG was 11 months for patients with PBP and 16 months for ALS (p < 0.05). The variables associated with survival were early indication of PEG; reduction in %FVC, age, and BMI before PEG (hazard ratio 0.254, p = 0.007) for patients with ALS; and exclusion of oral feeding and tracheostomy (hazard ratio 0.345, p = 0.014) for patients with PBP. In the final model, the variables most associated with survival were early indication of PEG and exclusion of oral feeding for patients with PBP, and nutritional status before PEG for patients with ALS. Conclusions: early insertion of percutaneous endoscopic gastrostomy, from the time of diagnosis, was a protective factor for survival. Malnutrition was a poor prognostic factor, especially for patients with ALS. Nutritional surveillance during the course of the disease may improve outcomes when the goal is to extend the survival of patients with MND/ALS. / Aims. To evaluate the effect of nutrition on the survival of patients with Motor Neurone Disease (MND) and to present the predictor variables for the indication of nutritional therapy via percutaneous endoscopic gastrostomy (PEG). Methods. A retrospective longitudinal cohort study, from 2000 to 2008, with a sample of 128 patients with MND. Clinical, nutritional, and respiratory variables were analyzed, with survival as the dependent variable. The survival curve was estimated by the Kaplan-Meier method, and variables significant at the 20% level (p < 0.20) were selected for the Cox proportional regression model. Results. One hundred and eleven patients underwent gastrostomy, 59 with limb onset (ALS) and 52 with bulbar onset (PBP). Malnutrition was present in 32% of the population before PEG, most frequently in patients with limb onset. Survival after PEG was 10.5 months for patients with PBP and 16 months for ALS (p < 0.05). Variables associated with survival were: early indication of PEG, for ALS and PBP; reduction of FVC% and BMI before PEG (hazard ratio 0.254, p = 0.007) for patients with limb onset; and exclusion of oral feeding and tracheostomy (hazard ratio 0.345, p = 0.014) for patients with bulbar onset. Conclusions. Early insertion of percutaneous endoscopic gastrostomy from the time of diagnosis was a protective factor for patient survival. Malnutrition was a poor prognostic factor, especially for patients with limb onset. Nutritional surveillance during disease progression may improve results when the goal is to increase the survival of patients with MND/ALS.
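The workflow described here, a liberal univariable screen at p < 0.20 followed by a Cox proportional hazards model, is easy to sketch. A hedged illustration assuming a hypothetical patient table (these are not the study's variable names):

```python
# Hedged sketch: screen candidate covariates at p < 0.20, then fit a Cox model.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("mnd_patients.csv")  # hypothetical: one row per patient
candidates = ["age", "bmi_before_peg", "fvc_percent", "oral_feeding_excluded"]

selected = []
for var in candidates:
    m = CoxPHFitter().fit(df[["survival_months", "death_observed", var]],
                          duration_col="survival_months", event_col="death_observed")
    if m.summary.loc[var, "p"] < 0.20:  # liberal univariable screen, as in the abstract
        selected.append(var)

final = CoxPHFitter().fit(df[["survival_months", "death_observed"] + selected],
                          duration_col="survival_months", event_col="death_observed")
final.print_summary()  # reported hazard ratios are exp(coef)
```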
|
696 |
A time-dependent proportional hazards model / Parreira, Daniela Ribeiro Martins. 30 March 2007
Previous issue date: 2007-03-30 / Survival analysis models are used to study experimental data in which the response variable is, typically, the time elapsed until an event of interest. Many authors prefer to model survival data in the presence of covariates through the hazard function, a preference tied to its interpretation. The Cox model (1972), the one most commonly used, applies when failure rates are proportional; it is very flexible, widely used in survival analysis, and easily extended, for example, to incorporate time-dependent covariates. In the present work we propose a proportional hazards model that incorporates a time-dependent parameter, named the time-dependent proportional hazards model. / Survival analysis studies experimental data in which the response variable is the time until an event of interest occurs. Many authors prefer to model survival data in the presence of covariates through the hazard function, a fact related to its interpretation: it describes how the instantaneous probability of failure changes over time. In this context one of the most used models is the Cox model (Cox, 1972), whose basic assumption is that failure rates are proportional. The Cox proportional hazards model is quite flexible and extensively used in survival analysis, and it can easily be extended to incorporate, for example, the effect of time-dependent covariates. In this study we propose a proportional hazards model that incorporates a time-dependent parameter, called the time-dependent proportional hazards model. A classical analysis based on the asymptotic properties of the maximum likelihood estimators of the parameters involved is developed, together with a simulation study using resampling techniques for interval estimation and hypothesis testing of the model parameters. The cost of estimating the covariate effect when the parameter measuring the time effect is included in the model is also studied. Finally, we present a Bayesian approach.
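For orientation, one common way to write a proportional hazards model augmented with a time-dependent parameter is the following generic form; the thesis's exact parameterization is not reproduced here:

$$ h(t \mid \mathbf{x}) = h_0(t)\,\exp(\mathbf{x}^{\top}\boldsymbol{\beta} + \gamma t) $$

where h_0(t) is the baseline hazard, β measures the covariate effects, and γ is the time-dependent parameter. Setting γ = 0 recovers the usual Cox structure, so a test of H_0: γ = 0 asks whether the data need the extra time term, which is exactly where the asymptotic and resampling-based inference described above comes in.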
|
697 |
Inference of the market value of urban lots. A case study of the municipality of São Carlos (SP) / Ferraudo, Guilherme Moraes. 13 November 2008
Previous issue date: 2008-11-13 / Universidade Federal de São Carlos / In this dissertation we propose a regression model for the market prices of urban lots in the city of São Carlos (SP) over the year 2005. Usual regression models and survival techniques with left censoring are considered. A simulation study examines the coverage probabilities of the asymptotic confidence intervals for the parameters of the considered models. / In this dissertation we present a proposal for a regression equation representative of the formation of the market value of urban lots in the municipality of São Carlos, SP, in 2005, aimed at the creation of generic value maps (Plantas de Valores Genéricos, PVG). Two techniques are used: the usual linear models (normal errors and constant variance), which are widely applied, and survival analysis with left censoring. After the fit, the two methodologies are compared and tested in a simulation study in which we examine the coverage probability of some of the parameters involved in the regression.
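The left-censoring idea used here alongside ordinary regression is that some lot values are only known to lie at or below a floor. A minimal sketch of a left-censored parametric fit; the data, the threshold, and the Weibull choice are all assumptions for illustration:

```python
# Hedged sketch: fitting a parametric model to left-censored values.
import numpy as np
from lifelines import WeibullFitter

rng = np.random.default_rng(42)
true_values = rng.weibull(1.5, size=200) * 100.0  # latent lot values (arbitrary units)
floor = 40.0                                      # values below this are only known to be <= floor
observed = np.maximum(true_values, floor)         # left-censored observations
event = true_values > floor                       # True where the value was actually observed

wf = WeibullFitter()
wf.fit_left_censoring(observed, event_observed=event)
print(wf.lambda_, wf.rho_)  # fitted Weibull scale and shape
```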
|
698 |
Generalized time-dependent logistic model with frailty / Milani, Eder Angelo. 11 February 2011
Previous issue date: 2011-02-11 / Universidade Federal de Minas Gerais / Several authors have preferred to model survival data in the presence of covariates through the hazard function, a fact related to its interpretation: the hazard function describes how the instantaneous failure rate changes over time. In this context, one of the most used models is the Cox model (1972), whose basic assumption is that the failure rates of any two individuals are proportional. However, experience shows that there are survival data which cannot be accommodated by the Cox model. This fact has driven the development of several types of non-proportional hazard models, among them the accelerated failure model (Prentice, 1978), the hybrid hazard model (Etezadi-Amoli and Ciampi, 1987), and the extended hybrid hazard models (Louzada-Neto, 1997 and 1999). Mackenzie (1996) proposed a parametric family of non-proportional hazard models called the generalized time-dependent logistic (GTDL) model. This model generalizes the standard logistic function to a time-dependent form and is motivated partly by the wish to account for the time effect in its setting and partly by the need for a parametric structure. The frailty model (Vaupel et al., 1979; Tomazella, 2003; Tomazella et al., 2004) is characterized by a random effect, i.e., an unobservable random variable representing information that could not be, or was not, collected, such as environmental and genetic factors, or information that for some reason was not considered at the planning stage. The frailty variable is introduced into the hazard function to control for the unobservable heterogeneity of the units under study, including the dependence among units that share the same risk factors. In this work we consider an extension of the GTDL model using frailty as an alternative for modeling data that lack a proportional hazard structure. From a classical perspective, we carry out a simulation study and an application to real data; we also apply a Bayesian approach to a real data set.
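For orientation, the GTDL hazard and its frailty extension can be sketched as follows; this is the form commonly attributed to Mackenzie (1996), and the thesis's own parameterization may differ:

$$ h(t \mid \mathbf{x}) = \lambda\,\frac{\exp(\alpha t + \mathbf{x}^{\top}\boldsymbol{\beta})}{1 + \exp(\alpha t + \mathbf{x}^{\top}\boldsymbol{\beta})}, \qquad h(t \mid \mathbf{x}, z) = z\,h(t \mid \mathbf{x}) $$

Here α captures the time effect (α = 0 recovers a proportional-hazards structure) and z > 0 is the unobserved multiplicative frailty, a random effect commonly given a distribution with unit mean so that it is identifiable; in this thesis the frailty is discrete rather than the more usual continuous (e.g., gamma) choice.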
|
699 |
Power series models with excess observable and latent zeros / Coaguila Zavaleta, Katherine Elizabeth. 28 September 2016
Previous issue date: 2016-09-28 / Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) / The main objective of this work is to study the significance of zeros in observable and latent data. Observable data sets with an excess of zeros commonly also exhibit overdispersion, and the zero-inflated power series (ZISP) models were proposed to accommodate these excesses. Specifically for observable data, we develop a study of the gradient statistic proposed by Terrell (2002) for testing hypotheses about the inflation parameter of ZISP models, evaluating its performance against the classical likelihood ratio (Wilks, 1938), score (Rao, 1948), and Wald (Wald, 1943) statistics. In addition, frailty has recently been modeled by discrete distributions on the non-negative integers that allow zero frailty, that is, individuals who never experience the event of interest (a zero-risk fraction). For this type of latent data, we propose a new survival model induced by discrete frailty with a ZISP distribution. The proposal gives a more realistic description of risk-free individuals: individuals cured by genetic factors (immune) are modeled as a deterministic zero-risk fraction, while those cured by treatment are modeled as a random zero-risk fraction. In this context we also develop the gradient statistic to test the significance of the zero-risk parameter for data modeled by the deterministic zero-risk fraction. To complete the development of these proposals, we present the results of simulation studies and application examples using real data.
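Two generic forms may help fix ideas; these are standard textbook versions, not necessarily the thesis's exact notation. A zero-inflated power series model mixes a point mass at zero with a power-series distribution, and Terrell's (2002) gradient statistic combines the score evaluated at the restricted estimate with the distance between the two estimates:

$$ P(Y = 0) = \omega + (1-\omega)\,p(0;\theta), \qquad P(Y = y) = (1-\omega)\,p(y;\theta), \quad y = 1, 2, \dots $$

$$ S_T = U(\tilde{\theta})^{\top}(\hat{\theta} - \tilde{\theta}) $$

Here ω is the inflation parameter, p(·; θ) is a power-series probability mass function (Poisson, geometric, logarithmic, ...), U is the score function, and θ̂ and θ̃ are the unrestricted and restricted maximum likelihood estimators. Under the null hypothesis, the gradient statistic shares the asymptotic chi-squared distribution of the likelihood ratio, score, and Wald statistics while requiring neither the information matrix nor its inverse.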
|
700 |
Statistical methodology in solving the traveling salesman problem and in evaluating algorithms: a study applied to computational transgenetics / Ramos, Iloneide Carlos de Oliveira. 03 March 2005
Previous issue date: 2005-03-03 / Combinatorial optimization problems have engaged a large number of researchers in the search for approximate solutions ever since it became generally accepted that such problems cannot be solved in polynomial time. Initially these solutions were pursued with heuristics; today metaheuristics are used more often, especially those based on evolutionary algorithms. This work makes two main contributions: the creation of a heuristic, called the Operon, for building the information chains needed to implement transgenetic (evolutionary) algorithms, relying mainly on statistical methodology (cluster analysis and principal component analysis); and the use of statistical analyses suited to evaluating the performance of the algorithms developed to solve these problems. The aim of the Operon is to build good-quality, dynamic information chains that promote an "intelligent" search of the solution space. The Traveling Salesman Problem (TSP) is the target of applications based on a transgenetic algorithm called ProtoG. A strategy is also proposed for renewing part of the chromosome population, triggered when the coefficient of variation of the individuals' fitness function, computed over the population, falls below a minimum threshold. Statistical methodology is used to evaluate the performance of four algorithms: the proposed ProtoG, two memetic algorithms, and a Simulated Annealing algorithm. Three performance analyses are proposed. The first uses logistic regression, based on the probability that the algorithm under test finds an optimal solution of a TSP instance. The second uses survival analysis, based on the distribution of the running time observed until an optimal solution is reached. The third uses non-parametric analysis of variance on the percent error of the solution (PES), the percentage by which the solution found exceeds the best solution available in the literature. Six experiments were conducted on sixty-one instances of the Euclidean TSP with up to 1,655 cities. The first two experiments tune four parameters of the ProtoG algorithm in an attempt to improve its performance; the last four evaluate ProtoG against the three other algorithms. For these sixty-one instances, statistical tests give evidence that ProtoG outperforms the other three algorithms on fifty of them. Moreover, for the thirty-six instances considered in the last three experiments, in which performance was evaluated through PES, the mean PES obtained with ProtoG was below 1% on almost half of the instances, the largest mean being 3.52% on an instance of 1,173 cities. ProtoG can therefore be considered a competitive algorithm for solving the TSP, since mean PES values above 10% are not rarely reported in the literature for instances of this size.
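Two of the evaluation quantities above are straightforward to compute. A hedged sketch with made-up numbers: PES as defined in the abstract, and a Kaplan-Meier view of time-to-optimum in which runs cut off before reaching the optimum are treated as right-censored:

```python
# Hedged sketch: percent error of the solution (PES) and survival analysis of the
# time until an algorithm run first reaches the optimum. All numbers are invented.
import numpy as np
from lifelines import KaplanMeierFitter

def pes(found_cost: float, best_known_cost: float) -> float:
    """Percentage by which the found tour exceeds the best known tour."""
    return 100.0 * (found_cost - best_known_cost) / best_known_cost

print(pes(433.0, 426.0))  # ~1.64% above a hypothetical best known cost

runtimes = np.array([12.0, 30.5, 44.1, 60.0, 60.0, 8.3, 21.7])   # seconds per run
reached = np.array([True, True, True, False, False, True, True])  # False = censored at the 60 s cutoff

kmf = KaplanMeierFitter()
kmf.fit(runtimes, event_observed=reached)
print(kmf.median_survival_time_)  # median time until an optimal solution is found
```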
|