Spelling suggestions: "subject:"curvival 2analysis"" "subject:"curvival 3analysis""
641
PREDICTIVE ANALYTICS FOR HOLISTIC LIFECYCLE MODELING OF CONCRETE BRIDGE DECKS WITH CONSTRUCTION DEFECTS
Nichole Marie Criner (14196458) 01 December 2022 (has links)
<p>During the construction of a bridge, and more specifically of a concrete bridge deck, defects in materials or workmanship sometimes occur; these are known as construction defects. Such defects can have a large impact on the lifecycle performance of the bridge deck, potentially leading to more preventative and reactive maintenance actions over time and thus a larger monetary investment by the bridge owner. Bridge asset managers use prediction software to inform their annual budgetary needs; however, this software traditionally relies only on historical condition rating data for its predictions. When attempting to understand how deterioration of a bridge deck changes under the influence of construction defects, the current prediction software is not appropriate, as there is not enough historical data available to ensure an accurate prediction. Numerical modeling approaches are available that capture the internal physical and chemical deterioration processes, and these models can account for the change in deterioration when construction defects are present. Other numerical models capture, in parallel to the internal processes, the effect of external factors that may affect the deterioration patterns of the bridge deck. The goal of this study is to combine a mechanistic model capturing the internal physical and chemical processes associated with deterioration of a concrete bridge deck with a model built strictly from historical condition rating data, in order to predict the change in condition rating of a bridge deck for a standard construction case versus a substandard construction case. Measuring the change in predicted deterioration when construction defects are present then allows the additional cost required to maintain the defective bridge deck to be quantified, which is also presented.</p>
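Condition-rating prediction software of the kind described here is commonly built on Markov-chain deterioration models: an annual transition matrix is applied repeatedly to a condition-rating distribution. A minimal sketch follows; the ratings use a 9-to-5 scale (a common bridge-inspection convention) and the transition probabilities are invented placeholders, not values from this study.

```python
# Hypothetical annual transition matrix over condition ratings 9 (best) to 5;
# the probabilities below are illustrative assumptions, not calibrated values.
P = {
    9: {9: 0.85, 8: 0.15},
    8: {8: 0.88, 7: 0.12},
    7: {7: 0.90, 6: 0.10},
    6: {6: 0.92, 5: 0.08},
    5: {5: 1.00},  # treated as absorbing in this sketch
}

def predict_condition(dist, years):
    """Propagate a condition-rating probability distribution forward in time."""
    for _ in range(years):
        nxt = {state: 0.0 for state in P}
        for state, prob in dist.items():
            for to, p in P[state].items():
                nxt[to] += prob * p
        dist = nxt
    return dist

# A deck that starts surely at rating 9:
start = {state: 0.0 for state in P}
start[9] = 1.0
after_10 = predict_condition(start, 10)
```

A substandard-construction case would be represented by a second, faster transition matrix, and the cost difference follows from comparing the two predicted distributions year by year.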
642
Deep Learning Approach for Time-to-Event Modeling of Credit Risk / Djupinlärningsmetod för överlevnadsanalys av kreditriskmodellering
Kazi, Mehnaz; Stanojlovic, Natalija January 2022 (has links)
This thesis explores how survival analysis models perform for default risk prediction of small-to-medium-sized enterprises (SMEs) and investigates when survival analysis models are preferable. This is examined by comparing the performance of three deep learning models in a survival analysis setting, a traditional survival analysis model (Cox Proportional Hazards), and a traditional credit risk model (logistic regression). Performance is evaluated by three metrics: concordance index, integrated Brier score, and ROC-AUC. The models are trained on financial data from Swedish SMEs comprising profit and loss statement and balance sheet results. The data are divided into two feature sets, a smaller and a larger one; in addition, binned versions of the features are considered. The results show that DeepHit and Logistic Hazard performed best with the three metrics in mind. In terms of AUC, all three deep learning survival models generally outperform the logistic regression model. Cox Proportional Hazards (Cox PH) performed worse than logistic regression on the non-binned feature sets, while having more comparable results when the data were binned. In terms of concordance index and integrated Brier score, the Cox PH model consistently performed the worst of all survival models. The largest performance gain in concordance index and AUC was, however, seen for the Cox PH model when binning was applied to the larger feature set: the concordance index went from 0.65 to 0.75 and the test AUC from 76.56% to 83.91%. The main conclusions are that the neural network models slightly outperformed the traditional models and that binning had a great impact on all models, particularly the Cox PH model.
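The concordance index used to compare these models measures how well predicted risk orders the observed event times: among comparable pairs, it is the fraction where the subject with the earlier event has the higher predicted risk. A self-contained sketch of Harrell's C (illustrative code, not from the thesis):

```python
def concordance_index(times, events, risk_scores):
    """Harrell's C: among comparable pairs (i had an event before j's observed
    time), the fraction where i also has the higher predicted risk; ties in
    risk count as half-concordant."""
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # pair (i, j) is comparable if i has an event and j outlives t_i
            if events[i] and times[i] < times[j]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1.0
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5
    return concordant / comparable
```

A perfectly risk-ordered model scores 1.0, a reversed one 0.0, and an uninformative (constant-risk) model 0.5, which is why 0.65 vs 0.75 in the abstract is a substantial gap.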
643
The Path to Global Sport Sponsorship Success: An Event History Analysis Modeling Approach
Jensen, Jonathan A. 21 May 2015 (has links)
No description available.
644
Regression Modeling of Time to Event Data Using the Ornstein-Uhlenbeck Process
Erich, Roger Alan 16 August 2012 (has links)
No description available.
645
INFERENCE FOR ONE-SHOT DEVICE TESTING DATA
Ling, Man Ho 10 1900 (has links)
<p>In this thesis, inferential methods for one-shot device testing data from accelerated life-tests are developed. Due to constraints on time and budget, accelerated life-tests are commonly used to induce more failures within a reasonable amount of test time, thereby obtaining more lifetime information that is especially useful in reliability analysis. One-shot devices, which can be used only once as they are destroyed immediately after testing, yield observations only on their condition and not on their actual lifetimes; thus, only binary response data are observed from a one-shot device testing experiment. Since no failure times are observed, the EM algorithm is used to determine the maximum likelihood estimates of the model parameters. Inference for the reliability at a mission time and for the mean lifetime under normal operating conditions is also developed.</p> <p>The thesis proceeds as follows. Chapter 2 considers the exponential distribution with a single-stress relationship and develops inferential methods for the model parameters, the reliability, and the mean lifetime. The results obtained by the EM algorithm are compared with those obtained from a Bayesian approach. A one-shot device testing dataset is analyzed by the proposed method and presented as an illustrative example. Next, in Chapter 3, the exponential distribution with a multiple-stress relationship is considered and the corresponding inferential results are developed. The jackknife technique is described for bias reduction of the developed estimates. Interval estimation for the reliability and the mean lifetime is also discussed based on the observed information matrix, the jackknife technique, the parametric bootstrap method, and a transformation technique. Again, an example is presented to illustrate all the inferential methods developed in this chapter. 
Chapter 4 considers point and interval estimation for one-shot device testing data under the Weibull distribution with a multiple-stress relationship, and illustrates the application of the proposed methods in a study involving the development of tumors in mice with respect to risk factors such as sex, strain of offspring, and dose effects of benzidine dihydrochloride. A Monte Carlo simulation study is also carried out to evaluate the performance of the EM estimates for different levels of reliability and different sample sizes. Chapter 5 describes a general algorithm for determining the optimal design of an accelerated life-test plan for a one-shot device testing experiment, based on the asymptotic variance of the estimated reliability at a specific mission time. A numerical example is presented to illustrate the application of the algorithm. Finally, Chapter 6 presents some concluding remarks and some additional research problems that would be of interest for further study.</p> / Doctor of Philosophy (PhD)
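To make the EM idea concrete in the simplest possible setting (one exponential lifetime, all units inspected at a single time `tau`, no stress covariates — a deliberate simplification of the thesis's accelerated-test setup), the E-step imputes each unit's expected lifetime given only its binary pass/fail status, and the M-step refits the exponential rate as if those lifetimes were observed:

```python
import math

def em_exponential_oneshot(n, k, tau, lam0=1.0, iters=200):
    """EM estimate of an exponential rate lambda from one-shot data:
    k of n units are found failed at inspection time tau.
    lam0 and iters are arbitrary starting choices for this sketch."""
    lam = lam0
    for _ in range(iters):
        p = 1.0 - math.exp(-lam * tau)            # P(T <= tau)
        # E-step: expected lifetime given the observed binary status
        e_fail = 1.0 / lam - tau * (1.0 - p) / p  # E[T | T <= tau]
        e_surv = tau + 1.0 / lam                  # E[T | T > tau] (memorylessness)
        total = k * e_fail + (n - k) * e_surv
        # M-step: complete-data exponential MLE, rate = n / sum of lifetimes
        lam = n / total
    return lam
```

In this special case the EM fixed point coincides with the closed-form binary-data MLE, lambda = -ln(1 - k/n) / tau, which gives a convenient correctness check; the thesis's actual models add stress-level regression structure on top of this scheme.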
646
STATISTICAL AND METHODOLOGICAL ISSUES ON COVARIATE ADJUSTMENT IN CLINICAL TRIALS
Chu, Rong 04 1900 (links)
<p><strong>Background and objectives</strong></p> <p>We investigate three issues related to the adjustment for baseline covariates in late-phase clinical trials: (1) the analysis of correlated outcomes in multicentre randomized controlled trials (RCTs), (2) the assessment of the probability and implications of prognostic imbalance in RCTs, and (3) the adjustment for baseline confounding in cohort studies.</p> <p><strong>Methods</strong></p> <p>Project 1: We investigated the properties of six statistical methods for analyzing continuous outcomes in multicentre RCTs where within-centre clustering was possible. We simulated studies over various intraclass correlation (ICC) values with several centre combinations.</p> <p>Project 2: We simulated data from RCTs evaluating a binary outcome by varying the risk of the outcome, the effect of the treatment, the power and prevalence of a binary prognostic factor (PF), and the sample size. We compared logistic regression models with and without adjustment for the PF in terms of bias, standard error, coverage of the confidence interval, and statistical power. A tool to assess the sample size required to control for chance imbalance was proposed.</p> <p>Project 3: We conducted a prospective cohort study to evaluate the effect of tuberculosis (TB) at the initiation of antiretroviral therapy (ART) on all-cause mortality, using a Cox proportional hazards model on propensity score (PS) matched patients to control for potential confounding. We assessed the robustness of the results using sensitivity analyses.</p> <p><strong>Results and conclusions</strong></p> <p>Project 1: All six methods produce unbiased estimates of treatment effect in multicentre trials. Adjusting for centre as a random intercept leads to the most efficient treatment effect estimation, and hence should be used in the presence of clustering.</p> <p>Project 2: The probability of prognostic imbalance in small trials can be substantial. 
Covariate adjustment improves estimation accuracy and statistical power, and hence should be performed when strong PFs are observed.</p> <p>Project 3: After controlling for the important confounding variables, HIV patients who had TB at the initiation of ART had a moderately increased risk of overall mortality.</p> / Doctor of Philosophy (PhD)
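Project 2's point about chance imbalance can be illustrated with a back-of-the-envelope calculation: under simple randomization, the difference in the prevalence of a binary PF between two arms of size n is approximately normal with variance 2q(1-q)/n, so the probability of an imbalance exceeding a threshold delta follows directly. This is a normal-approximation sketch, not the sample-size tool proposed in the thesis:

```python
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def prob_prognostic_imbalance(n_per_arm, prevalence, delta):
    """Approximate P(|difference in PF prevalence between arms| > delta)
    under simple 1:1 randomization (two-sample normal approximation)."""
    se = math.sqrt(2.0 * prevalence * (1.0 - prevalence) / n_per_arm)
    return 2.0 * (1.0 - normal_cdf(delta / se))
```

For a PF with 30% prevalence, two arms of 50 patients have roughly a one-in-four chance of differing by more than 10 percentage points, while arms of 500 make such an imbalance rare — which is exactly why small trials are the concern.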
647
Methodological Issues in Design and Analysis of Studies with Correlated Data in Health Research
Ma, Jinhui 04 1900 (links)
<p>Correlated data with complex association structures arise from longitudinal studies and cluster randomized trials. However, some methodological challenges in the design and analysis of such studies or trials have not been overcome. In this thesis, we address three of these challenges: 1) <em>Power analysis for population-based longitudinal studies investigating gene-environment interaction effects on chronic disease:</em> For longitudinal studies aiming to investigate gene-environment interactions in disease susceptibility and progression, rigorous statistical power estimation is crucial to ensure that such studies are scientifically useful and cost-effective, since human genome epidemiology is expensive. However, conventional sample size calculations for longitudinal studies can seriously overestimate statistical power by overlooking measurement error, unmeasured etiological determinants, and competing events that can impede the occurrence of the event of interest. 2) <em>Comparing the performance of different multiple imputation strategies for missing binary outcomes in cluster randomized trials</em>: Though researchers have proposed various strategies to handle missing binary outcomes in cluster randomized trials (CRTs), comprehensive guidelines on the selection of the most appropriate or optimal strategy are not available in the literature. 3) <em>Comparison of population-averaged and cluster-specific models for the analysis of cluster randomized trials with missing binary outcomes</em>: Both population-averaged and cluster-specific models are commonly used for analyzing binary outcomes in CRTs. However, little attention has been paid to their accuracy and efficiency when analyzing data with missing outcomes. The objective of this thesis is to provide researchers with recommendations and guidance for future research in handling the above issues.</p> / Doctor of Philosophy (PhD)
648
LIKELIHOOD-BASED INFERENTIAL METHODS FOR SOME FLEXIBLE CURE RATE MODELS
Pal, Suvra 04 1900 (links)
<p>Recently, the Conway-Maxwell Poisson (COM-Poisson) cure rate model has been proposed, which includes as special cases some of the well-known cure rate models discussed in the literature. Data obtained from cancer clinical trials are often right censored, and the expectation-maximization (EM) algorithm can be efficiently used to determine the maximum likelihood estimates (MLEs) of the model parameters based on right censored data.</p> <p>By assuming the lifetime distribution to be exponential, lognormal, Weibull, or gamma, the necessary steps of the EM algorithm are developed for the COM-Poisson cure rate model and some of its special cases. The inferential method is examined by means of an extensive simulation study. Model discrimination within the COM-Poisson family is carried out by the likelihood ratio test as well as by information-based criteria. The proposed method is then illustrated with cutaneous melanoma data on cancer recurrence. As the lifetime distributions considered are not nested, it is not possible to carry out a formal statistical test to determine which among them provides an adequate fit to the data. For this reason, the wider class of generalized gamma distributions, which contains all of the above-mentioned lifetime distributions as special cases, is considered. The steps of the EM algorithm are then developed for this general class of distributions, and a simulation study is carried out to evaluate the performance of the proposed estimation method. Model discrimination within the generalized gamma family is carried out by the likelihood ratio test and information-based criteria. 
Finally, for the considered cutaneous melanoma data, the two-way flexibility of the COM-Poisson family and the generalized gamma family is utilized to carry out a two-way model discrimination to select a parsimonious competing cause distribution along with a suitable choice of a lifetime distribution that provides the best fit to the data.</p> / Doctor of Philosophy (PhD)
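For context, the Poisson special case of the COM-Poisson family is the well-known promotion-time cure model: with competing-cause count N ~ Poisson(theta) and latent lifetime survival S(t), the population survival is S_pop(t) = exp(-theta * (1 - S(t))) and the cure fraction is exp(-theta). A small sketch with an exponential latent lifetime (the parameter values are illustrative, not fitted values from the thesis):

```python
import math

def pop_survival_promotion_time(t, theta, rate):
    """Population survival under the Poisson (promotion-time) cure model,
    S_pop(t) = exp(-theta * (1 - S(t))), with exponential S(t) = exp(-rate*t)."""
    latent_surv = math.exp(-rate * t)
    return math.exp(-theta * (1.0 - latent_surv))

def cure_fraction(theta):
    # limit of S_pop(t) as t -> infinity: the long-term survivor proportion
    return math.exp(-theta)
```

The Bernoulli special case of the same family recovers the standard mixture cure model, which is what makes a two-way discrimination (competing-cause distribution versus lifetime distribution) possible.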
649
A Bayesian Approach to Predicting Default, Prepayment and Order Return in Unsecured Consumer Loans / En Bayesiansk metod för estimering av fallissemang, förskottsbetalning, och returnering av order i osäkrade konsumentkrediter
Köhler, William January 2023 (links)
This paper presents an approach to modeling the risks associated with defaults, prepayments, and order returns in the context of unsecured consumer credit, specifically in buy-now-pay-later (BNPL) loans. It develops a Bayesian competing-risks proportional hazards model for the time to default, prepayment, and order return in BNPL loans. Model parameters are estimated using Markov chain Monte Carlo (MCMC) sampling techniques, and Bayesian inference is carried out on a unique dataset containing monthly performance data on fixed-duration, interest-bearing consumer loans.
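The competing-risks setup can be illustrated by simulation: each loan carries a latent time for every cause, and only the earliest time and its cause are observed. This sketch uses two causes with constant, illustrative cause-specific hazards (a simplification of the paper's covariate-driven model); with exponential rates lam1 and lam2, cause 1 is observed first with probability lam1 / (lam1 + lam2):

```python
import random

def simulate_competing_risks(n, lam1, lam2, seed=0):
    """Draw latent exponential times for two competing causes and record
    the earlier time together with its cause (rates are illustrative)."""
    rng = random.Random(seed)
    obs = []
    for _ in range(n):
        t1 = rng.expovariate(lam1)   # e.g. time to default
        t2 = rng.expovariate(lam2)   # e.g. time to prepayment
        obs.append((min(t1, t2), 1 if t1 < t2 else 2))
    return obs

obs = simulate_competing_risks(20000, 0.3, 0.2)
frac_cause1 = sum(1 for _, cause in obs if cause == 1) / len(obs)
mean_time = sum(t for t, _ in obs) / len(obs)
# theory: P(cause 1) = 0.3 / 0.5 = 0.6 and E[min time] = 1 / 0.5 = 2.0
```

In the proportional hazards formulation, each constant rate would be replaced by a cause-specific hazard depending on loan covariates, with posteriors over those parameters obtained by MCMC.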
650
Trajectories of Individual Behavior in the US Housing Market
Choi, Seungbee 06 June 2022 (links)
Three essays in this dissertation explore the behavior of individuals in response to the housing crisis and its consequences, and the impact of the pandemic on short-term rental markets. The first essay examines the economic outcomes of young people who have returned to their parents' home, using data from the 2003-2017 waves of the National Longitudinal Survey of Youth 1997 Cohort (NLSY97). The economic outcomes of boomerang movers did not improve relative to their period of independent living, and the income gap with young people who remained independent widened. The residential moves of young people who return home affect their income, but the effect is short-lived. Returning to a parental house often changes both region and urban form, and moves from the central city to the suburbs, or from the suburbs to outside the MSA, have a negative impact on income. The findings suggest several implications. First, more affordable housing should be provided to reduce boomerang moves. Second, ways to increase job opportunities should be explored to reduce the short-term negative impact of a boomerang move. Finally, education and vocational training opportunities must be increased to close the income gap among young people. The second essay seeks to answer the following questions through the experiences of individual households affected by foreclosure. First, did foreclosed households regain homeownership? Second, is there a relationship between the socio-demographic characteristics of a foreclosed household and regaining homeownership? Third, where do homeowners who have lost their homes migrate? Finally, what characteristics of the neighborhood help foreclosed households recover? While previous studies have focused on the resilience of housing markets and regions, this study explores the link between regional characteristics and individual household recovery. 
The recovery of financially disadvantaged households is an important issue for communities and states, and identifying the mechanism responsible for household recovery has implications for implementing programs to aid it. This study primarily relies on the 2005-2019 Panel Study of Income Dynamics (PSID). Since 2009, the PSID has added survey questions about foreclosure: whether a foreclosure process has begun, the year and month it started, the result of the process, and whether the foreclosed home is a primary residence. The findings suggest that government recovery assistance programs should aim to support relocation to areas with lower poverty rates and greater job and educational opportunities. The final essay explores changes in short-term rentals (STRs) resulting from the COVID-19 pandemic. To identify the impact of the pandemic, this study uses New York City's Airbnb listing data from Inside Airbnb (IA), along with supplemental data such as American Community Survey (ACS) data. The change in the number of STRs is divided into (1) the number of units that left the platform and (2) the number of new units; the former relates to the survival of existing STR units, the latter to the location choice of new units. The results show that the impact of several variables on the survival and generation mechanisms changed with the COVID-19 pandemic. Since the survival and generation mechanisms of short-term rentals differ, they should be considered separately when regulating STRs to stabilize local housing markets. / Doctor of Philosophy / Although research has been conducted on the housing crisis and the recovery of the housing market, unanswered questions remain in two respects. First, have the individuals affected by the crisis recovered? Were their decisions in response to the crisis effective? Second, how has the new crisis caused by the COVID-19 pandemic impacted the housing market? 
Do its characteristics differ from those of previous housing crises? While evidence has been reported that the relationship between the new crisis and housing demand has changed, the impact of the pandemic on contemporary housing problems such as gentrification and reduced housing stock is unknown. This dissertation explores the trajectories of individual behavior in the housing market using various data sources and methodologies. Of the three essays, the first two explore the behavior of individuals in response to the housing crisis and its consequences, and the final essay explores the impact of the pandemic on short-term rental markets.
The first essay investigates the economic outcomes of young people who return to their parental homes after periods of independent living, using NLSY97 data. The second essay investigates the relationship between neighborhoods and the economic recovery of households using the Panel Study of Income Dynamics. The third essay explores changes in the survival and generation mechanisms of Airbnb units associated with the COVID-19 pandemic, using New York City's Airbnb listing data. The results of each study lead to the common conclusion that housing affordability should be improved, and that more affordable housing should be provided in areas of greater opportunity. This dissertation ultimately contributes to identifying individuals at risk from external shocks and to suggesting goals and strategies for a healthy housing market.