21 |
Safety Analyses At Signalized Intersections Considering Spatial, Temporal And Site Correlation. Wang, Xuesong. 01 January 2006 (has links)
Statistics show that signalized intersections are among the most dangerous locations in a roadway network. Different approaches, including crash frequency and severity models, have been used to establish the relationship between crash occurrence and intersection characteristics. To model crash occurrence at signalized intersections more efficiently, and ultimately to better identify the significant factors contributing to crashes, this dissertation investigated the temporal, spatial, and site correlations for total, rear-end, right-angle, and left-turn crashes. Using a basic regression model for correlated crash data leads to invalid statistical inference, because the test statistics and standard errors are based on a misspecified variance. In this dissertation, Generalized Estimating Equations (GEEs), which extend generalized linear models to the analysis of longitudinal or clustered data, were applied. A series of frequency models is presented using the GEE with a negative binomial distribution. GEE models for the crash frequency per year (using four correlation structures) were fitted for longitudinal data; GEE models for the crash frequency per intersection (using three correlation structures) were fitted for signalized intersections along corridors; and GEE models were applied to the rear-end crash data with temporal and spatial correlation separately. For right-angle crash frequency, models at the intersection, roadway, and approach levels were fitted, and the roadway- and approach-level models were estimated using the GEE to account for the "site correlation". For left-turn crashes, approach-level crash frequencies were modeled using the GEE with a negative binomial distribution for most patterns, and with a binomial distribution and logit link for the pattern having a high proportion of zeros and ones in its crash frequencies. 
All intersection geometry design features, traffic control and operational features, traffic flows, and crashes were obtained for the selected intersections, which required extensive data collection. The autoregressive structure was found to be the most appropriate correlation structure for both the temporal and spatial analyses: the correlation between repeated observations of a given intersection decreases as the time gap increases, and for spatially correlated signalized intersections along corridors, the correlation between intersections decreases as spacing increases. The unstructured correlation structure was applied for roadway- and approach-level right-angle crashes and for the different patterns of left-turn crashes at the approach level. Typically, two approaches on the same roadway have a higher correlation. At signalized intersections, traffic volumes, site geometry, signal operations, and safety performance all differ across approaches. Therefore, modeling the total number of left-turn crashes at an intersection may obscure the real relationship between crash causes and their effects. The dissertation therefore modeled crashes at different levels. In particular, intersection-, roadway-, and approach-level models were compared for right-angle crashes, and different crash assignment criteria, "at-fault driver" or "near-side", were applied for the disaggregated models. For the roadway- and approach-level models, the "near-side" models outperformed the "at-fault driver" models. Variables describing traffic characteristics, geometric design features, traffic control and operational features, a corridor-level factor, and location type were identified as significant in crash occurrence. In particular, the relationship between crash occurrence and traffic volume has been investigated extensively across studies. 
The logarithm of traffic volume per lane for the entire intersection was found to be the best functional form for total crashes in both the temporal and spatial analyses. The studies of right-angle and left-turn crashes confirm the assumption that the frequency of collisions is related to the traffic flows to which the colliding vehicles belong, not to the sum of the entering flows; the logarithm of the product of conflicting flows is usually the most significant functional form in the model. This study found that left-turn protection on the minor roadway increases rear-end crash occurrence, while left-turn protection on the major roadway reduces rear-end crashes. In addition, left-turn protection specifically reduces Pattern 5 left-turn crashes (left-turning traffic colliding with oncoming through traffic), but increases Pattern 8 left-turn crashes (left-turning traffic colliding with near-side crossing through traffic), and has no significant effect on the other patterns of left-turn crashes. This dissertation also investigated factors that had not been considered before. The safety effects of many variables identified in this dissertation are consistent with previous studies; some variables have unexpected signs, for which a justification is provided. Injury severity was also studied for Pattern 5 left-turn crashes. Crashes were assigned to the approach with the left-turning vehicles, and the "site correlation" among crashes occurring at the same approach was considered, since these crashes may share a similar propensity in severity. Many methodologies and applications were attempted in this dissertation; the study thus makes both theoretical and practical contributions to safety analysis at signalized intersections.
|
22 |
Postnatal depression (PND) and neighborhood effects for women enrolled in a home visitation program. Jones, David. 03 June 2016 (has links)
No description available.
|
23 |
Multiple imputation for marginal and mixed models in longitudinal data with informative missingness. Deng, Wei. 07 October 2005 (has links)
No description available.
|
24 |
Predicting the occurrence of major adverse cardiac events within 30 days after a patient’s vascular surgery: An individual patient-data meta-analysis. Vanniyasingam, Thuvaraha. 04 1900 (has links)
<p><strong>Background:</strong> Major adverse cardiac events, MACE – a composite endpoint of cardiac death and nonfatal myocardial infarction (MI) – are severe harmful outcomes that commonly arise after elective vascular surgery. As current pre-operative risk prediction models have limited ability to predict post-operative outcomes, this thesis discusses the key results of an individual patient data meta-analysis based on data from six cohort studies of patients undergoing vascular surgery.</p> <p><strong>Objectives:</strong> The purpose of this thesis is to determine optimal thresholds of continuous covariates and to create a prediction model for major adverse cardiac events (MACE) within 30 days after a vascular surgery. The goals include exploring the minimum p-value method for dichotomizing continuous variables; employing logistic regression to build a prediction model for MACE; evaluating its validity against other samples; and assessing its sensitivity to clustering effects. The secondary objectives are to determine individual models for predicting all-cause mortality, cardiac death, and nonfatal MI within 30 days of a vascular surgery, using the final covariates assessed for MACE.</p> <p><strong>Methods: </strong>Both B-type natriuretic peptide (BNP) and its N-terminal fragment (NTproBNP) are independently associated with cardiovascular complications after noncardiac surgeries, which are particularly frequent after noncardiac vascular surgeries. In a previous study, these covariates were dichotomized using the receiver operating characteristic (ROC) curve approach and a simple logistic regression (SLR) model was created for MACE [1]. The first part of this thesis applies the minimum p-value method to determine a threshold for each natriuretic peptide (NP), BNP and NTproBNP. SLR is then used to model the prediction of MACE within 30 days after a patient’s vascular surgery. 
Comparisons were made with the ROC curve approach to determine the optimal thresholds and create a prediction model. The validity of this model was tested using bootstrap samples, and its robustness was assessed using a mixed effects logistic regression (MELR) model and a generalized estimating equation (GEE). Finally, MELR was performed on each of the secondary outcomes.</p> <p><strong>Results:</strong> A variable, ROC_thrshld, was created to represent the cutpoints of Rodseth’s ROC curve approach, which identified 116 pg/mL and 277.5 pg/mL as the optimal thresholds for BNP and NTproBNP, respectively [1]. The minimum p-value method dichotomized these NP thresholds as BNP: 115.57 pg/mL (p</p> <p><strong>Discussion:</strong> One key limitation of this thesis is the small sample size available for NTproBNP. Also, determining only one cutpoint for each NP concentration may not be sufficient, since dichotomizing continuous factors can lead to loss of information, along with other issues. Further research should explore other possible cutpoints and perform reclassification to observe improvements in risk stratification. After validating our final model against other samples, we can conclude that MINP_thrshld, the type of surgery, and diabetes are significant covariates for the prediction of MACE. Since only a blood test is needed to measure NP concentration levels and the status of the other two factors is easily obtained, minimal effort is required to calculate the points and risk estimates for each patient. Further research should also examine the secondary outcomes for other factors that may be useful in prediction.</p> <p><strong>Conclusions: </strong>The minimum p-value method produced results similar to the ROC curve method in dichotomizing the NP concentration levels. The cutpoints for BNP and NTproBNP were 115.57 pg/mL and 241.7 pg/mL, respectively. 
Further research needs to be performed to determine the optimality of the final prediction model of MACE, with covariates MINP_thrshld, type of surgery, and diabetes mellitus.</p> / Master of Science (MSc)
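The minimum p-value dichotomization used in this thesis can be sketched as follows. The data, variable names, and candidate grid below are entirely synthetic and illustrative; note also that in a real analysis the minimum p-value from such a scan must be adjusted for the multiple cutpoints examined:

```python
import numpy as np
from scipy.stats import chi2_contingency

def min_p_cutpoint(x, y, candidates=None):
    """Scan candidate cutpoints of a continuous marker x against a
    binary outcome y; return the cutpoint with the smallest chi-square
    p-value (uncorrected for the multiple looks this scan performs)."""
    x, y = np.asarray(x, float), np.asarray(y, int)
    if candidates is None:
        candidates = np.quantile(x, np.linspace(0.1, 0.9, 17))
    best = (None, 1.0)
    for c in candidates:
        hi = x > c
        table = np.array([
            [np.sum(~hi & (y == 0)), np.sum(~hi & (y == 1))],
            [np.sum(hi & (y == 0)),  np.sum(hi & (y == 1))],
        ])
        if table.min() == 0:   # skip degenerate 2x2 tables
            continue
        _, p, _, _ = chi2_contingency(table)
        if p < best[1]:
            best = (c, p)
    return best

rng = np.random.default_rng(1)
bnp = rng.lognormal(4.5, 0.8, 400)                      # synthetic BNP levels
risk = 1 / (1 + np.exp(-(np.log(bnp) - 4.8) * 2.0))     # synthetic risk model
mace = rng.binomial(1, risk)                            # synthetic MACE outcome
cut, p = min_p_cutpoint(bnp, mace)
print(f"cutpoint={cut:.1f} pg/mL, p={p:.3g}")
```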
|
25 |
Continuation Ratio and Generalized Estimating Equation Analysis of a Longitudinal Asthma Study / Statistical Analysis of a Longitudinal Asthma Study. Capan, Dragos. 04 1900 (has links)
Two randomized controlled trials were conducted to determine whether a new treatment for asthma has a significant effect on patients. These were multi-center trials with a parallel design, the control arm receiving a placebo. The data were collected over a period of about 20 days before administering the intervention and for almost 80 days after the intervention; each patient thus has many recorded observations, making the data longitudinal. The data are first summarized using descriptive statistics and graphical displays. Then a continuation ratio model with a lagged covariate, accounting for the longitudinal aspect, is used to model the data. Finally, Generalized Estimating Equations methods, which have gained popularity in recent years for handling longitudinal correlation structures, are used. To apply the continuation ratio model, the data must be appropriately restructured; logistic regression is then used to model the symptoms. The results of this procedure show that the treatment effect is statistically significant. However, the goodness-of-fit tests show that the model is inadequate; this issue is explored in the last subsection of Chapter 3. Using Generalized Estimating Equations to analyze the number of times rescue medication was used, we concluded that there is no statistically significant difference between the Active and Control groups. However, we noticed that the use of rescue medication decreased with time from the start of treatment. / Thesis / Master of Science (MS)
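The restructuring step that a continuation ratio model requires, expanding each ordinal response into one binary "stop vs. continue" record per reached level, can be sketched as below. The symptom scores and column names are hypothetical, and this shows only the data-restructuring idea, not the thesis's actual pipeline:

```python
import pandas as pd

def expand_continuation_ratio(df, outcome, levels):
    """Restructure an ordinal outcome into the stacked binary records a
    continuation-ratio model needs: for each record, one row per level j
    the subject 'reached', with an indicator of whether the subject
    stopped at j (y == j) versus continued beyond it (y > j)."""
    rows = []
    for _, rec in df.iterrows():
        y = rec[outcome]
        for j in levels[:-1]:          # the last level has no continuation
            if y < j:
                break
            new = rec.to_dict()
            new["level"] = j
            new["stop"] = int(y == j)
            rows.append(new)
    return pd.DataFrame(rows)

# Hypothetical daily symptom scores (0 = none ... 3 = severe).
daily = pd.DataFrame({
    "patient": [1, 1, 2, 2],
    "day": [1, 2, 1, 2],
    "symptom": [0, 2, 1, 3],
})
expanded = expand_continuation_ratio(daily, "symptom", levels=[0, 1, 2, 3])
print(expanded[["patient", "day", "level", "stop"]])
```

Ordinary logistic regression on `stop`, with `level` as a covariate, then fits the conditional probabilities P(Y = j | Y >= j) that define the continuation ratio model.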
|
26 |
Predictors of Carbapenem Resistant Gram-negative Bacteria in a Consortium of Academic Medical Center Hospitals. Ababneh, Mera. 01 January 2012 (has links)
Background: Gram-negative resistance is a growing problem worldwide. It is generally believed that rates of resistant bacteria within a hospital are a function of antibiotic use, resistant organisms brought into the hospital, infection control efforts, and the underlying severity of patient illness. The relative contribution of each to a particular resistance phenotype is unclear. P. aeruginosa is responsible for many hospital-acquired infections and may become resistant to carbapenems. In addition, carbapenemase-producing K. pneumoniae poses a newer threat to the future utility of the carbapenems. Purpose: To determine whether there is an association between the volume and composition of antibiotic use, geography, severity of illness, and rates of carbapenem-resistant P. aeruginosa and K. pneumoniae. Methods: This is a retrospective ecological longitudinal investigation within the University HealthSystem Consortium affiliated academic medical centers. Antibiotic use data between January 1, 2006 and December 31, 2009 were obtained from billing records and reported as days of therapy per 1000 patient days (DOT/1000 PD), along with hospital characteristics (e.g. geographical location, bed size, case mix index). “Whole house” antibiograms were obtained to determine rates and proportions of carbapenem-resistant P. aeruginosa (CR-PA) and carbapenem-resistant K. pneumoniae (CR-KP). CR-KP isolation was also treated as a binary outcome. Generalized estimating equations (GEE) were used to model CR-KP and CR-PA. Results: Within the 40 hospitals over 2006-2009, CR-KP rates (per 1000 PDs) increased from 0.07 in 2006 to 0.15 in 2009 (P = 0.0118), and CR-KP proportions increased from 1.3% in 2006 to 3.1% in 2009 (P = 0.0003). However, CR-PA rates and proportions were stable over the same period. Geographical location, carbapenem use, and antipseudomonal penicillin use were significantly associated with CR-KP isolation. 
For every ten DOT/1000 PD increase in carbapenem use, the odds of CR-KP isolation increased by 42% (P = 0.0149). In contrast, for every ten DOT/1000 PD increase in antipseudomonal penicillin use, the odds of CR-KP isolation decreased by 14%. However, no significant model was found to explain CR-PA rates and proportions. Conclusion: Carbapenems, antipseudomonal penicillins, and geographical location were identified as risk factors associated with CR-KP isolation. These findings emphasize the challenges associated with the treatment of multidrug-resistant gram-negative bacteria.
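The reported effect sizes follow directly from the fitted coefficients of a logistic-link GEE: a log-odds coefficient per one DOT/1000 PD exponentiates to an odds ratio per ten-unit increase. A minimal sketch, with coefficients back-derived from the reported odds ratios purely for illustration:

```python
import math

def odds_ratio_per_increment(beta_per_unit, increment=10):
    """Translate a logistic/GEE coefficient (log-odds per one
    DOT/1000 PD) into the odds ratio for a ten-unit increase."""
    return math.exp(beta_per_unit * increment)

# Hypothetical coefficients chosen to reproduce the reported effects.
beta_carbapenem = math.log(1.42) / 10      # +42% odds per 10 DOT/1000 PD
beta_antipseudo = math.log(0.86) / 10      # -14% odds per 10 DOT/1000 PD

print(round(odds_ratio_per_increment(beta_carbapenem), 2))   # 1.42
print(round(odds_ratio_per_increment(beta_antipseudo), 2))   # 0.86
```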
|
27 |
Modelování závislostí v rezervování škod / Modeling Dependencies in Claims Reserving. Kaderjáková, Zuzana. January 2014 (has links)
Generalized linear models (GLM) have lately received a lot of attention in modelling insurance data. However, violation of the assumption of independence in the underlying data set often causes problems and misinterpretation of the results. The need for more flexible instruments has been voiced, and consequently various proposals have been made. This thesis deals with GLM-based techniques for handling correlated data sets, using generalized linear mixed models (GLMM) and generalized estimating equations (GEE). The main aim of this thesis is to provide a solid statistical background and to perform a practical application demonstrating and comparing the features of the various models.
|
28 |
Consensus Segmentation for Positron Emission Tomography: Development and Applications in Radiation Therapy. McGurk, Ross. January 2013 (has links)
<p>The use of positron emission tomography (PET) in radiation therapy has continued to grow, especially since the development of combined computed tomography (CT) and PET imaging systems in the early 1990s. Today, the biggest use of PET-CT is in oncology, where a glucose analog radiotracer is rapidly incorporated into the metabolic pathways of a variety of cancers. Images representing the in-vivo distribution of this radiotracer are used for the staging, delineation, and assessment of treatment response of patients undergoing chemotherapy or radiation therapy. While PET offers the ability to provide functional information, its image quality is adversely affected by its lower spatial resolution. It also has unfavorable image noise characteristics due to radiation dose concerns and patient compliance. These factors result in PET images having less detail and a lower signal-to-noise ratio (SNR) than images produced by CT. This complicates the use of PET within many areas of radiation oncology, particularly the delineation of targets for radiation therapy and the assessment of patient response to therapy. The development of segmentation methods that can provide accurate object identification in PET images under a variety of imaging conditions has been a goal of the imaging community for years. 
The goals of this thesis are to: (1) investigate the effect of filtering on segmentation methods; (2) investigate whether combining individual segmentation methods can improve segmentation accuracy; (3) investigate whether consensus volumes can aid physicians of different experience levels in defining gross tumor volumes (GTV) for head-and-neck cancer patients; and (4) investigate whether consensus volumes can be useful in assessing early treatment response in head-and-neck cancer patients.</p><p>For this dissertation work, standard spherical objects with volumes ranging from 1.15 cc to 37 cc, and two irregularly shaped objects of volume 16 cc and 32 cc formed by deforming high-density plastic bottles, were placed in a standardized image quality phantom and imaged at two contrasts (4:1 or 8:1 for spheres, and 4.5:1 and 9:1 for the irregular objects) and three scan durations (1, 2, and 5 minutes). For the comparison of image filters, Gaussian and bilateral filters matched to produce similar signal-to-noise ratios (SNR) in background regions were applied to raw unfiltered images. Objects were segmented using thresholding at 40% of the maximum intensity within a region-of-interest (ROI), an adaptive thresholding method that accounts for the signal of the object as well as the background, k-means clustering, and a seeded region-growing method adapted from the literature. Quality of the segmentations was assessed using the Dice Similarity Coefficient (DSC) and the symmetric mean absolute surface distance (SMASD). Further, models describing how DSC varies with object size, contrast, scan duration, filter choice, and segmentation method were fitted using generalized estimating equations (GEEs) and, for comparison, standard regression. GEEs account for the bounded, correlated, and heteroscedastic nature of the DSC metric. Our analysis revealed that object size had the largest effect on DSC for spheres, followed by contrast and scan duration. 
In addition, compared to filtering images with a 5 mm full-width at half maximum (FWHM) Gaussian filter, a 7 mm bilateral filter with moderate pre-smoothing (3 mm Gaussian (G3B7)) produced significant improvements in 3 of the 4 segmentation methods for spheres. For the irregular objects, scan duration had the biggest effect on DSC values, followed by contrast. </p><p>For the study applying consensus methods to PET segmentation, an additional gradient-based method was added to the collection of individual segmentation methods used in the filtering study. Objects in images acquired with 5-minute scan durations were filtered with a 5 mm FWHM Gaussian before being segmented by all individual methods. Two approaches to creating a volume reflecting the agreement between the individual methods were investigated: first, a simple majority voting scheme (MJV), in which individual voxels segmented by three or more of the individual methods are included in the consensus volume; and second, the Simultaneous Truth and Performance Level Estimation (STAPLE) method, a maximum likelihood methodology previously presented in the literature but never applied to PET segmentation. Improvements in accuracy matching or exceeding the best-performing individual method were observed, and importantly, both consensus methods provided robustness against poorly performing individual methods. In fact, the distributions of DSC and SMASD values for the MJV and STAPLE closely match the distribution that would result if the best individual method were selected for every object (the best individual method varies by object). Given that the best individual method depends on object type, size, contrast, and image noise, and cannot be known before segmentation, consensus methods offer a marked improvement over the current standard of using just one of the individual segmentation methods used in this dissertation. 
</p><p>To explore the potential application of consensus volumes to radiation therapy, the MJV consensus method was used to produce GTVs in a population of head-and-neck cancer patients. This GTV, and one created using simple 40% thresholding, were then available as guidance volumes for an attending head-and-neck radiation oncologist and a resident who had completed their head-and-neck rotation. The task for each physician was to manually delineate GTVs using the CT and PET images. Each patient was contoured three times by each physician: without guidance, and with guidance from either the MJV consensus volume or 40% thresholding. Differences in GTV volumes between physicians were not significant, nor were differences between the GTV volumes regardless of which guidance volume was available to the physicians. However, on average, 15-20% of the provided guidance volume lay outside the final physician-defined contour.</p><p>In the final study, the MJV and STAPLE consensus volumes were used to extract maximum, peak, and mean SUV measurements from two baseline PET scans and one PET scan taken during the patients' prescribed radiation therapy. Mean SUV values derived from consensus volumes showed smaller variability than maximum SUV values. Baseline and intratreatment variability was assessed using a Bland-Altman analysis, which showed that baseline variability in SUV was lower than intratreatment changes in SUV.</p><p>The techniques developed and reported in this thesis demonstrate how filter choice affects segmentation accuracy, how GEEs more appropriately account for the properties of a common segmentation quality metric, and how consensus volumes not only provide accuracy on par with the single best-performing individual method for a given activity distribution, but also exhibit robustness against the variable performance of the individual segmentation methods that make up the consensus volume. 
These properties make the use of consensus volumes appealing for a variety of tasks in radiation oncology.</p> / Dissertation
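A minimal sketch of the DSC metric and the majority-voting consensus idea described above, on toy 2-D masks (real use would involve 3-D voxel grids, and the five "methods" here are fabricated stand-ins for actual segmentation algorithms):

```python
import numpy as np

def dice(a, b):
    """Dice Similarity Coefficient between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

def majority_vote(masks):
    """Consensus mask: keep voxels segmented by a strict majority
    of the individual methods (3 or more out of 5)."""
    stack = np.stack([m.astype(int) for m in masks])
    return stack.sum(axis=0) >= (len(masks) // 2 + 1)

# Toy ground truth plus five imperfect 'segmentations'.
truth = np.zeros((8, 8), bool); truth[2:6, 2:6] = True
m1 = truth.copy()
m2 = truth.copy(); m2[1, 2:6] = True              # slight over-segmentation
m3 = truth.copy(); m3[5, :] = False               # slight under-segmentation
m4 = np.zeros_like(truth); m4[3:5, 3:5] = True    # poorly performing method
m5 = truth.copy()

consensus = majority_vote([m1, m2, m3, m4, m5])
print(round(dice(consensus, truth), 3))           # 1.0 on this toy example
```

Note how the consensus recovers the truth even though one contributing method (`m4`) performs badly, which is the robustness property the dissertation reports.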
|
29 |
Robust Methods for Interval-Censored Life History Data. Tolusso, David. January 2008 (links)
Interval censoring arises frequently in life history data, as individuals are
often only observed at a sequence of assessment times. This leads to a
situation where we do not know when an event of interest occurs, only that it
occurred somewhere between two assessment times. Here, the focus will be on
methods of estimation for recurrent event data, current status data, and
multistate data, subject to interval censoring.
With recurrent event data, the focus is often on estimating the rate and mean
functions. Nonparametric estimates are readily available, but are not smooth.
Methods based on local likelihood and the assumption of a Poisson process are
developed to obtain smooth estimates of the rate and mean functions without
specifying a parametric form. Covariates and extra-Poisson variation are
accommodated by using a pseudo-profile local likelihood. The methods are
assessed by simulations and applied to a number of datasets, including data
from a psoriatic arthritis clinic.
Current status data is an extreme form of interval censoring that occurs when
each individual is observed at only one assessment time. If current status
data arise in clusters, this must be taken into account in order to obtain
valid conclusions. Copulas offer a convenient framework for modelling the
association separately from the margins. Estimating equations are developed
for estimating marginal parameters as well as association parameters.
Efficiency and robustness to the choice of copula are examined for first and
second order estimating equations. The methods are applied to data from an
orthopedic surgery study as well as data on joint damage in psoriatic
arthritis.
Multistate models can be used to characterize the progression of a disease as
individuals move through different states. Considerable attention is given
to a three-state model to characterize the development of a back condition
known as spondylitis in psoriatic arthritis, along with the associated
risk of mortality. Robust estimates of the state occupancy probabilities are
derived based on a difference in distribution functions of the entry times.
A five-state model which differentiates between left-side and right-side
spondylitis is also considered, which allows us to characterize what effect
spondylitis on one side of the body has on the development of
spondylitis on the other side. Covariate effects are considered through
multiplicative time homogeneous Markov models. The robust state occupancy
probabilities are also applied to data on CMV infection in patients with HIV.
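The robust state occupancy estimator described above, a difference of the entry-time distribution functions for a progressive three-state model, can be sketched with empirical CDFs on hypothetical entry and exit times. In practice the distribution functions would themselves come from interval-censored estimators rather than exactly observed times:

```python
import numpy as np

def ecdf(times, t):
    """Empirical distribution function evaluated at time t."""
    times = np.asarray(times, float)
    return np.mean(times <= t)

def occupancy_middle_state(entry_times, exit_times, t):
    """Occupancy probability of the intermediate state of a progressive
    three-state model at time t, estimated as the difference of the
    entry-time distribution functions:
        P(in state 2 at t) = F_entry(t) - F_exit(t)."""
    return ecdf(entry_times, t) - ecdf(exit_times, t)

# Hypothetical entry times into state 2 (e.g. spondylitis onset) and
# state 3 (death); np.inf marks subjects who never made the transition.
entry = np.array([2.0, 3.5, 5.0, 7.5, np.inf])
exit_ = np.array([6.0, 9.0, np.inf, np.inf, np.inf])
print(occupancy_middle_state(entry, exit_, t=5.0))   # 0.6 - 0.0 = 0.6
```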
|