71

Statistical Methods for Big Data and Their Applications in Biomedical Research

Unknown Date (has links)
Big data has brought both opportunities and challenges to our research community. Complex models can be built with volumes of data that researchers have never had access to before. In this study we explore the structure learning of Bayesian networks (BNs) and its application to the reverse engineering of gene regulatory networks (GRNs). A Bayesian network is a graphical representation of a joint distribution that encodes the conditional dependencies and independencies among the variables. We propose a novel three-stage BN structure learning method, called GRASP (GRowth-based Approach with Staged Pruning). In the first stage, a new skeleton (undirected edge) discovery method, double filtering (DF), is designed. Compared to existing methods, DF requires smaller sample sizes to achieve similar statistical power. In the second stage, based on the skeleton estimated in the first stage, we use a sequential Monte Carlo (SMC) method to sample the edges and their directions so as to optimize a BIC-based score. The SMC method is less prone to being trapped in local optima, and its computation is easily parallelizable. In the third stage, we reclaim edges that may have been missed in the previous stages. We obtained satisfactory results in a simulation study and applied the method to infer GRNs from real experimental data. A method for personalized chemotherapy regimen selection for breast cancer and a novel algorithm for relationship extraction from unstructured documents are discussed as well. / A Dissertation submitted to the Department of Statistics in partial fulfillment of the requirements for the degree of Doctor of Philosophy. / Spring Semester 2016. / March 22, 2016. / Bayesian network structure learning, Neoadjuvant chemotherapy, Protein-protein interaction, Sequential Monte Carlo / Includes bibliographical references. / Jinfeng Zhang, Professor Directing Dissertation; Qing-Xiang Amy Sang, University Representative; Adrian Barbu, Committee Member; Yiyuan She, Committee Member; Debajyoti Sinha, Committee Member.
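The scoring criterion optimized in the second stage is a decomposable BIC-type network score over the nodes' conditional probability tables. Below is a minimal sketch of such a score for discrete data; the function name and data representation are our own illustration, not the GRASP implementation.

```python
import numpy as np
from itertools import product

def bic_score(data, parents):
    """BIC of a discrete Bayesian network (a generic sketch).
    data    : (n_samples, n_vars) integer array of category codes
    parents : dict mapping each node index to a list of parent indices
    """
    n = data.shape[0]
    score = 0.0
    for node, pa in parents.items():
        card = int(data[:, node].max()) + 1
        pa_cards = [int(data[:, p].max()) + 1 for p in pa]
        # maximized log-likelihood, one parent configuration at a time
        for config in product(*(range(c) for c in pa_cards)):
            mask = np.ones(n, dtype=bool)
            for p, v in zip(pa, config):
                mask &= data[:, p] == v
            counts = np.bincount(data[mask, node], minlength=card)
            total = counts.sum()
            if total > 0:
                nz = counts[counts > 0]
                score += float(np.sum(nz * np.log(nz / total)))
        # BIC penalty: free parameters in this node's conditional table
        score -= 0.5 * np.log(n) * (card - 1) * int(np.prod(pa_cards))
    return score

# toy usage: score the graph X0 -> X1 on binary data
data = np.array([[0, 0], [0, 0], [1, 1], [1, 1], [1, 0]])
print(bic_score(data, {0: [], 1: [0]}))
```

Because the score decomposes node by node, a sampler that changes one edge at a time only needs to rescore the affected node, which is what makes SMC-style search over structures computationally practical.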
72

Survival Analysis Using Bayesian Joint Models

Unknown Date (has links)
In many clinical studies, each patient is at risk of recurrent events as well as a terminating event. In Chapter 2, we present a novel latent-class-based semiparametric joint model that offers a clinically meaningful and estimable association between the recurrence profile and the risk of termination. Unlike previous shared-frailty-based joint models, this model has a coherent interpretation of the covariate effects on all relevant functions and model quantities, whether conditional or unconditional on the events history. We offer a fully Bayesian method for estimation and prediction using a complete specification of the prior process of the baseline functions. When there is a lack of prior information about the baseline functions, we derive a practical and theoretically justifiable partial-likelihood-based semiparametric Bayesian approach. Our Markov chain Monte Carlo tools for both Bayesian methods are implementable via publicly available software. The practical advantages of our methods are illustrated via a simulation study and the analysis of a transplant study with recurrent non-fatal graft rejections (NFGR) and the terminating event of death due to total graft rejection. In Chapter 3, we are motivated by the important problem of estimating daily fine particulate matter (PM2.5) over the US. Tracking and estimating PM2.5 is important because PM2.5 has been shown to be directly related to mortality from lung disease, cardiovascular disease, and stroke. That is, high values of PM2.5 constitute a public health problem in the US, and precise estimates of PM2.5 are needed to aid public policy decisions. Thus, we propose a Bayesian hierarchical model for high-dimensional "multi-type" responses, by which we mean a collection of correlated responses that have different distributional assumptions (e.g., continuous skewed observations and count-valued observations). The Centers for Disease Control and Prevention (CDC) database provides counts of mortalities related to PM2.5 and daily averaged PM2.5, which are treated as responses in our analysis. Our model capitalizes on the shared conjugate structure between the Weibull (to model PM2.5), Poisson (to model disease mortalities), and multivariate log-gamma distributions, and uses dimension reduction to aid computation. Our model can also be used to improve the precision of estimates and to produce estimates for undisclosed or missing counties. We provide a simulation study to illustrate the performance of the model and give an in-depth analysis of the CDC dataset. / A Dissertation submitted to the Department of Statistics in partial fulfillment of the requirements for the degree of Doctor of Philosophy. / Spring Semester 2019. / April 19, 2019. / Bayesian analysis, Frailty, Joint model, Multi-type responses, Recurrent events, Spatial survival analysis / Includes bibliographical references. / Debajyoti Sinha, Professor Directing Dissertation; Chris Schatschneider, University Representative; Jonathan R. Bradley, Committee Member; Eric Chicken, Committee Member; Lifeng Lin, Committee Member.
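As a minimal illustration of the MCMC machinery this abstract relies on, the sketch below runs a toy random-walk Metropolis sampler for a right-censored Weibull likelihood (the distribution used for PM2.5 in Chapter 3). It assumes flat priors on the log-parameters and uses entirely synthetic data; it is not the dissertation's joint model, and all names are our own.

```python
import numpy as np

rng = np.random.default_rng(0)

def weibull_loglik(params, t, event):
    """Log-likelihood of right-censored Weibull data.
    params = (log shape, log scale); event = 1 if observed, 0 if censored."""
    k, lam = np.exp(params)
    log_h = np.log(k) - np.log(lam) + (k - 1) * (np.log(t) - np.log(lam))
    log_S = -(t / lam) ** k
    # events contribute log f = log h + log S; censored times contribute log S
    return np.sum(event * log_h + log_S)

def metropolis(t, event, n_iter=5000, step=0.05):
    x = np.zeros(2)                      # start at shape = scale = 1
    cur = weibull_loglik(x, t, event)
    draws = []
    for _ in range(n_iter):
        prop = x + step * rng.normal(size=2)
        new = weibull_loglik(prop, t, event)
        if np.log(rng.uniform()) < new - cur:   # flat prior on the log scale
            x, cur = prop, new
        draws.append(np.exp(x))
    return np.array(draws)

# toy data: unit-rate exponential times, administratively censored at t = 2
t = np.minimum(rng.exponential(1.0, 200), 2.0)
event = (t < 2.0).astype(float)
print(metropolis(t, event)[2500:].mean(axis=0))  # posterior means near (1, 1)
```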
73

Analysis Approaches for Wearable Device Data

Hilden, Patrick January 2021 (has links)
Wearable devices, which track a subject's activity (e.g., steps, calories, intensity) over time, have become a popular option for research studies that seek to better understand an individual's physical activity in the day-to-day setting. This thesis looks to address three common problems within the wearable device setting: how to address missing data and incomplete wear time, what to do when large outlying values are present, and how many observation days are required to reasonably estimate various activity metrics of interest. Given the dense nature of observations from such devices, functional data analysis (FDA) provides a natural framework for analysis, and we seek to address the first problem, related to missing data, by leveraging generalized functional principal components analysis (GFPCA). In addressing the second problem, related to outlying values, we leverage both FDA and the novel principal component pursuit (PCP) approach, which has seen limited application within the field, to separate an observed functional value into low-rank, sparse, and error component functions. Finally, using a rich longitudinal data set, we provide insight into the third problem, regarding what constitutes an appropriate study length, utilizing the framework of measurement reliability, which has often been applied in the activity data setting. Our results suggest that leveraging FDA methods can provide more accurate estimates of activity during periods of nonwear than current approaches, and that in the presence of large outliers more robust estimates of underlying activity and outlier presence can be obtained by combining FDA methods with those of PCP. Finally, within our longitudinal cohort we show that current guidelines regarding the number of days necessary to achieve reasonable measurement reliability are inaccurate, and often underestimate the true number of days required.
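Principal component pursuit, mentioned above, decomposes an observed matrix M into a low-rank part L and a sparse outlier part S by solving min ||L||_* + λ||S||_1 subject to L + S = M. A minimal numpy sketch of the standard inexact augmented-Lagrangian iteration follows; the parameter defaults follow common conventions in the robust PCA literature and are assumptions, not the thesis's settings.

```python
import numpy as np

def shrink(X, tau):
    """Elementwise soft-thresholding."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def pcp(M, lam=None, mu=None, tol=1e-7, max_iter=500):
    """Principal Component Pursuit via an inexact augmented Lagrangian:
    minimize ||L||_* + lam * ||S||_1  subject to  L + S = M."""
    m, n = M.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    mu = mu if mu is not None else m * n / (4.0 * np.abs(M).sum())
    L = np.zeros_like(M); S = np.zeros_like(M); Y = np.zeros_like(M)
    for _ in range(max_iter):
        # singular-value thresholding step for the low-rank part
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = U @ np.diag(shrink(sig, 1.0 / mu)) @ Vt
        # soft-thresholding step for the sparse (outlier) part
        S = shrink(M - L + Y / mu, lam / mu)
        Y += mu * (M - L - S)                       # dual variable update
        if np.linalg.norm(M - L - S) <= tol * np.linalg.norm(M):
            break
    return L, S
```

In the activity setting, each row of M would be one day's densely observed activity curve, so L captures the subject's smooth underlying activity profile while S isolates large, isolated spikes.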
74

Optimal Treatment Regimes for Personalized Medicine and Mobile Health

Oh, Eun Jeong January 2020 (has links)
There has been increasing development of personalized interventions that are tailored to the uniquely evolving health status of each patient over time. In this dissertation, we investigate two problems: (1) the construction of an individualized mobile health (mHealth) application recommender system; and (2) the estimation of optimal dynamic treatment regimes (DTRs) from a multi-stage clinical trial study. The dissertation is organized as follows. In Chapter 1, we provide a brief background on personalized medicine and two motivating examples which illustrate the need for and benefits of individualized treatment policies. We then introduce reinforcement learning and various methods for obtaining optimal DTRs, as well as the Q-learning procedure, a popular method in the DTR literature. In Chapter 2, we propose a partial regularization via orthogonality using the adaptive Lasso (PRO-aLasso) to estimate the optimal policy, which maximizes the expected utility in the mHealth setting. We also derive the convergence rate of the expected outcome of the estimated policy to that of the true optimal policy. The PRO-aLasso estimators are shown to enjoy the same oracle properties as the adaptive Lasso. Simulations and a real data application demonstrate that PRO-aLasso yields simpler, more stable policies with better results than the adaptive Lasso and other competing methods. In Chapter 3, we propose penalized A-learning with a Lasso-type penalty for the construction of an optimal DTR and derive generalization error bounds for the estimated DTR. We first examine the relationship between the value and the Q-functions, and then provide a finite-sample upper bound on the difference in values between the optimal DTR and the estimated DTR. In practice, we implement a multi-stage PRO-aLasso algorithm to obtain the optimal DTR. Simulation results show the advantages of the proposed methods over some existing alternatives. The proposed approach is also demonstrated with data from a depression clinical trial study. In Chapter 4, we present future work and concluding remarks.
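As a concrete illustration of Q-learning with an L1 penalty, the sketch below fits a single-stage Q-function with a plain Lasso and reads off the estimated treatment rule. The PRO-aLasso of Chapter 2 additionally uses adaptive weights and an orthogonality-based partial regularization, which are omitted here; all data and names are synthetic.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)

# toy single-stage data: covariates X, binary treatment A, outcome Y
n, p = 500, 10
X = rng.normal(size=(n, p))
A = rng.integers(0, 2, n)
# true treatment effect depends on X1, so the optimal rule is x1 < 0.5
Y = X[:, 0] + A * (1.0 - 2.0 * X[:, 1]) + rng.normal(size=n)

# Q-model features: main effects plus treatment-by-covariate interactions
features = np.column_stack([X, A[:, None], A[:, None] * X])
q_model = Lasso(alpha=0.05).fit(features, Y)

def q(X, a):
    """Predicted Q(x, a) under the fitted linear model."""
    a_col = np.full((len(X), 1), a, dtype=float)
    return q_model.predict(np.column_stack([X, a_col, a_col * X]))

# estimated regime: treat exactly when Q(x, 1) > Q(x, 0)
d_hat = (q(X, 1) > q(X, 0)).astype(int)
print("fraction recommended treatment:", d_hat.mean())
```

In a multi-stage version, the same fit is applied backwards from the final stage, with each stage's pseudo-outcome taken as the maximum of the next stage's predicted Q-values.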
75

Space-time clustering : finding the distribution of a correlation-type statistic.

Siemiatycki, Jack January 1971 (has links)
No description available.
76

Confidence intervals for inverse regression with applications to blood hormone analysis

David, Richard. January 1974 (has links)
No description available.
77

Face recognition with Eigenfaces : a detailed study.

Vawda, Nadeem. January 2012 (has links)
With human society becoming increasingly computerised, the use of biometrics to automatically establish the identity of an individual is of great interest in a wide variety of applications. Facial appearance is an appealing biometric, on account of its relatively non-intrusive nature. As such, automated face recognition systems have been the subject of much research in recent years. This dissertation describes the development of a fully automatic face recognition system, and provides an analysis of its performance under various different operating conditions, in comparison with results published in prior literature. In addition to giving a detailed description of the mathematical underpinnings of the techniques used by the system, we discuss the practical considerations involved in implementing the described techniques. The system presented here uses the eigenface approach to representing facial features. A number of different recognition techniques have been implemented and evaluated. These include a number of variants of the original eigenface technique proposed by Turk and Pentland, as well as a related technique based on the probabilistic approach of Moghaddam et al. Due to the wide range of datasets used to evaluate face recognition systems in the literature, it is difficult to reliably compare the performance of different systems. The system described here has been tested with datasets encompassing a wide range of different conditions, allowing us to draw conclusions about how the characteristics of the test data affect the results that are obtained. The performance of this system is comparable to other eigenface-based systems documented in the literature, achieving success rates in the region of 85% for large datasets under controlled conditions. However, performance was observed to degrade significantly when testing with more free-form images; in particular, the effects of ageing on facial appearance were noted to cause problems for the system. This suggests that the matter of ageing is still a fruitful direction for further research. / Thesis (M.Sc.)-University of KwaZulu-Natal, Westville, 2012.
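A minimal sketch of the Turk and Pentland eigenface pipeline described above: project mean-centred face images onto the top principal components, then classify a probe image by nearest neighbour in that subspace. This is a generic illustration of the technique, not the system implemented in the dissertation.

```python
import numpy as np

def train_eigenfaces(images, k=20):
    """images: (n_images, n_pixels) array of flattened grayscale faces."""
    mean = images.mean(axis=0)
    centred = images - mean
    # SVD of the centred data yields the principal components (eigenfaces)
    _, _, Vt = np.linalg.svd(centred, full_matrices=False)
    eigenfaces = Vt[:k]                    # top-k eigenfaces, one per row
    weights = centred @ eigenfaces.T       # training projections
    return mean, eigenfaces, weights

def recognise(probe, mean, eigenfaces, weights, labels):
    """Project the probe into eigenface space; return the nearest label."""
    w = (probe - mean) @ eigenfaces.T
    return labels[np.argmin(np.linalg.norm(weights - w, axis=1))]
```

Variants of the basic technique differ mainly in the distance used in the projected space (Euclidean, Mahalanobis) and in how many components are retained or discarded.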
78

Proportional odds model for survival data

梁翠蓮, Leung, Tsui-lin. January 1999 (has links)
Published or final version / Statistics / Master of Philosophy
79

Ocular biometric correlates of early-and late-onset myopia

Harper, Justine January 2001 (has links)
Myopia is a refractive condition that develops because either the optical power of the eye is abnormally great or the eye is abnormally long, the optical consequence being that the focal length of the eye is too short for the physical length of the eye. The increase in axial length has been shown to match closely the dioptric error of the eye, in that a 1 mm increase in axial length usually generates 2 to 3 D of myopia. The most common form of myopia is early-onset myopia (EOM), which occurs between 6 and 14 years of age. The second most common form is late-onset myopia (LOM), which emerges in the late teens or early twenties, at a time when the eye should have ceased growing. The prevalence of LOM is increasing, and research has indicated a link with excessive and sustained nearwork. The aim of this thesis was to examine the ocular biometric correlates associated with LOM and EOM development and progression. Biometric data were recorded on 50 subjects, aged 16 to 26 years. The group was divided into 26 emmetropic subjects and 24 myopic subjects. Keratometry, corneal topography, ultrasonography, lens shape, central and peripheral refractive error, ocular blood flow and assessment of accommodation were measured on three occasions during an 18-month to 2-year longitudinal study. Retinal contours were derived using a specially developed computer program. The thesis shows that myopia progression is related to an increase in vitreous chamber depth, a finding which supports previous work. The myopes exhibited hyperopic relative peripheral refractive error (PRE) and the emmetropes exhibited myopic relative PRE. Myopes demonstrated a prolate retinal shape, and the retina became more prolate with myopia progression. The results show that a longitudinal, rather than equatorial, increase in the posterior segment is the principal structural correlate of myopia. Retinal shape, relative PRE and the ratio of axial length to corneal curvature are indicated, in this thesis, as predictive factors for myopia onset and development. Data from this thesis demonstrate that myopia progression in the LOM group is the result of an increase in anterior segment power, owing to an increase in lens thickness, in conjunction with posterior segment elongation. Myopia progression in the EOM group is the product of a long posterior segment, which over-compensates for a weak anterior segment power. The weak anterior segment power in the EOM group is related to a combination of crystalline lens thinning and surface flattening. The results presented in this thesis confirm that posterior segment elongation is the main structural correlate in both EOM and LOM progression. The techniques and computer programs employed in the thesis are reproducible and robust, providing a valuable framework for further myopia research and assessment of predictive factors.
80

Investigating and comparing multimodal biometric techniques

19 May 2009 (has links)
M.Sc. / Determining the identity of a person has become vital in today's world. Emphasis on security has become increasingly common over the last few decades, not only in Information Technology, but across all industries. One of the main principles of security is that a system be accessed only by legitimate users. According to the ISO 7498/2 document [1] (an international standard which defines an information security system architecture), there are five pillars of information security: Identification/Authentication, Confidentiality, Authorization, Integrity and Non-Repudiation. The very first line of security in a system is identifying and authenticating a user. This ensures that the user is who he or she claims to be, and allows only authorized individuals to access the system. Technologies have been developed that can automatically recognize a person by his or her unique physical features. This technology, referred to as 'biometrics', allows us to quickly, securely and conveniently identify an individual. Biometric solutions have already been deployed worldwide, and biometrics is rapidly becoming an acceptable method of identification in the eyes of the public. As useful and advanced as unimodal (single biometric sample) biometric technologies are, they have their limits. Some are not completely accurate; others are less secure and can be easily bypassed. It has recently been reported to the Congress of the U.S.A. [2] that about 2 percent of that country's population do not have a clear enough fingerprint for biometric use, and therefore cannot use their fingerprints for enrollment or verification. The same report recommends using a biometric system with dual (multimodal) biometric inputs, especially for large-scale systems such as airports. In this dissertation we investigate and compare multimodal biometric techniques, in order to determine how much of an advantage this technology offers over its unimodal equivalent.
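One common way to combine unimodal matchers, in the spirit of the multimodal systems compared here, is score-level fusion: normalise each matcher's scores to a common range and take a weighted sum. The sketch below is a generic illustration with made-up scores, not a method taken from the dissertation.

```python
import numpy as np

def min_max(scores):
    """Min-max normalisation so scores from different matchers are comparable."""
    s = np.asarray(scores, dtype=float)
    return (s - s.min()) / (s.max() - s.min())

def fuse(fingerprint_scores, face_scores, w=0.5):
    """Weighted-sum score-level fusion of two unimodal matchers."""
    return w * min_max(fingerprint_scores) + (1 - w) * min_max(face_scores)

# toy example: the genuine candidate should outscore impostors after fusion
fp = [0.80, 0.30, 0.40]    # matcher 1 scores for 3 enrolled candidates
fc = [0.70, 0.60, 0.20]    # matcher 2 scores for the same candidates
print(np.argmax(fuse(fp, fc)))  # -> 0, the genuine candidate
```

Fusion can also be performed at the feature or decision level; score-level fusion is popular because it needs no access to the matchers' internals, only their output scores.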
