521 |
Theory of dipole interaction in crystals / January 1946 (has links)
[by] J.M. Luttinger and L. Tisza. / "Reprinted from the Physical review, vol. 70, nos. 11 and 12, 954-964, Dec. 1 and 15, 1946." / Includes bibliographical references. / Contract OEMsr-262.
|
522 |
Precipitation estimation in mountainous terrain using multivariate geostatistics / Hevesi, Joseph A., 22 May 1990 (has links)
Estimates of average annual precipitation (AAP) are needed for hydrologic modeling at
Yucca Mtn., Nevada, site of a proposed, high-level nuclear waste repository. Historical
precipitation data and station elevation were obtained for stations in southern Nevada and
southeastern California. Elevations for 1,531 additional locations were obtained from
topographic maps. The sample direct-variogram for the transformed variable TAAP =
ln(AAP) * 1000 was fit with an isotropic, spherical model with a small nugget and a range
of 190,000 ft. The sample direct-variogram for elevation was fit with an isotropic model
with four nested structures (nugget, Gaussian, spherical, and linear) with ranges between 0
and 270,000 ft. There was a significant (p = 0.05, r = 0.75) linear correlation between
TAAP and station elevation. The sample cross-variogram for TAAP and elevation was fit
with two nested structures (Gaussian, spherical) with ranges from 55,000 to 355,000 ft.
Alternate model structures and parameters were compared using cross-validation.
Isohyetal maps for average annual precipitation (AAP) were prepared from
estimates obtained by kriging and cokriging using the selected models. Isohyets based on
the kriging estimates were very smooth, increasing gradually from the southwest to the
northeast. Isohyets based on the cokriging estimates and the spatial correlation between
AAP and elevation were more irregular and displayed known orographic effects. Indirect
confirmation of the cokriging estimates was obtained by comparing isohyets prepared with
the cokriging estimates to the boundaries of more densely vegetated and/or forested zones.
Estimates for AAP at the repository site were 145 and 165 mm for kriging and cokriging,
respectively. Cokriging reduced estimation variances at the repository site by 55% relative
to kriging. The effectiveness of an existing network of stations for measuring AAP is
evaluated and recommendations are made for optimal locations for additional stations. / Graduation date: 1991
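As a rough illustration of the estimation step described above, the following Python sketch solves the ordinary kriging system at a single target location under an isotropic spherical variogram. The nugget, sill, and range values in the example are placeholders, not the fitted parameters from the study, and cokriging with elevation would additionally require the cross-variogram model.

```python
import numpy as np

def spherical_variogram(h, nugget, sill, rng):
    """Spherical variogram model gamma(h) for lag distance h."""
    h = np.asarray(h, dtype=float)
    g = np.where(
        h < rng,
        nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3),
        sill,
    )
    return np.where(h == 0.0, 0.0, g)

def ordinary_kriging(coords, values, target, nugget, sill, rng):
    """Ordinary kriging estimate at a single target location."""
    n = len(values)
    # Pairwise distances between sample points.
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    # Kriging system with a Lagrange multiplier for the unbiasedness constraint.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical_variogram(d, nugget, sill, rng)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = spherical_variogram(np.linalg.norm(coords - target, axis=1),
                                nugget, sill, rng)
    w = np.linalg.solve(A, b)
    return float(w[:n] @ values)
```

Because the variogram is zero at lag zero, the estimator interpolates exactly at the sample locations, which is a quick sanity check on the system setup.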
|
523 |
Classification models for disease diagnosis and outcome analysis / Wu, Tsung-Lin, 12 July 2011 (has links)
In this dissertation we study the feature selection and classification problems and apply our methods to real-world medical and biological data sets for disease diagnosis.
Classification is an important problem in disease diagnosis, where the goal is to distinguish patients from the normal population. DAMIP (discriminant analysis -- mixed integer program) was shown to be a good classification model: it can directly handle multigroup problems, enforce misclassification limits, and provide a reserved-judgment region. However, DAMIP is NP-hard and presents computational challenges. Feature selection is important in classification to improve prediction performance, prevent over-fitting, and facilitate data understanding. However, this combinatorial problem becomes intractable when the number of features is large.
In this dissertation, we propose a modified particle swarm optimization (PSO), a heuristic method, to solve the feature selection problem, and we study its parameter selection in our applications. We derive theories and exact algorithms to solve the two-group DAMIP in polynomial time. We also propose a heuristic algorithm to solve the multigroup DAMIP. Computational studies on simulated data and data from UCI machine learning repository show that the proposed algorithm performs very well. The polynomial solution time of the heuristic method allows us to solve DAMIP repeatedly within the feature selection procedure.
We apply the PSO/DAMIP classification framework to several real-life medical and biological prediction problems. (1) Alzheimer's disease: We use data from several neuropsychological tests to discriminate subjects with Alzheimer's disease, subjects with mild cognitive impairment, and control groups. (2) Cardiovascular disease: We use traditional risk factors and novel oxidative stress biomarkers to predict subjects at high or low risk of cardiovascular disease, where risk is measured by carotid intima-media thickness and/or flow-mediated vasodilation. (3) Sulfur amino acid (SAA) intake: We use 1H NMR spectral data of human plasma to classify plasma samples obtained under low or high SAA intake, showing that our method is useful for metabolomics studies. (4) CpG islands for lung cancer: We identify a large number of sequence patterns (on the order of millions), search for candidate patterns in DNA sequences from CpG islands, and look for patterns that discriminate methylation-prone from methylation-resistant (and, additionally, methylation-sporadic) sequences, which are relevant to early lung cancer prediction.
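A minimal sketch of the binary-PSO feature selection loop is given below, assuming a boolean particle encoding with a sigmoid velocity transfer. The fitness function here is a simple nearest-centroid classifier standing in for the DAMIP solver, which is not reproduced; all parameter values are illustrative defaults rather than the tuned settings studied in the thesis.

```python
import numpy as np

def centroid_accuracy(X, y, mask):
    """Fitness of a feature subset: training accuracy of a nearest-centroid
    classifier restricted to the selected features (a deliberately simple
    stand-in for the DAMIP classification model)."""
    if not mask.any():
        return 0.0
    Xs = X[:, mask]
    classes = np.unique(y)
    centroids = np.array([Xs[y == c].mean(axis=0) for c in classes])
    dist = np.linalg.norm(Xs[:, None, :] - centroids[None, :, :], axis=2)
    return float((classes[dist.argmin(axis=1)] == y).mean())

def binary_pso(X, y, n_particles=20, n_iter=30, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Binary PSO over feature masks with a sigmoid velocity transfer."""
    rng = np.random.default_rng(seed)
    n_feat = X.shape[1]
    pos = rng.random((n_particles, n_feat)) < 0.5          # boolean masks
    vel = rng.normal(0.0, 0.1, (n_particles, n_feat))
    pbest = pos.copy()
    pbest_fit = np.array([centroid_accuracy(X, y, p) for p in pos])
    g = pbest_fit.argmax()
    gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, n_feat))
        # Velocity update pulls each particle toward its own and the global best.
        vel = (w * vel
               + c1 * r1 * (pbest.astype(float) - pos.astype(float))
               + c2 * r2 * (gbest.astype(float) - pos.astype(float)))
        # Sigmoid transfer: larger velocity -> higher chance the bit is set.
        pos = rng.random((n_particles, n_feat)) < 1.0 / (1.0 + np.exp(-vel))
        fit = np.array([centroid_accuracy(X, y, p) for p in pos])
        better = fit > pbest_fit
        pbest[better] = pos[better]
        pbest_fit[better] = fit[better]
        g = pbest_fit.argmax()
        if pbest_fit[g] > gbest_fit:
            gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]
    return gbest, gbest_fit
```

On synthetic data where one feature carries the class signal and the rest are noise, the loop reliably keeps the informative feature in the best mask.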
|
524 |
The asymptotic rate of the length of the longest significant chain with good continuation in Bernoulli net and its applications in filamentary detection / Ni, Kai, 08 April 2013 (has links)
This thesis is devoted to the detectability of an inhomogeneous region possibly embedded in a noisy environment. It presents models and algorithms based on the theory of the longest significant run and on percolation, and analyzes computational results from simulation. We consider the length of the longest chain of significant nodes with good continuation in a square lattice of independent nodes. Inspired by percolation theory, we first analyze the problem in a tree-based model. We give the critical probability and find the decay rate of the probability of having a significant run of length k starting at the origin. We find that the asymptotic rate of the length of the significant run can be powerfully applied in image detection; examples include the detection of filamentary structures in a background of uniform random points and target tracking problems. We set the threshold for the rejection region in these problems so that the false positives diminish quickly as we obtain more samples.
Inspired by convex set detection, we also give a fast and near-optimal algorithm to detect a possibly inhomogeneous chain with good continuation in an image of pixels with white noise. We analyze the length of the longest significant chain after thresholding each pixel and consider the statistics over all significant chains. Such a strategy significantly reduces the complexity of the algorithm. The false positives are eliminated as the number of pixels increases. This extends existing methods in the literature for detecting inhomogeneous line segments.
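The run-length logic above can be sketched in the simplest one-dimensional Bernoulli case. By the Erdős–Rényi law of large numbers the longest run of successes in n Bernoulli(p) trials concentrates near log_{1/p}(n), so a threshold of that rate plus a small slack drives false positives to zero as n grows. The lattice setting of the thesis, where chains must also satisfy a good-continuation constraint, is more involved; the slack value here is an arbitrary illustration.

```python
import math
import random

def longest_run(bits):
    """Length of the longest run of 1s (significant nodes) in a 0/1 sequence."""
    best = cur = 0
    for b in bits:
        cur = cur + 1 if b else 0
        best = max(best, cur)
    return best

def detection_threshold(n, p, slack=3.0):
    """Reject-region threshold on run length: the Erdos-Renyi rate
    log_{1/p}(n) plus a slack term, so that a pure-noise sequence of
    length n rarely contains a run this long."""
    return math.log(n, 1.0 / p) + slack
```

In a simulated noise-only sequence, the observed longest run stays below the threshold, which is exactly the false-positive control the rejection region is designed for.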
|
525 |
A Java Toolbox For Wavelet Based Image Denoising / Tuncer, Guney, 01 August 2005 (has links) (PDF)
Wavelet methods for image denoising have become widespread over the last decade. The effectiveness of this denoising scheme is influenced by many factors, chiefly the choice of wavelet, the threshold determination, and the selection of the transform level for thresholding. For threshold calculation, one classical solution is the Wiener filter, a linear estimator; another is VisuShrink, which applies a global threshold in the nonlinear setting. The purpose of this work is to develop a Java toolbox for finding the best denoising schemes for distinct image types, particularly Synthetic Aperture Radar (SAR) images. This is accomplished by comparing these basic methods with well-known data-adaptive thresholding methods such as SureShrink, BayesShrink, Generalized Cross Validation, and Hypothesis Testing. Some nonwavelet denoising processes are also introduced: along with simple mean and median filters, the more statistically adaptive median, Lee, Kuan, and Frost filtering techniques are tested to complement the wavelet-based denoising scheme. All of these wavelet-based methods and some traditional methods are implemented in pure Java code using the plug-in concept of ImageJ, a popular image processing tool written in Java.
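The universal-threshold idea behind VisuShrink can be sketched with a hand-rolled one-level Haar transform. The toolbox itself is in Java and supports many wavelets, transform levels, and threshold rules; this Python fragment shows only the soft-thresholding step, with the noise level estimated from the median absolute deviation of the detail coefficients.

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar wavelet transform."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of the one-level Haar transform."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2)
    x[1::2] = (approx - detail) / np.sqrt(2)
    return x

def visushrink_denoise(signal):
    """VisuShrink-style denoising: soft-threshold the detail coefficients
    at the universal threshold sigma * sqrt(2 log n), with sigma estimated
    by the median absolute deviation of the finest-scale details."""
    approx, detail = haar_dwt(signal)
    sigma = np.median(np.abs(detail)) / 0.6745
    t = sigma * np.sqrt(2.0 * np.log(len(signal)))
    detail = np.sign(detail) * np.maximum(np.abs(detail) - t, 0.0)
    return haar_idwt(approx, detail)
```

For a piecewise-constant signal in Gaussian noise, thresholding the detail band removes roughly half the noise energy even at a single decomposition level, so the denoised signal is measurably closer to the truth.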
|
526 |
Efficient change detection methods for bio and healthcare surveillance / Han, Sung Won, 14 June 2010 (has links)
For the last several decades, sequential change point problems have been studied in both the theoretical area (sequential analysis) and the application area (industrial SPC). In the conventional application, the baseline process is assumed to be stationary, and the shift pattern is a step function that is sustained after the shift. However, in biosurveillance, the underlying assumptions of problems are more complicated. This thesis investigates several issues in biosurveillance such as non-homogeneous populations, spatiotemporal surveillance methods, and correlated structures in regional data.
The first part of the thesis discusses popular surveillance methods for sequential change point problems and off-line problems based on count data. For sequential change point problems, the CUSUM and the EWMA have been used in healthcare and public health surveillance to detect increases in the rates of diseases or symptoms. For off-line problems, scan statistics are widely used. In this chapter, we link the methods for off-line problems to those for sequential change point problems. We investigate three methods--the CUSUM, the EWMA, and scan statistics--and compare them by conditional expected delay (CED).
The second part of the thesis pertains to the on-line monitoring problem of detecting a change in the mean of Poisson count data with a non-homogeneous population size. The most common detection schemes are based on generalized likelihood ratio statistics, known to be optimal under Lorden's criterion. We propose alternative detection schemes based on weighted likelihood ratios and the adaptive threshold method, which perform better than generalized likelihood ratio statistics in an increasing population. The properties of these three detection schemes are investigated both theoretically and by numerical simulation.
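The baseline CUSUM for Poisson counts can be sketched as follows, assuming a known in-control rate lam0 and a target out-of-control rate lam1 with a constant population size. The statistic accumulates log-likelihood-ratio increments and signals at the first crossing of a control limit h; the thesis's weighted-likelihood and adaptive-threshold variants for non-homogeneous populations are not shown.

```python
import numpy as np

def poisson_cusum(counts, lam0, lam1, h):
    """One-sided CUSUM for a rate increase lam0 -> lam1 in Poisson counts.
    Each increment is the log-likelihood ratio of observing x under lam1
    versus lam0; the statistic is clipped at zero and an alarm is raised
    at the first time it exceeds h.  Returns the alarm time (1-based)
    or None if no alarm occurs."""
    k = np.log(lam1 / lam0)       # score multiplying each observed count
    c = lam1 - lam0               # drift correction per time step
    s = 0.0
    for t, x in enumerate(counts, start=1):
        s = max(0.0, s + k * x - c)
        if s > h:
            return t
    return None
```

With an in-control rate of 2 and a shift to a higher rate at time 50, the alarm follows the shift within a couple of observations, while a run that stays in control never alarms.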
The third part of the thesis investigates spatiotemporal surveillance based on likelihood ratios. This chapter proposes a general framework for spatiotemporal surveillance based on likelihood ratio statistics over time windows. We show that the CUSUM and other popular likelihood ratio statistics are the special cases under such a general framework. We compare the efficiency of these surveillance methods in spatiotemporal cases for detecting clusters of incidence using both
Monte Carlo simulations and a real example.
The fourth part proposes multivariate surveillance methods based on likelihood ratio tests in the presence of spatial correlations. By taking advantage of spatial correlations, the proposed methods can outperform existing surveillance methods, providing faster and more accurate detection. We illustrate the application of these methods with a breast cancer case in New Hampshire in which observations are spatially correlated.
|
527 |
New control charts for monitoring univariate autocorrelated processes and high-dimensional profiles / Lee, Joongsup, 18 August 2011 (has links)
In this thesis, we first investigate the use of automated variance estimators in distribution-free statistical process control (SPC) charts for univariate autocorrelated processes. We introduce two variance estimators---the standardized time series overlapping area estimator and the so-called quick-and-dirty autoregressive estimator---that can be obtained from a training data set and used effectively with distribution-free SPC charts when those charts are applied to processes exhibiting nonnormal responses or correlation between successive responses. In particular, we incorporate the two estimators into DFTC-VE, a new
distribution-free tabular CUSUM chart developed for autocorrelated processes; and we compare its performance with other state-of-the-art distribution-free SPC charts. Using either of the two variance estimators, the DFTC-VE outperforms its competitors in terms of both in-control and
out-of-control average run lengths when all the competing procedures are tested on the same set of independently sampled realizations of selected
autocorrelated processes with normal or nonnormal noise components.
Next, we develop WDFTC, a wavelet-based distribution-free CUSUM chart for detecting shifts in the mean of a high-dimensional profile with noisy components that may exhibit nonnormality, variance heterogeneity, or correlation between profile components. A profile describes the relationship between a selected quality characteristic and an input (design) variable over the experimental region. Exploiting a discrete wavelet transform (DWT) of the mean in-control profile, WDFTC selects a reduced-dimension vector of the associated DWT components from which the mean in-control profile can be
approximated with minimal weighted relative reconstruction error. Based on randomly sampled Phase I (in-control) profiles, the covariance matrix of the corresponding reduced-dimension DWT vectors is estimated using a matrix-regularization method; then the DWT vectors are aggregated (batched) so that the nonoverlapping batch means of the reduced-dimension DWT vectors
have manageable covariances. To monitor shifts in the mean profile during Phase II operation, WDFTC computes a Hotelling's T-square--type statistic from successive nonoverlapping batch means and applies a CUSUM procedure to those statistics, where the associated control limits are evaluated
analytically from the Phase I data. We compare WDFTC with other state-of-the-art profile-monitoring charts using both normal and nonnormal
noise components having homogeneous or heterogeneous variances as well as independent or correlated components; and we show that WDFTC performs well, especially for local shifts of small to medium size, in terms of both in-control and out-of-control average run lengths.
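The batching step can be sketched as follows, assuming reduced-dimension DWT vectors are already available: nonoverlapping batch means are formed so that the batched values are approximately uncorrelated, and a Hotelling T²-type statistic is computed against Phase I mean and covariance estimates. The CUSUM applied on top of these statistics, and the DWT and matrix-regularization steps, are omitted.

```python
import numpy as np

def batch_means(x, batch_size):
    """Nonoverlapping batch means of a series: average consecutive blocks
    so that, for a large enough batch size, the batched values are roughly
    uncorrelated even when the raw series is autocorrelated.  Any trailing
    observations that do not fill a batch are dropped."""
    n = (len(x) // batch_size) * batch_size
    return np.asarray(x[:n], dtype=float).reshape(-1, batch_size).mean(axis=1)

def hotelling_t2(vectors, mean0, cov0):
    """Hotelling T^2-type statistic of each (batched) vector against the
    Phase I in-control mean and covariance estimates."""
    inv = np.linalg.inv(cov0)
    d = np.asarray(vectors, dtype=float) - mean0
    # Row-wise quadratic form d_i' * inv * d_i.
    return np.einsum('ij,jk,ik->i', d, inv, d)
```

With an identity in-control covariance, the statistic reduces to the squared distance from the in-control mean, which is a convenient sanity check.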
|
528 |
Pharmacokinetics of alcohol using breath measures and some statistical aspects in forensic science / Yang, Chi-ting (楊志婷), January 2011 (has links)
published_or_final_version / Statistics and Actuarial Science / Doctoral / Doctor of Philosophy
|
529 |
Implementation of process analysis in a three-dimensional air quality model / Vizuete, William Gustavo, 28 August 2008 (has links)
Not available / text
|
530 |
Designing and analyzing test programs with censored data for civil engineering applications / Finley, Cynthia, 28 August 2008 (has links)
Not available / text
|