41 |
Statistical Analysis for Tolerances of Noxious Weed Seeds
Dodge, Yadolah, 01 May 1971 (has links)
An analysis of the previous method for testing tolerances of noxious weed seeds was performed. Problems with the existing techniques were discussed, and solutions to these problems were given.
A new testing technique based on the sequential probability ratio test was developed, and the results were examined.
The sequential test was found useful enough to warrant its use in determining tolerances for noxious weed seeds.
This study showed that sequential tests have excellent potential and flexibility as a statistical tool for determining tolerances of noxious weed seeds. (75 pages)
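To make the method concrete, here is a minimal sketch of Wald's sequential probability ratio test for a binomial contamination rate, in the spirit of the tolerance-testing problem; the hypothesized rates, error levels, and function names are illustrative assumptions, not taken from the thesis.

```python
import math

def sprt_bernoulli(observations, p0, p1, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test for a Bernoulli rate.

    Seeds are examined one at a time (x = 1 if a seed is noxious) until the
    running log-likelihood ratio crosses a boundary: accept H0 (lot within
    tolerance p0) or accept H1 (lot at the objectionable rate p1).
    """
    lower = math.log(beta / (1 - alpha))   # accept-H0 boundary
    upper = math.log((1 - beta) / alpha)   # accept-H1 boundary
    llr = 0.0
    for n, x in enumerate(observations, start=1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr <= lower:
            return "within tolerance", n
        if llr >= upper:
            return "over tolerance", n
    return "continue sampling", len(observations)
```

Unlike a fixed-sample test, the decision can arrive early, which is the flexibility the abstract highlights.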
|
42 |
Evaluation of Multivariate Homogeneous ARMA Model
Tseng, Lucy Chienhua, 01 May 1980 (has links)
The purpose of this thesis is to study a restricted multivariate ARMA model, called the Homogeneous Model. This model is defined as one in which each univariate component of the multivariate model is of the same order in p and q as it is in the multivariate model.
From a mathematical perspective, a multivariate ARMA model is homogeneous if, and only if, its coefficient matrices are diagonal. From a physical perspective, the present observation of a phenomenon is modeled only by its own past observations and its present and past "errors."
The estimation procedures are developed based on the maximum likelihood method and on O'Connell's method for the univariate model.
The homogeneous model is evaluated on four types of data, generated to reflect different degrees of nonhomogeneity.
It is found that the homogeneous model is sensitive to departures from the homogeneity assumptions. Small departures cause no serious problems; large departures, however, do.
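As a hedged illustration of what homogeneity means, the sketch below simulates a bivariate AR(1) whose coefficient matrix is diagonal, so each component decouples into its own univariate model; the coefficient values are invented for the example and are not from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Diagonal Phi makes the bivariate AR(1) "homogeneous": each series depends
# only on its own past, so it can be fit as two univariate AR(1) models.
# Off-diagonal entries would break homogeneity.
Phi = np.diag([0.7, -0.4])   # illustrative diagonal coefficient matrix
n, k = 500, 2
y = np.zeros((n, k))
for t in range(1, n):
    y[t] = Phi @ y[t - 1] + rng.standard_normal(k)
# Only the innovations' cross-correlation can link the two components.
```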
|
43 |
DEVELOPMENTS IN NONPARAMETRIC REGRESSION METHODS WITH APPLICATION TO RAMAN SPECTROSCOPY ANALYSIS
Guo, Jing, 01 January 2015 (has links)
Raman spectroscopy has been successfully employed in the classification of breast pathologies using basis spectra for the chemical constituents of breast tissue, achieving high sensitivity (94%) and specificity (96%) (Haka et al., 2005). Motivated by recent developments in nonparametric regression, in this work we adapt stacking, boosting, and dynamic ensemble learning into a nonparametric regression framework with application to Raman spectroscopy analysis for breast cancer diagnosis. In Chapter 2, we apply compound estimation (Charnigo and Srinivasan, 2011) to Raman spectra analysis to classify normal, benign, and malignant breast tissue. We explore both the spectral profiles and their derivatives to differentiate the types of breast tissue. In Chapters 3-5 of this dissertation, we develop a novel paradigm for incorporating ensemble learning classification methodology into a nonparametric regression framework. Specifically, in Chapter 3 we set up a modified stacking framework that combines different classifiers to make better predictions in nonparametric regression settings. In Chapter 4 we incorporate a modified AdaBoost algorithm into nonparametric regression settings to improve classification accuracy. In Chapter 5 we propose dynamic ensemble integration based on multiple meta-learning strategies for nonparametric-regression-based classification. In Chapter 6, we revisit the Raman spectroscopy data of Chapter 2 and make improvements based on the methods developed in Chapters 3 and 4. Finally, we summarize the major findings and contributions of this work, identify opportunities for future research, and discuss their public health implications.
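As a rough illustration of the stacking idea adapted in Chapter 3, here is a minimal scikit-learn sketch in which base learners' out-of-fold predictions feed a meta-learner; the learners, parameters, and synthetic data are assumptions for illustration, not the dissertation's actual pipeline.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Stand-in for a spectra-derived feature matrix and tissue labels.
X, y = make_classification(n_samples=300, n_features=50, random_state=0)

stack = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("svm", SVC(probability=True, random_state=0))],
    final_estimator=LogisticRegression(),
    cv=5,  # out-of-fold base predictions guard against leakage
)
print(cross_val_score(stack, X, y, cv=5).mean())
```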
|
44 |
Disaster Capitalism: Impact on the Great Flood of 1993
Savard, Katherine J, 01 January 2016 (has links)
This thesis analyzes the impact of disaster capitalism on the areas affected by the Great Flood of 1993. Drawing on Naomi Klein's book The Shock Doctrine, I selected three variables that can serve as indicators of disaster capitalism: unemployment rates, new private housing units authorized by permit, and employment in the mining, logging, and construction industries. I use a comparison-of-means test and a difference-in-differences estimate to determine whether these variables changed as a result of the flood. Unemployment rates appeared to be affected by the crisis and strongly support Klein's theory of disaster capitalism.
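For readers unfamiliar with the estimator, a difference-in-differences calculation is simple enough to show in a few lines; the county means below are invented placeholders, not the thesis's data.

```python
# Treated = flood-affected counties; control = comparable unaffected ones.
treated_pre, treated_post = 6.1, 7.8   # e.g., mean unemployment rate (%)
control_pre, control_post = 5.9, 6.2   # illustrative numbers only

# DiD nets out the common time trend shared by both groups.
did = (treated_post - treated_pre) - (control_post - control_pre)
print(f"difference-in-differences estimate: {did:+.2f} percentage points")
```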
|
45 |
Adversarial Decision Making in Counterterrorism Applications
Mazicioglu, Dogucan, 01 January 2017 (has links)
Our main objective is to improve decision making in counterterrorism applications by implementing expected utility theory for prescriptive decision making and prospect theory for descriptive modeling. The areas we aim to improve are the behavioral modeling of adversaries with multiple objectives in counterterrorism applications and the incorporation of decision makers' risk attitudes into risk matrices for assessing risk within an adversarial counterterrorism framework. Traditionally, counterterrorism applications have been approached on a single-attribute basis. We utilize a multi-attribute prospect theory approach to model the attacker's behavior more realistically, while using expected utility theory to prescribe appropriate actions to the defender. We evaluate our approach by considering an attacker with multiple objectives who wishes to smuggle radioactive material into the United States and a defender who has the option to implement a screening process to hinder the attacker. Next, we consider the use of risk matrices (a widely used method for assessing risk given a consequence and probability pairing for a potential threat) in an adversarial framework, modeling attacker and defender risk matrices using utility theory and linking the matrices with the Luce model. A shortcoming of modeling the attacker and defender risk matrices with utility theory is its failure to account for decision makers' deviations from rational behavior, as seen in the experimental literature. To overcome this shortcoming, we consider an adversarial risk matrix framework that models the attacker risk matrix using prospect theory, while using expected utility theory to prescribe actions to the defender.
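A minimal sketch of the prospect-theory pieces used on the descriptive side appears below, with the standard Tversky-Kahneman functional forms and the usual literature parameter estimates (0.88, 2.25, 0.61); these values and the example gamble are assumptions for illustration, not the dissertation's calibration.

```python
def pt_value(x, alpha=0.88, lam=2.25):
    """Reference-dependent value: concave for gains, steeper for losses
    (loss aversion), relative to a reference point of 0."""
    return x**alpha if x >= 0 else -lam * (-x)**alpha

def pt_weight(p, gamma=0.61):
    """Inverse-S probability weighting (one curve used for simplicity)."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# Prospect value of a gamble: 30% chance of gaining 100, else losing 20.
pv = pt_weight(0.30) * pt_value(100) + pt_weight(0.70) * pt_value(-20)
print(round(pv, 2))
```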
|
46 |
Quantum Foundations with Astronomical Photons
Leung, Calvin, 01 January 2017 (has links)
Bell's inequalities impose an upper limit on correlations between measurements of two-photon states under the assumption that the photons play by a set of local rules rather than by quantum mechanics. Quantum theory and decades of experiments both violate this limit.
Recent theoretical work in quantum foundations has demonstrated that a local realist model can explain the non-local correlations observed in experimental tests of Bell's inequality if the underlying probability distribution of the local hidden variable depends on the choice of measurement basis, or "setting choice". By using setting choices determined by astrophysical events in the distant past, it is possible to asymptotically guarantee that the setting choice is independent of local hidden variables which come into play around the time of the experiment, closing this "freedom-of-choice" loophole.
Here, I report on a novel experimental test of Bell's inequality which addresses the freedom-of-choice assumption more conclusively than any other experiment to date. In this first experiment, performed in Vienna, custom astronomical instrumentation allowed setting choices to be determined by photon emission events occurring six hundred years ago at Milky Way stars. For this experiment, I selected the stars used so as to maximize the extent over which any hidden influence would need to be coordinated. In addition, I characterized the group's custom instrumentation, allowing us to conclude a violation of local realism by 7 and 11 standard deviations. These results are published in Handsteiner et al. (Phys. Rev. Lett. 118:060401, 2017).
I also describe my design, construction, and experimental characterization of a next-generation "astronomical random number generator", with improved capabilities and design choices that improve on the original instrumentation by an order of magnitude. Through the 1-meter telescope at the NASA/JPL Table Mountain Observatory, I observed and generated random bits from thirteen quasars with redshifts ranging from z = 0.1 to 3.9. With physical and information-theoretic analyses, I quantify the fraction of the generated bits which are predictable by a local realist mechanism, and identify two pairs of quasars suitable for use as extragalactic sources of randomness in the next cosmic Bell test. I also propose two additional applications of such a device. The first is an experimental realization of a delayed-choice quantum eraser experiment, enabling a foundational test of wave-particle complementarity. The second is a test of the Weak Equivalence Principle, using our instrument's sub-nanosecond time resolution to observe the Crab pulsar at optical and near-infrared wavelengths. Using my data from the Crab pulsar, I report a bound on violations of Einstein's Weak Equivalence Principle complementary to recent results in the literature. Most of these results appear in Leung et al. (arXiv:1706.02276, submitted to Physical Review X).
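For context, the bound being tested can be illustrated in a few lines: local hidden-variable models keep the CHSH combination at or below 2, while the quantum singlet-state correlation reaches 2*sqrt(2) at the standard settings. The angles below are the textbook choices, not the experiment's.

```python
import math

def E(a, b):
    """Singlet-state correlation for analyzer angles a and b."""
    return -math.cos(a - b)

a, ap = 0.0, math.pi / 2               # Alice's two setting choices
b, bp = math.pi / 4, 3 * math.pi / 4   # Bob's two setting choices

S = abs(E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp))
print(S, "vs. local-realist bound 2")  # ~2.828 = 2*sqrt(2)
```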
|
47 |
Application of a Bivariate Probit Model to Investigate the Intended Evacuation from Hurricane
Jiang, Fan, 28 March 2013 (has links)
With evidence of increasing hurricane risks in the Georgia Coastal Area (GCA) and Virginia in the U.S. Southeast, and elsewhere, understanding intended evacuation behavior is becoming more and more important for community planners. My research investigates intended evacuation behavior under hurricane risk, drawing on a behavioral survey of six counties in the GCA conducted under the direction of two social scientists with extensive experience in survey research on citizen and household response to emergencies and disasters. Respondents indicated whether they would evacuate under both voluntary and mandatory evacuation orders. Bivariate probit models are used to investigate the subjective belief structure of whether or not respondents are concerned about the hurricane, and the intended probability of evacuating as a function of risk perception and a set of demographic and socioeconomic variables (e.g., gender, military status, age, length of residence, vehicle ownership).
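A hedged sketch of the bivariate probit likelihood is given below: two binary outcomes (evacuate under a voluntary order, evacuate under a mandatory order) share correlated latent errors. The synthetic data and variable names are illustrative, not the survey's.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal

def neg_loglik(params, X, y1, y2):
    k = X.shape[1]
    b1, b2 = params[:k], params[k:2 * k]
    rho = np.tanh(params[-1])            # keeps the correlation in (-1, 1)
    s1, s2 = 2 * y1 - 1, 2 * y2 - 1      # map {0, 1} -> {-1, +1}
    ll = 0.0
    for i in range(len(y1)):
        r = s1[i] * s2[i] * rho
        p = multivariate_normal.cdf(
            [s1[i] * (X[i] @ b1), s2[i] * (X[i] @ b2)],
            cov=[[1.0, r], [r, 1.0]])
        ll += np.log(max(p, 1e-12))
    return -ll

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(100), rng.standard_normal(100)])  # intercept + risk perception
y1 = (rng.random(100) < 0.4).astype(int)   # voluntary-order evacuation
y2 = (rng.random(100) < 0.7).astype(int)   # mandatory-order evacuation
res = minimize(neg_loglik, np.zeros(2 * X.shape[1] + 1), args=(X, y1, y2))
print(res.x)
```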
|
48 |
A Study of Online Auction Processes using Functional Data Analysis
Ohalete, Nzubechukwu C., 02 June 2022 (links)
No description available.
|
49 |
A µ-Model Approach on the Cell Means: The Analysis of Full Design Models with Non-Orthogonal Data
Van Koningsveld, Richard, 01 May 1979 (has links)
This work considers the application of a µ-model approach on the cell means to a special yet important class of experimental designs. These include full factorial, completely nested, and mixed models with one or more observations per cell. By limiting attention to full models, an approach to the general data situation is developed which is both conceptually simple and computationally advantageous.
Conceptually, the method is simple because the design-related effects are defined as if the cell means were single observations. This leads to a rather simple algorithm for generating main effect contrasts, from which associated interaction contrasts can also be formed. While the sums of squares found from these contrasts are not additive with non-orthogonal data, they do lead to the class of design-related hypotheses with the clearest interpretation in terms of the cells.
The computational method is advantageous because the sum of squares for each source of variation is evaluated separately. This avoids the storage and inversion of a potentially large matrix associated with alternative methods, and allows the user to evaluate only those sources of interest.
The methodology outlined in this work is programmed as an easy-to-use, interactive terminal program for the analysis of these n-factor design models.
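To make the cell-means idea tangible, here is a toy sketch for a 2x3 factorial with unequal cell counts; the means, counts, and equal-cell-weight contrast are invented for illustration and follow the general contrast formulas rather than the thesis's specific algorithm.

```python
import numpy as np

means = np.array([[4.0, 5.5, 6.0],    # factor A level 1, B levels 1..3
                  [5.0, 7.0, 8.5]])   # factor A level 2
counts = np.array([[3, 5, 2],
                   [4, 2, 6]])        # non-orthogonal: unequal cell sizes

# Main effect of A, defined on the cell means as if each were a single
# observation: average over B with equal weight per cell.
c = np.array([[1, 1, 1], [-1, -1, -1]]) / 3.0
estimate = np.sum(c * means)
# Contrast sum of squares, weighted by the cell sizes:
ss = estimate**2 / np.sum(c**2 / counts)
print(estimate, ss)
```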
|
50 |
A Comparison of Rank and Bootstrap Procedures for Completely Randomized Designs with Jittering
Lee, Feng-ling, 01 May 1987 (has links)
This paper discusses the results of a computer simulation investigating the effect of jittering, the addition of random noise to the data to simulate measurement error. In addition, the classical F ratio, the bootstrap F, and the F for ranked data are compared. Empirical powers and p-values suggest that the bootstrap is a good, robust procedure, while the rank procedure appears too liberal when compared with the classical F ratio.
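A minimal sketch of the simulation's ingredients appears below: jittered group data, the classical F, and a bootstrap reference distribution built by resampling centered residuals under the null; sample sizes, jitter scale, and replication count are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
groups = [rng.normal(0.0, 1, 10), rng.normal(0.5, 1, 10), rng.normal(0.0, 1, 10)]
groups = [g + rng.normal(0, 0.2, g.size) for g in groups]  # jittering step

f_obs = stats.f_oneway(*groups).statistic
pooled = np.concatenate([g - g.mean() for g in groups])    # residuals under H0
f_boot = np.array([
    stats.f_oneway(
        *np.split(rng.choice(pooled, pooled.size, replace=True), 3)
    ).statistic
    for _ in range(2000)
])
print("bootstrap p-value:", np.mean(f_boot >= f_obs))
```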
|