41

Evaluation of Multivariate Homogeneous ARMA Model

Tseng, Lucy Chienhua 01 May 1980 (has links)
The purpose of this thesis is to study a restricted multivariate ARMA model, called the Homogeneous Model. This model is defined as one in which each univariate component of the multivariate model is of the same order in p and q as it is in the multivariate model. From a mathematical perspective, a multivariate ARMA model is homogeneous if, and only if, its coefficient matrices are diagonal. From a physical perspective, the present observation of a phenomenon can be modeled only by its own past observations and its present and past "errors." The estimation procedures are developed based on the maximum likelihood method and on O'Connell's method for the univariate model. The homogeneous model is evaluated on four types of data, generated to reflect different degrees of nonhomogeneity. It is found that the homogeneous model is sensitive to departures from the homogeneity assumptions: small departures cause no serious problems, but large departures do.
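For reference, the homogeneity condition can be stated on the standard multivariate ARMA(p, q) form; the notation below is generic background rather than taken from the thesis:

\[
Z_t = \Phi_1 Z_{t-1} + \cdots + \Phi_p Z_{t-p} + a_t - \Theta_1 a_{t-1} - \cdots - \Theta_q a_{t-q},
\qquad \Phi_i = \mathrm{diag}\big(\phi_i^{(1)},\dots,\phi_i^{(k)}\big),\ \
\Theta_j = \mathrm{diag}\big(\theta_j^{(1)},\dots,\theta_j^{(k)}\big),
\]

so that when every coefficient matrix is diagonal, each component series satisfies its own univariate ARMA(p, q) equation driven only by its own past values and its own present and past errors.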
42

Developments in Nonparametric Regression Methods with Application to Raman Spectroscopy Analysis

Guo, Jing 01 January 2015 (has links)
Raman spectroscopy has been successfully employed in the classification of breast pathologies using basis spectra for chemical constituents of breast tissue, achieving high sensitivity (94%) and specificity (96%) (Haka et al., 2005). Motivated by recent developments in nonparametric regression, in this work we adapt stacking, boosting, and dynamic ensemble learning into a nonparametric regression framework with application to Raman spectroscopy analysis for breast cancer diagnosis. In Chapter 2, we apply compound estimation (Charnigo and Srinivasan, 2011) to Raman spectra analysis to classify normal, benign, and malignant breast tissue. We explore both the spectral profiles and their derivatives to differentiate the types of breast tissue. In Chapters 3-5 of this dissertation, we develop a novel paradigm for incorporating ensemble learning classification methodology into a nonparametric regression framework. Specifically, in Chapter 3 we set up a modified stacking framework that combines different classifiers to make better predictions in nonparametric regression settings. In Chapter 4 we develop a method that incorporates a modified AdaBoost algorithm in nonparametric regression settings to improve classification accuracy. In Chapter 5 we propose a dynamic ensemble integration based on multiple meta-learning strategies for nonparametric-regression-based classification. In Chapter 6, we revisit the Raman spectroscopy data from Chapter 2 and make improvements based on the methods developed in Chapters 3 and 4. Finally, we summarize the major findings and contributions of this work, identify opportunities for future research, and discuss their public health implications.
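As a rough illustration of plain (unmodified) stacking for classification, base-learner out-of-fold predictions can be used as features for a meta-learner; the dataset, models, and parameters below are hypothetical and not taken from the dissertation:

    # Minimal stacking sketch: out-of-fold base predictions feed a meta-learner.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_predict, train_test_split
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=500, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    base_learners = [
        ("logit", LogisticRegression(max_iter=1000)),
        ("svm", SVC(probability=True)),
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
    ]

    # Out-of-fold class-probability predictions from each base learner.
    meta_train = np.column_stack([
        cross_val_predict(m, X_train, y_train, cv=5, method="predict_proba")[:, 1]
        for _, m in base_learners
    ])

    # The meta-learner combines the base predictions.
    meta_learner = LogisticRegression(max_iter=1000).fit(meta_train, y_train)

    # At test time the base learners are refit on the full training set.
    meta_test = np.column_stack([
        m.fit(X_train, y_train).predict_proba(X_test)[:, 1]
        for _, m in base_learners
    ])
    print("stacked accuracy:", meta_learner.score(meta_test, y_test))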
43

Disaster Capitalism: Impact on the Great Flood of 1993

Savard, Katherine J 01 January 2016 (has links)
This thesis analyzes the impact of disaster capitalism on the areas affected by the Great Flood of 1993. Drawing on Naomi Klein's book The Shock Doctrine, I selected three variables that can serve as indicators of disaster capitalism: unemployment rates, new private housing units authorized by permit, and employment in the mining, logging, and construction industries. I use a comparison-of-means test and a difference-in-differences estimate to determine whether these variables changed as a result of the flood. Unemployment rates appear to have been affected by the crisis, strongly supporting Klein's theory of disaster capitalism.
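For reference, the standard two-group, two-period difference-in-differences estimator compares the pre-to-post change in flood-affected counties with the same change in unaffected counties; the notation is generic rather than taken from the thesis:

\[
\hat{\delta}_{DD} = \big(\bar{y}^{\,\text{affected}}_{\text{post}} - \bar{y}^{\,\text{affected}}_{\text{pre}}\big)
- \big(\bar{y}^{\,\text{control}}_{\text{post}} - \bar{y}^{\,\text{control}}_{\text{pre}}\big),
\]

so that common time trends difference out and the remaining gap is attributed to the flood.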
44

Adversarial Decision Making in Counterterrorism Applications

Mazicioglu, Dogucan 01 January 2017 (has links)
Our main objective is to improve decision making in counterterrorism applications by implementing expected utility for prescriptive decision making and prospect theory for descriptive modeling. The areas that we aim to improve are behavioral modeling of adversaries with multi objectives in counterterrorism applications and incorporating risk attitudes of decision makers to risk matrices in assessing risk within an adversarial counterterrorism framework. Traditionally, counterterrorism applications have been approached on a single attribute basis. We utilize a multi-attribute prospect theory approach to more realistically model the attacker’s behavior, while using expected utility theory to prescribe the appropriate actions to the defender. We evaluate our approach by considering an attacker with multiple objectives who wishes to smuggle radioactive material into the United States and a defender who has the option to implement a screening process to hinder the attacker. Next, we consider the use of risk matrices (a method widely used for assessing risk given a consequence and a probability pairing of a potential threat) in an adversarial framework – modeling an attacker and defender risk matrix using utility theory and linking the matrices with the Luce model. A shortcoming with modeling the attacker and the defender risk matrix using utility theory is utility theory’s failure to account for the decision makers’ deviation from rational behavior as seen in experimental literature. We consider an adversarial risk matrix framework that models the attacker risk matrix using prospect theory to overcome this shortcoming, while using expected utility theory to prescribe actions to the defender.
45

Quantum Foundations with Astronomical Photons

Leung, Calvin 01 January 2017 (has links)
Bell's inequalities impose an upper limit on correlations between measurements of two-photon states under the assumption that the photons play by a set of local rules rather than by quantum mechanics. Quantum theory and decades of experiments both violate this limit. Recent theoretical work in quantum foundations has demonstrated that a local realist model can explain the non-local correlations observed in experimental tests of Bell's inequality if the underlying probability distribution of the local hidden variable depends on the choice of measurement basis, or ``setting choice''. By using setting choices determined by astrophysical events in the distant past, it is possible to asymptotically guarantee that the setting choice is independent of local hidden variables which come into play around the time of the experiment, closing this ``freedom-of-choice'' loophole. Here, I report on a novel experimental test of Bell's inequality which addresses the freedom-of-choice assumption more conclusively than any other experiment to date. In this first experiment in Vienna, custom astronomical instrumentation allowed setting choices to be determined by photon emission events occurring six hundred years ago at Milky Way stars. For this experiment, I selected the stars used to maximize the extent over which any hidden influence needed to be coordinated. In addition, I characterized the group's custom instrumentation, allowing us to conclude a violation of local realism by $7$ and $11$ standard deviations. These results are published in Handsteiner et. al. (\textit{Phys. Rev. Lett.} 118:060401, 2017). I also describe my design, construction, and experimental characterization of a next-generation ``astronomical random number generator'', with improved capabilities and design choices that result in an improvement on the original instrumentation by an order of magnitude. Through the 1-meter telescope at the NASA/JPL Table Mountain Observatory, I observed and generated random bits from thirteen quasars with redshifts ranging from $z = 0.1-3.9$. With physical and information-theoretic analyses, I quantify the fraction of the generated bits which are predictable by a local realist mechanism, and identify two pairs of quasars suitable for use as extragalactic sources of randomness in the next cosmic Bell test. I also propose two additional applications of such a device. The first is an experimental realization of a delayed-choice quantum eraser experiment, enabling a foundational test of wave-particle complementarity. The second is a test of the Weak Equivalence Principle, using our instrument's sub-nanosecond time resolution to observe the Crab pulsar at optical and near-infrared wavelengths. Using my data from the Crab Pulsar, I report a bound on violations of Einstein's Weak Equivalence Principle complementary to recent results in the literature. Most of these results appear in Leung et. al. (arXiv:1706.02276, submitted to \textit{Physical Review X}).
46

Application of a Bivariate Probit Model to Investigate the Intended Evacuation from Hurricane

Jiang, Fan 28 March 2013 (has links)
With evidence of increasing hurricane risks in the Georgia Coastal Area (GCA) and Virginia in the U.S. Southeast, and elsewhere, understanding intended evacuation behavior is becoming increasingly important for community planners. My research investigates intended evacuation behavior in response to hurricane risks, using a behavioral survey of six counties in the GCA conducted under the direction of two social scientists with extensive experience in survey research on citizen and household response to emergencies and disasters. Respondents indicated whether they would evacuate under both voluntary and mandatory evacuation orders. Bivariate probit models are used to investigate the subjective belief structure of whether or not respondents are concerned about a hurricane, together with the intended probability of evacuating, as a function of risk perception and a set of demographic and socioeconomic variables (e.g., gender, military status, age, length of residence, and vehicle ownership).
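For reference, a bivariate probit specification of the kind described here links two binary outcomes (for instance, being concerned about the hurricane and intending to evacuate) through correlated latent indices; the notation is generic and not taken from the thesis:

\[
y_1^{*} = x_1'\beta_1 + \varepsilon_1, \qquad
y_2^{*} = x_2'\beta_2 + \varepsilon_2, \qquad
y_j = \mathbf{1}\{y_j^{*} > 0\}, \qquad
(\varepsilon_1, \varepsilon_2) \sim N\!\left(\mathbf{0}, \begin{pmatrix} 1 & \rho \\ \rho & 1 \end{pmatrix}\right),
\]

where the correlation \rho captures the dependence between concern and intended evacuation that remains after conditioning on the covariates.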
47

A Study of Online Auction Processes using Functional Data Analysis

Ohalete, Nzubechukwu C. 02 June 2022 (has links)
No description available.
48

A µ-Model Approach on the Cell Means: The Analysis of Full Design Models with Non-Orthogonal Data

Van Koningsveld, Richard 01 May 1979 (has links)
This work considers the application of a µ-model approach on the cell means to a special yet important class of experimental designs. These include full factorial, completely nested, and mixed models with one or more observations per cell. By limiting attention to full models, an approach to the general data situation is developed which is both conceptually simple and computationally advantageous. Conceptually, the method is simple because the design-related effects are defined as if the cell means were single observations. This leads to a rather simple algorithm for generating main-effect contrasts, from which associated interaction contrasts can also be formed. While the sums of squares found from these contrasts are not additive with non-orthogonal data, they lead to the class of design-related hypotheses with the clearest interpretation in terms of the cells. The computational method is advantageous because the sum of squares for each source of variation is evaluated separately. This avoids the storage and inversion of a potentially large matrix associated with alternative methods, and allows the user to evaluate only those sources of interest. The methodology outlined in this work is programmed into a user-friendly, interactive terminal program for the analysis of these n-factor design models.
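For reference, the cell-means (µ-model) formulation for a two-factor full model can be sketched as follows; the notation is illustrative rather than the thesis's own:

\[
y_{ijk} = \mu_{ij} + \varepsilon_{ijk}, \qquad i = 1,\dots,a,\quad j = 1,\dots,b,\quad k = 1,\dots,n_{ij},
\]

with main-effect hypotheses stated directly as contrasts on the cell means, e.g. equality of the unweighted row means \bar{\mu}_{i\cdot} = \frac{1}{b}\sum_j \mu_{ij}. This is what gives the design-related hypotheses their clear interpretation in terms of the cells even when the cell counts n_{ij} are unequal.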
49

A Comparison of Rank and Bootstrap Procedures for Completely Randomized Designs with Jittering

Lee, Feng-ling 01 May 1987 (has links)
This paper discusses the results of a computer simulation investigating the effect of jittering used to simulate measurement error. In addition, the classical F ratio, the bootstrap F, and the F for ranked data are compared. Empirical powers and p-values suggest that the bootstrap is a good, robust procedure, while the rank procedure appears too liberal compared to the classical F ratio.
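A minimal sketch of this kind of simulation, assuming a one-way completely randomized design, uniform jittering, and residual resampling for the bootstrap F (these specific choices are hypothetical, not taken from the paper):

    # One replication: classical F, rank-transform F, and bootstrap F p-values.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    def f_stat(groups):
        # Classical one-way ANOVA F ratio.
        return stats.f_oneway(*groups).statistic

    def one_rep(n=10, k=3, effect=1.0, jitter=0.5, n_boot=500):
        # k groups of size n; 'effect' shifts group 0, 'jitter' mimics measurement error.
        groups = [rng.normal(loc=effect if i == 0 else 0.0, scale=1.0, size=n)
                  for i in range(k)]
        groups = [grp + rng.uniform(-jitter, jitter, size=n) for grp in groups]
        f_obs = f_stat(groups)

        # Rank procedure: compute the F ratio on the jointly ranked data.
        ranks = stats.rankdata(np.concatenate(groups))
        f_rank = f_stat(np.split(ranks, k))

        # Bootstrap F: resample the pooled, group-centered data under H0.
        centered = np.concatenate([grp - grp.mean() for grp in groups])
        f_boot = np.array([f_stat(np.split(rng.choice(centered, size=n * k), k))
                           for _ in range(n_boot)])
        p_boot = np.mean(f_boot >= f_obs)

        df1, df2 = k - 1, k * (n - 1)
        return stats.f.sf(f_obs, df1, df2), stats.f.sf(f_rank, df1, df2), p_boot

    print(one_rep())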
50

A Confidence Interval Estimate of Percentile

Jou, How Coung 01 May 1980 (has links)
The confidence interval estimate of a percentile and its applications were studied. Three methods of estimating a confidence interval were introduced, and some properties of order statistics were reviewed. The Monte Carlo method, used to estimate the confidence interval, was the most important of the three. The generation of ordered random variables and the estimation of parameters were discussed in detail. A comparison of the three methods showed that the Monte Carlo method would always work, whereas the K-S and simplified methods would not.
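For reference, the distribution-free confidence interval for the p-th percentile ξ_p based on order statistics takes the standard form below; this is a generic result, not one quoted from the thesis:

\[
P\big(X_{(r)} \le \xi_p \le X_{(s)}\big) \ \ge\ \sum_{i=r}^{s-1} \binom{n}{i} p^{i} (1-p)^{n-i},
\]

with r < s chosen so that the binomial sum reaches the desired confidence level; the Monte Carlo method instead approximates such intervals by repeatedly generating ordered samples.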
