61

A Monte Carlo Study of the Robustness and Power Associated with Selected Tests of Variance Equality when Distributions are Non-Normal and Dissimilar in Form

Hardy, James C. (James Clifford) 08 1900 (has links)
When selecting a method for testing variance equality, a researcher should select a method which is robust to distribution non-normality and dissimilarity. The method should also possess sufficient power to ascertain departures from the equal variance hypothesis. This Monte Carlo study examined the robustness and power of five tests of variance equality under specific conditions. The tests examined included one procedure proposed by O'Brien (1978), two by O'Brien (1979), and two by Conover, Johnson, and Johnson (1981). Specific conditions included assorted combinations of the following factors: k=2 and k=3 groups, normal and non-normal distributional forms, similar and dissimilar distributional forms, and equal and unequal sample sizes. Under the k=2 group condition, a total of 180 combinations were examined. A total of 54 combinations were examined under the k=3 group condition. The Type I error rates and statistical power estimates were based upon 1000 replications in each combination examined. Results of this study suggest that when sample sizes are relatively large, all five procedures are robust to distribution non-normality and dissimilarity, as well as being sufficiently powerful.
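The simulation design above can be illustrated with a small sketch. The exact O'Brien and Conover procedures are not reproduced here; as a stand-in, the Brown-Forsythe variant of Levene's test (available in SciPy) is used to estimate a Type I error rate under a skewed, non-normal parent distribution. The sample sizes, replication count, and chi-square parent are illustrative assumptions, not the study's exact conditions:

```python
import numpy as np
from scipy import stats

def type1_error_rate(n1=30, n2=30, reps=1000, alpha=0.05, seed=0):
    """Estimate the Type I error rate of the Brown-Forsythe test
    (Levene's test centered at the median) when both samples come
    from the same skewed, non-normal distribution."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(reps):
        # Equal variances by construction: same chi-square parent.
        x = rng.chisquare(df=3, size=n1)
        y = rng.chisquare(df=3, size=n2)
        _, p = stats.levene(x, y, center='median')
        rejections += p < alpha
    return rejections / reps

rate = type1_error_rate()
print(rate)  # a robust test should stay near the nominal 0.05
```

A robust procedure keeps this estimated rate near the nominal alpha even though the parent distribution is far from normal; repeating the loop with unequal population variances would instead estimate power.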
62

A failure detection system design methodology

Chow, Edward Yik January 1981 (has links)
Thesis (Sc.D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1981. / MICROFICHE COPY AVAILABLE IN ARCHIVES AND ENGINEERING. / Includes bibliographical references. / by Edward Yik Chow. / Sc.D.
63

Robust methods of finite element analysis : evaluation of non-linear, lower bound limit loads of plated structures and stiffening members /

Ralph, Freeman E., January 2000 (has links)
Thesis (M.Eng.)--Memorial University of Newfoundland, 2000. / Bibliography: p. 132-135.
64

Uncalibrated Vision-Based Control and Motion Planning of Robotic Arms in Unstructured Environments

Shademan, Azad Unknown Date
No description available.
65

Essays on theories and applications of spatial econometric models

Lin, Xu, January 2006 (has links)
Thesis (Ph. D.)--Ohio State University, 2006. / Title from first page of PDF file. Includes bibliographical references (p. 114-119).
66

Signal Processing and Robust Statistics for Fault Detection in Photovoltaic Arrays

January 2012 (has links)
abstract: Photovoltaics (PV) is an important and rapidly growing area of research. With the advent of power system monitoring and communication technology collectively known as the "smart grid," an opportunity exists to apply signal processing techniques to monitoring and control of PV arrays. In this paper a monitoring system which provides real-time measurements of each PV module's voltage and current is considered. A fault detection algorithm formulated as a clustering problem and addressed using the robust minimum covariance determinant (MCD) estimator is described; its performance on simulated instances of arc and ground faults is evaluated. The algorithm is found to perform well on many types of faults commonly occurring in PV arrays. Among several types of detection algorithms considered, only the MCD shows high performance on both types of faults. / Dissertation/Thesis / M.S. Electrical Engineering 2012
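A minimal sketch of the clustering idea, using scikit-learn's `MinCovDet` in place of the author's implementation. The simulated (voltage, current) readings, the injected fault signature, and the chi-square cutoff are illustrative assumptions, not values from the thesis:

```python
import numpy as np
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(42)

# Simulated per-module (voltage, current) readings for a healthy string.
healthy = rng.multivariate_normal(mean=[30.0, 8.0],
                                  cov=[[0.25, 0.02], [0.02, 0.04]],
                                  size=200)
# A few faulted modules: depressed voltage at similar current.
faulted = rng.multivariate_normal(mean=[24.0, 7.8],
                                  cov=[[0.25, 0.02], [0.02, 0.04]],
                                  size=5)
X = np.vstack([healthy, faulted])

# MCD fits the covariance of the most concentrated subset of points,
# so the small fault cluster cannot distort the estimate.
mcd = MinCovDet(random_state=0).fit(X)
d2 = mcd.mahalanobis(X)  # squared robust Mahalanobis distances

# Flag points beyond the chi-square(2) 97.5% quantile (~7.38).
flags = d2 > 7.38
print(flags[-5:])  # the injected faults should all be flagged
```

Because the MCD scatter estimate ignores the faulted cluster, the faults show up with very large robust distances; a classical covariance fit to all points would partially mask them.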
67

Využití numerické lineární algebry k urychlení výpočtu odhadů MCD / Exploiting numerical linear algebra to accelerate the computation of the MCD estimator

Sommerová, Kristýna January 2018 (has links)
This work deals with speeding up the algorithmic computation of the MCD estimator, which detects the mean and covariance matrix of normally distributed multivariate data contaminated with outliers. First, the main idea of the estimator and its well-known approximation by the FastMCD algorithm are discussed. The main focus was placed on possibilities for speeding up the iteration step known as the C-step while maintaining the quality of the estimates. This proved to be problematic, if not impossible. The work therefore aims at a new implementation based on the C-step and the Jacobi eigenvalue method. The proposed JacobiMCD algorithm is compared to FastMCD in terms of floating-point operation counts and results. In conclusion, JacobiMCD is not fully equivalent to FastMCD but hints at the possibility of using it on larger problems. The numerical experiments suggest that the computation can indeed be an order of magnitude quicker, while in some settings the quality of the results is close to that of FastMCD.
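The C-step that the work seeks to accelerate can be sketched as follows. This is the standard FastMCD concentration step, not the proposed JacobiMCD variant; the data, subset size `h`, and iteration count are illustrative assumptions:

```python
import numpy as np

def c_step(X, subset, h):
    """One C-step of FastMCD: from the current h-subset, compute its
    mean and covariance, then return the h points with the smallest
    Mahalanobis distances. Rousseeuw & Van Driessen showed this step
    never increases det(cov), so iterating it converges."""
    mu = X[subset].mean(axis=0)
    S = np.cov(X[subset], rowvar=False)
    diff = X - mu
    d2 = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(S), diff)
    return np.argsort(d2)[:h]

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, size=(95, 2)),
               rng.normal(8, 1, size=(5, 2))])   # 5 gross outliers
h = 75
subset = rng.choice(len(X), size=h, replace=False)
for _ in range(10):                               # iterate to convergence
    subset = c_step(X, subset, h)
print(np.isin(subset, np.arange(95, 100)).sum())  # outliers left (typically 0)
```

Each C-step costs a covariance factorization and a full pass of distance computations, which is exactly the inner loop whose linear algebra the thesis tries to reorganize.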
68

Segmentation de processus avec un bruit autorégressif / Segmenting processes with an autoregressive noise

Chakar, Souhil 22 September 2015 (has links)
We propose to study the methodology of segmenting processes with autoregressive noise in both its theoretical and practical aspects. Here, "segmentation" means inferring multiple change-points corresponding to abrupt shifts in the mean of the time series. We treat the autoregression parameters as nuisance parameters, taken into account in the inference only insofar as doing so improves the segmentation. From a theoretical point of view, the aim is to preserve the asymptotic properties of the estimators of the change-points and of the segment-specific parameters. From a practical point of view, we must respect the algorithmic constraints involved in computing the optimal segmentation. To meet both requirements, the proposed method uses robust estimation techniques to estimate the autoregression parameters first and then decorrelate the process, bringing the problem close to segmentation with independent observations. This allows efficient algorithms to be used. The method rests on asymptotic results that we prove, and it yields well-founded, adapted criteria for selecting the number of change-points. A simulation study illustrates the method.
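The decorrelate-then-segment idea can be sketched as follows. This is a simplified illustration with a single change-point and the AR(1) coefficient taken as known (the thesis estimates it robustly first); the simulation parameters are assumptions:

```python
import numpy as np

def decorrelate(y, phi):
    """Whiten an AR(1) process: z_t = y_t - phi * y_{t-1}.
    A mean shift of size d in y becomes a shift of size d*(1 - phi)
    in z, so segmentation can proceed as if observations were i.i.d."""
    return y[1:] - phi * y[:-1]

def best_single_break(z):
    """Least-squares location of one mean shift in an i.i.d.-like series."""
    n = len(z)
    costs = []
    for k in range(2, n - 2):
        left, right = z[:k], z[k:]
        costs.append(((left - left.mean())**2).sum()
                     + ((right - right.mean())**2).sum())
    return 2 + int(np.argmin(costs))

# Simulate: AR(1) noise, one mean shift at t = 150.
rng = np.random.default_rng(0)
phi, n = 0.6, 300
e = np.zeros(n)
for t in range(1, n):
    e[t] = phi * e[t - 1] + rng.normal()
mu = np.where(np.arange(n) < 150, 0.0, 3.0)
y = mu + e

# phi is taken as known here; in practice it is estimated robustly first.
z = decorrelate(y, phi)
print(best_single_break(z))  # close to the true change-point
```

With the noise whitened, standard least-squares segmentation machinery (here a single-break scan; dynamic programming for multiple breaks) applies almost unchanged, which is the computational point of the approach.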
69

A unified decision analysis framework for robust system design evaluation in the face of uncertainty

Duan, Chunming 06 June 2008 (has links)
Some engineered systems now in use are not adequately meeting the needs for which they were developed, nor are they very cost-effective in terms of consumer utilization. Many problems associated with unsatisfactory system performance and high life-cycle cost are the direct result of decisions made during early phases of system design. To develop quality systems, both engineering and management need fundamental principles and methodologies to guide decision making during system design and advanced planning. In order to provide for the efficient resolution of complex system design decisions involving uncertainty, human judgments, and value tradeoffs, an efficient and effective decision analysis framework is required. Experience indicates that an effective approach to improving the quality of detail designs is through the application of Genichi Taguchi's philosophy of robust design. How to apply Taguchi's philosophy of robust design to system design evaluation at the preliminary design stage is an open question. The goal of this research is to develop a unified decision analysis framework to support the need for developing better system designs in the face of various uncertainties. This goal is accomplished by adapting and integrating statistical decision theory, utility theory, elements of the systems engineering process, and Taguchi's philosophy of robust design. The result is a structured, systematic methodology for evaluating system design alternatives. The decision analysis framework consists of two parts: (1) decision analysis foundations, and (2) an integrated approach. Part I (Chapters 2 through 5) covers the foundations for design decision analysis in the face of uncertainty. This research begins with an examination of the life cycle of engineered systems and identification of the elements of the decision process of system design and development. 
After investigating various types of uncertainty involved in the process of system design, the concept of robust design is defined from the perspective of system life-cycle engineering. Some common measures for assessing the robustness of candidate system designs are then identified and examined. Then the problem of design evaluation in the face of uncertainty is studied within the context of decision theory. After classifying design decision problems into four categories, the structure of each type of problem, in terms of the sequence and causal relationships between various decisions and uncertain outcomes, is represented by a decision tree. Based upon statistical decision theory, the foundations for choosing a best design in the face of uncertainty are identified. The assumptions underlying common objective functions in design optimization are also investigated. Some confusion and controversy that surround Taguchi's robust design criteria (loss functions and signal-to-noise ratios) are addressed and clarified. Part II (Chapters 6 through 9) covers models and their application to design evaluation in the face of uncertainty. Based upon the decision analysis foundations, an integrated approach is developed and presented for resolving both discrete and continuous decisions, as well as decisions involving both uncertainty and multiple attributes. Application of the approach is illustrated by two hypothetical examples: bridge design and repairable equipment population system design. / Ph. D.
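Taguchi's signal-to-noise ratios mentioned above have standard closed forms that can be computed directly; the two candidate designs below are hypothetical data for illustration, not examples from the dissertation:

```python
import numpy as np

def sn_smaller_the_better(y):
    """Taguchi S/N ratio when the ideal response is zero."""
    return -10 * np.log10(np.mean(np.square(y)))

def sn_larger_the_better(y):
    """Taguchi S/N ratio when larger responses are better."""
    return -10 * np.log10(np.mean(1.0 / np.square(y)))

def sn_nominal_the_best(y):
    """Taguchi S/N ratio when the response should hit a target:
    rewards a high mean-to-variability ratio."""
    y = np.asarray(y, dtype=float)
    return 10 * np.log10(y.mean()**2 / y.var(ddof=1))

# Two candidate designs measured under the same noise conditions:
design_a = [9.8, 10.1, 10.0, 10.2, 9.9]    # on target, low spread
design_b = [9.0, 11.5, 10.4, 8.7, 10.6]    # similar mean, high spread
print(sn_nominal_the_best(design_a) > sn_nominal_the_best(design_b))  # True
```

The nominal-the-best ratio prefers design A even though both designs have nearly the same mean, which is exactly the robustness criterion: consistency under noise, not just average performance.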
70

Hydroxypropylmethylcellulose: A New Matrix for Solid-Surface Room-Temperature Phosphorimetry

Hamner, Vincent N. 05 November 1999 (has links)
This thesis reports an investigation of hydroxypropylmethylcellulose (HPMC) as a new solid-surface room-temperature phosphorescence (SSRTP) sample matrix. The high background phosphorescence originating from filter paper substrates can interfere with the detection and quantitation of trace-level analytes. High-purity grades of HPMC were investigated as SSRTP substrates in an attempt to overcome this limitation. When compared directly to filter paper, HPMC allows the spectroscopist to achieve greater sensitivity, lower limits of detection (LOD), and lower limits of quantitation (LOQ) for certain phosphor/heavy-atom combinations, since SSRTP signal intensities are stronger. For example, determination of the analytical figures of merit for a naphthalene/sodium iodide/HPMC system resulted in a calibration sensitivity of 2.79, an LOD of 4 ppm (3 ng), and an LOQ of 14 ppm (11 ng). Corresponding investigations of a naphthalene/sodium iodide/filter paper system produced a calibration sensitivity of 0.326, an LOD of 33 ppm (26 ng), and an LOQ of 109 ppm (86 ng). Extended purging with dry nitrogen gas yields improved sensitivities and lower LODs and LOQs in HPMC matrices when LOD and LOQ are calculated according to the IUPAC guidelines. To test the universality of HPMC, qualitative SSRTP spectra were obtained for a wide variety of probe phosphors offering different molecular sizes, shapes, and chemical functionalities. Suitable spectra were obtained for the following model polycyclic aromatic hydrocarbons (PAHs): naphthalene, p-aminobenzoic acid, acenaphthene, phenanthrene, 2-naphthoic acid, 2-naphthol, salicylic acid, and triphenylene. Filter paper and HPMC substrates are inherently anisotropic, heterogeneous media. Since this deficiency cannot be addressed experimentally, a robust statistical method is examined for the detection of questionable SSRTP data points and the deletion of outlying observations. If discordant observations are discarded, relative standard deviations are typically reduced to less than 10% for most SSRTP data sets. Robust techniques for outlier identification are superior to traditional methods since they operate at a high level of efficiency and are immune to masking effects. The process of selecting a suitable sample support material often involves considerable trial and error on the part of the analyst. A mathematical model based on Hansen's cohesion parameter theory is developed to predict favorable phosphor-substrate attraction and interactions. The results of investigations using naphthalene as a probe phosphor and sodium iodide as an external heavy-atom enhancer support the cohesion parameter model. This document includes a thorough description of the fundamental principles of phosphorimetry and provides a detailed analysis of the theoretical and practical concerns associated with performing SSRTP. In order to better understand the properties of both filter paper and HPMC, a chapter is devoted to the discussion of the cellulose biopolymer. Experimental results and interpretations are presented, and suggestions for future investigations are provided. Together, these results provide a framework that will support additional advancements in the field of solid-surface room-temperature phosphorescence spectroscopy. / Ph. D.
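The outlier-rejection idea can be sketched with a standard masking-resistant robust z-score based on the median and MAD; this is a generic procedure, not necessarily the exact statistical test examined in the thesis, and the intensity readings and cutoff are illustrative:

```python
import numpy as np

def discard_discordant(x, cutoff=3.5):
    """Flag outliers with the robust z-score (median/MAD), which resists
    masking: one wild reading cannot inflate the scale estimate and
    thereby hide another, unlike mean/SD-based discordancy tests."""
    x = np.asarray(x, dtype=float)
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    robust_z = 0.6745 * (x - med) / mad  # 0.6745 scales MAD to ~sigma
    return x[np.abs(robust_z) <= cutoff]

# Replicate phosphorescence intensities with one discordant reading.
intensities = np.array([101.0, 99.5, 100.4, 98.9, 100.8, 100.1, 132.0])

kept = discard_discordant(intensities)
rsd_before = intensities.std(ddof=1) / intensities.mean() * 100
rsd_after = kept.std(ddof=1) / kept.mean() * 100
print(len(kept), round(rsd_before, 1), round(rsd_after, 1))
```

Dropping the single discordant replicate brings the relative standard deviation from double digits down well below the 10% level cited in the abstract, because the median and MAD are unaffected by the wild point.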
