1

Bulk sampling: some strategies for improving quality control in chemical industries

Girardi, Benur A. January 1993
No description available.
2

Data mining temporal and indefinite relations with numerical dependencies

Collopy, Ethan Richard January 1999
No description available.
3

Estimation of measurement uncertainty in the sampling of contaminated land

Argyraki, Ariadni January 1997
No description available.
4

An Empirical Investigation of Tukey's Honestly Significant Difference Test with Variance Heterogeneity and Unequal Sample Sizes, Utilizing Kramer's Procedure and the Harmonic Mean

McKinney, William Lane 05 1900
This study sought to determine the effect upon Tukey's Honestly Significant Difference (HSD) statistic of concurrently violating the assumptions of homogeneity of variance and equal sample sizes. Two forms of the statistic for the unequal-sample-size problem were investigated: Kramer's form and the harmonic mean approach. The study employed a Monte Carlo simulation procedure that varied sample sizes under a heterogeneity-of-variance condition; four thousand experiments were generated, and the findings were based upon the empirically obtained significance levels. Five conclusions were reached. First, for the conditions of this study, the Kramer form of the HSD statistic is not robust at the .05 or .01 nominal level of significance. Second, the harmonic mean form of the HSD statistic is likewise not robust at either nominal level. Third, as a general conclusion drawn from all the findings, the Kramer form of the HSD test is the preferred procedure under combined assumption violations of variance heterogeneity and unequal sample sizes. Fourth, for the combined assumption violations in this study, the actual significance levels (probability levels) were less than the nominal significance levels when the magnitude of the unequal variances was positively related to the magnitude of the unequal sample sizes. Fifth, the actual significance levels substantially exceeded the nominal significance levels when the magnitude of the unequal variances was negatively related to the magnitude of the unequal sample sizes.
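As a point of reference for the two forms the study compares, here is a minimal sketch of the computation, assuming SciPy >= 1.7 for the studentized range distribution; the function and the data layout are illustrative, not taken from the dissertation.

```python
import numpy as np
from scipy.stats import studentized_range

def hsd_critical_differences(groups, alpha=0.05):
    """Critical differences for all pairwise mean comparisons (illustrative)."""
    k = len(groups)
    ns = np.array([len(g) for g in groups])
    df = int(ns.sum() - k)
    # MSE: pooled within-group variance from the one-way ANOVA.
    mse = sum((len(g) - 1) * np.var(g, ddof=1) for g in groups) / df
    q = studentized_range.ppf(1 - alpha, k, df)
    # Kramer's form: a separate critical difference for each pair of groups.
    kramer = {(i, j): q * np.sqrt(mse / 2 * (1 / ns[i] + 1 / ns[j]))
              for i in range(k) for j in range(i + 1, k)}
    # Harmonic-mean form: one common difference with n_h = k / sum(1/n_i).
    n_h = k / np.sum(1.0 / ns)
    harmonic = q * np.sqrt(mse / n_h)
    return kramer, harmonic
```

In Kramer's form each pair of groups gets its own critical difference, while the harmonic mean form applies a single common difference to all pairs; a pair of means is declared significantly different when their absolute difference exceeds the applicable critical value.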
5

A Study of Deploying Monitor-Oriented System Simulation Models to Improve the Efficiency of Statistical Process Control

Su, Yung-Chi 06 August 2011
Statistical process control (SPC) has a long history of development and appears in many manufacturing environments. In practice, however, SPC applications are often limited to basic control charts; deeper capabilities such as process capability control and the detection and evaluation of variation are rarely exploited, so SPC techniques tend to be underused. Moreover, while SPC can detect variation in a production process, it cannot by itself account for production resource capacity: even when real-time process control is achieved, the resources of the production process are poorly represented in future demand forecasting and capacity analysis. This study therefore combines statistical process control with system simulation technology for innovative management. Through process observation and sample collection, simulation is used to assess process feasibility and applicability under resource constraints and resource allocation while accounting for the variation captured by SPC, and quality improvement tools together with causal feedback maps (a system dynamics tool) support decision making about dynamic resource capability. The results show that (1) given effective input parameters, the simulation model can effectively reproduce the actual production processes and generate valid output; (2) appropriate statistical validation of the collected data improves sample reliability, providing an important reference for the system simulation methods; and (3) with simulation technology, online process control, production resource allocation, and capacity prediction can all be monitored.
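For context on the control-chart side of the approach, a bare-bones X-bar chart sketch; the A2 factors are standard textbook constants, and the function is an illustration rather than the thesis's monitor-oriented simulation model.

```python
import numpy as np

# Standard X-bar chart factor A2 for subgroup sizes 2-5 (textbook values).
A2 = {2: 1.880, 3: 1.023, 4: 0.729, 5: 0.577}

def xbar_chart(subgroups):
    """Return center line, control limits, and indices of flagged subgroups."""
    subgroups = np.asarray(subgroups)      # shape: (num_subgroups, subgroup_size)
    n = subgroups.shape[1]
    means = subgroups.mean(axis=1)         # one mean per subgroup
    ranges = np.ptp(subgroups, axis=1)     # one range per subgroup
    center = means.mean()
    spread = A2[n] * ranges.mean()
    ucl, lcl = center + spread, center - spread
    flagged = np.flatnonzero((means > ucl) | (means < lcl))
    return center, lcl, ucl, flagged
```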
6

Accelerated Fuzzy Clustering

Parker, Jonathon Karl 01 January 2013
Clustering algorithms are a primary tool in data analysis, facilitating the discovery of groups and structure in unlabeled data. They are used in a wide variety of industries and applications. Despite their ubiquity, clustering algorithms have a flaw: they take an unacceptable amount of time to run as the number of data objects increases. The need to compensate for this flaw has led to the development of a large number of techniques intended to accelerate their performance. This need grows greater every day, as collections of unlabeled data grow larger and larger. How does one increase the speed of a clustering algorithm as the number of data objects increases and at the same time preserve the quality of the results? This question was studied using the Fuzzy c-means clustering algorithm as a baseline. Its performance was compared to the performance of four of its accelerated variants. Four key design principles of accelerated clustering algorithms were identified. Further study and exploration of these principles led to four new and unique contributions to the field of accelerated fuzzy clustering. The first was the identification of a statistical technique that can estimate the minimum amount of data needed to ensure a multinomial, proportional sample. This technique was adapted to work with accelerated clustering algorithms. The second was the development of a stopping criterion for incremental algorithms that minimizes the amount of data required, while maximizing quality. The third and fourth techniques were new ways of combining representative data objects. Five new accelerated algorithms were created to demonstrate the value of these contributions. One additional discovery made during the research was that the key design principles most often improve performance when applied in tandem. This discovery was applied during the creation of the new accelerated algorithms. Experiments show that the new algorithms improve speedup with minimal quality loss, are demonstrably better than related methods and occasionally are an improvement in both speedup and quality over the base algorithm.
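For orientation, a sketch of the baseline Fuzzy c-means loop that the accelerated variants start from; this is the standard formulation with conventional parameter names (m, c, tol), and it shows none of the dissertation's acceleration techniques.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, tol=1e-5, max_iter=300, seed=None):
    """Plain FCM: alternate centroid and membership updates until stable."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))   # memberships; each row sums to 1
    for _ in range(max_iter):
        Um = U ** m
        # Centroids: membership-weighted means of the data.
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # Distances from every point to every center, floored to avoid 0-division.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.maximum(d, 1e-12)
        # Membership update: u_ik proportional to d_ik^(-2/(m-1)).
        inv = d ** (-2.0 / (m - 1))
        U_new = inv / inv.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            return centers, U_new
        U = U_new
    return centers, U
```

The cost of the distance computation grows with the number of data objects on every iteration, which is exactly the bottleneck the accelerated variants target.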
7

Uvolnění nakupovaných dílů do sériové výroby bez vstupní kontroly / Purchased Components Release for Serial Production without Incoming Inspection

Bil, Miroslav January 2015
The theoretical part of this diploma thesis describes methods of incoming inspection, with an emphasis on statistical sampling by attributes. The practical part analyzes the supplier qualification process and the incoming-inspection system at the Kollmorgen company. The outcome of the thesis is a system of sampling plans, a review of the qualification process, and a parameter for monitoring parts released into serial production.
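To illustrate the attribute-sampling machinery such plans build on, a short sketch of the operating-characteristic (OC) curve of a single sampling plan; the plan parameters n = 80 and c = 2 are hypothetical, not Kollmorgen's.

```python
from scipy.stats import binom

def prob_accept(p, n=80, c=2):
    """P(accept lot): inspect n items, accept if at most c are defective."""
    return binom.cdf(c, n, p)

# OC curve: acceptance probability as the true defective fraction rises.
for p in (0.005, 0.01, 0.02, 0.05, 0.10):
    print(f"defective fraction {p:.3f}: P(accept) = {prob_accept(p):.3f}")
```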
8

Porovnání účinnosti návrhů experimentů pro statistickou analýzu úloh s náhodnými vstupy / Performance comparison of methods for design of experiments for analysis of tasks involving random variables

Martinásková, Magdalena January 2014
The thesis presents methods and criteria for the creation and optimization of designs of computer experiments. Using the core of the program Freet, optimized designs were created by combining these methods and criteria. The suitability of the designs for statistical analysis of tasks with random input variables was then assessed by comparing the results obtained for six selected functions against the exact (analytically obtained) solutions. The thesis presents the basic theory, definitions of the evaluated functions, a description of the optimization settings, and a discussion of the obtained results, including recommendations related to identified weaknesses of certain designs. It also describes an application that was created to display the results.
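To make the kind of design under comparison concrete, here is a minimal Latin Hypercube sample together with one simple optimization criterion; Freet combines several methods and criteria, so this sketch shows only the general shape of the idea.

```python
import numpy as np

def latin_hypercube(n_samples, n_vars, seed=None):
    """One stratified sample per variable, randomly paired across variables."""
    rng = np.random.default_rng(seed)
    # Row i starts in stratum i; dividing by n_samples maps it into (0, 1).
    u = (rng.random((n_samples, n_vars)) + np.arange(n_samples)[:, None]) / n_samples
    # Shuffle each column independently to randomize the pairing of strata.
    for j in range(n_vars):
        u[:, j] = rng.permutation(u[:, j])
    return u  # points in the unit hypercube (0, 1)^n_vars

def max_abs_correlation(design):
    """A simple design criterion: worst off-diagonal sample correlation."""
    r = np.corrcoef(design, rowvar=False)
    off_diag = r[~np.eye(r.shape[0], dtype=bool)]
    return np.abs(off_diag).max()
```

Optimizing a design then amounts to regenerating or permuting samples until a criterion such as this one falls below a target, which is the kind of comparison the thesis carries out across methods and criteria.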
