161

Effective Static Debugging via Componential Set-Based Analysis

January 1997
Sophisticated software systems are inherently complex. Understanding, debugging and maintaining such systems requires inferring high-level characteristics of the system's behavior from a myriad of low-level details. For large systems, this quickly becomes an extremely difficult task. MrSpidey is a static debugger that augments the programmer's ability to deal with such complex systems. It statically analyzes the program and uses the results of the analysis to identify and highlight any program operation that may cause a run-time fault. The programmer can then investigate each potential fault site and, using the graphical explanation facilities of MrSpidey, determine whether the fault will really happen or whether the corresponding correctness proof is beyond the analysis's capabilities. In practice, MrSpidey has proven to be an effective tool for debugging programs under development and for understanding existing programs. The key technology underlying MrSpidey is componential set-based analysis. This is a constraint-based, whole-program analysis for object-oriented and functional programs. The analysis first processes each program component (e.g., module or package) independently, generating and simplifying a constraint system describing the data flow behavior of that component. The analysis then combines and solves these simplified constraint systems to yield invariants characterizing the run-time behavior of the entire program. This component-wise approach yields an analysis that handles significantly larger programs than previous analyses of comparable accuracy. The simplification of constraint systems raises a number of questions. In particular, we need to ensure that simplification preserves the observable behavior, or solution space, of a constraint system.
This dissertation provides a complete proof-theoretic and algorithmic characterization of the observable behavior of constraint systems, and establishes a close connection between the observable equivalence of constraint systems and the equivalence of regular tree grammars. We exploit this connection to develop a complete algorithm for deciding the observable equivalence of constraint systems, and to adapt a variety of algorithms for simplifying regular tree grammars to the problem of simplifying constraint systems. The resulting constraint simplification algorithms yield an order of magnitude reduction in the size of constraint systems for typical program expressions.
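The core idea of solving set constraints can be sketched as follows. This is a toy illustration only, not MrSpidey's componential algorithm: the constraint forms (`value ∈ X`, `X ⊆ Y`) and variable names are assumptions, and the least solution is computed by simple fixpoint propagation.

```python
def solve(constraints):
    """Compute the least solution of simple set constraints.

    constraints: list of ('in', value, var)  meaning  value is in var's set,
                 or      ('sub', a, b)       meaning  a's set is a subset of b's.
    Returns a dict mapping each variable to its inferred value set.
    """
    sol = {}
    changed = True
    while changed:  # iterate to a fixpoint
        changed = False
        for c in constraints:
            if c[0] == 'in':
                _, val, var = c
                s = sol.setdefault(var, set())
                if val not in s:
                    s.add(val)
                    changed = True
            else:
                _, a, b = c
                sa = sol.get(a, set())
                sb = sol.setdefault(b, set())
                if not sa <= sb:
                    sb |= sa  # propagate a's values into b
                    changed = True
    return sol

# x may be 1 or "hi", and x flows into f's argument:
cs = [('in', 1, 'x'), ('in', 'hi', 'x'), ('sub', 'x', 'arg_f')]
print(solve(cs)['arg_f'])  # the value set {1, 'hi'} flows into f's argument
```

A static debugger can then flag an operation as a potential fault site whenever the inferred value set for its operand contains a value the operation does not handle.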
162

Optimal Quality Control for Oligo-arrays Using Genetic Algorithm

Li, Ya-hui 17 August 2004
Oligo arrays are a high-throughput technology widely used in many areas of biological and medical research for quantitative, highly parallel measurements of gene expression. When one faulty step occurs during the synthesis process, it affects all probes using that step. In this thesis, a two-phase genetic algorithm (GA) is proposed to design optimal quality control of oligo arrays for detecting any single faulty step. The first phase performs a wide search to obtain approximate solutions, and the second phase performs a local search on the approximate solutions to reach the optimal solution. In addition, the proposed algorithm maintains many non-duplicate individuals and searches multiple regions in parallel. The superior search capability of the two-phase GA helps us find the 275 nonequireplicate cases that were settled by the hill-climbing algorithm. Furthermore, the proposed algorithm also uncovers five more open issues.
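The two-phase structure can be sketched on a toy objective. This is a minimal illustration, not the thesis's method: the fitness function here simply maximizes the number of 1s in a bitstring, whereas the actual fitness encodes oligo-array quality-control constraints, and all parameter values below are assumptions.

```python
import random

def fitness(bits):
    return sum(bits)  # toy objective: count of 1s

def phase1_wide_search(n_bits, pop_size, generations, rng):
    """Phase 1 (wide search): selection, one-point crossover, point mutation."""
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]          # keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_bits)
            child = a[:cut] + b[cut:]          # one-point crossover
            child[rng.randrange(n_bits)] ^= 1  # point mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

def phase2_local_search(bits):
    """Phase 2 (local search): keep each single-bit flip that improves fitness."""
    best = fitness(bits)
    for i in range(len(bits)):
        bits[i] ^= 1
        if fitness(bits) > best:
            best = fitness(bits)
        else:
            bits[i] ^= 1  # revert a non-improving flip
    return bits

rng = random.Random(0)
approx = phase1_wide_search(n_bits=16, pop_size=20, generations=10, rng=rng)
optimum = phase2_local_search(approx)
print(fitness(optimum))  # 16: the local phase finishes the job
```

The division of labor mirrors the thesis's design: the GA phase locates promising regions cheaply, and the deterministic local phase polishes an approximate solution into an optimum.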
163

Identifying nonlinear variation patterns in multivariate manufacturing processes

Zhang, Feng 17 February 2005
This dissertation develops a set of nonlinear variation pattern identification methods that are intended to aid in diagnosing the root causes of product variability in complex manufacturing processes, in which large amounts of high-dimensional in-process measurement data are collected for quality control purposes. First, a nonlinear variation pattern model is presented to generically represent a single nonlinear variation pattern that results from a single underlying root cause, the nature of which is unknown a priori. We propose a modified version of a principal curve estimation algorithm for identifying the variation pattern. Principal curve analysis is a nonlinear generalization of principal components analysis (PCA) that lends itself well to interpretation and also has theoretically rich underpinnings. The principal curve modification involves a dimensionality reduction step that is intended to improve estimation accuracy by reducing noise and improving the robustness of the algorithm with the high-dimensional data typically encountered in manufacturing. An effective visualization technique is also developed to help interpret the identified nonlinear variation pattern and aid in root cause identification and elimination. To further improve estimation robustness and accuracy and reduce computational expense, we propose a local PCA based polygonal line algorithm to identify the nonlinear patterns. We also develop an approach for separating and identifying the effects of multiple nonlinear variation patterns that are present simultaneously in the measurement data. This approach utilizes higher order cumulants and pairwise distance based clustering to separate the patterns and borrows from techniques that are used in linear blind source separation. With the groundwork laid for a versatile, flexible, and powerful nonlinear variation pattern modeling and identification framework, applications in autobody assembly and stamping processes are investigated.
The pattern identification algorithms, together with the proposed visualization approach, provide an effective tool to aid in understanding the nature of the root causes of variation that affect a manufacturing process.
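The dimensionality reduction step that precedes curve fitting can be sketched with plain PCA. This is only an illustration of that preprocessing idea, not the principal curve or polygonal line algorithm itself: the data below are synthetic (a single hidden variation source driving ten measured variables), not from an actual autobody assembly or stamping process.

```python
import numpy as np

rng = np.random.default_rng(0)
t = rng.uniform(-1, 1, 200)  # hidden variation source (the unknown root cause)

# Ten measured variables: two respond nonlinearly to t, eight are pure noise.
X = np.column_stack(
    [t, t**2] + [0.01 * rng.standard_normal(200) for _ in range(8)]
)

# PCA via SVD of the centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)      # variance fraction per component

# Project onto the leading 2-D subspace; the nonlinear (parabolic) pattern
# survives the projection, while most of the noise dimensions are discarded.
scores = Xc @ Vt[:2].T
print(explained[:2].sum())  # nearly all variance lies in two dimensions
```

A principal curve (or polygonal line) fitted in the reduced `scores` space then traces the one-dimensional nonlinear pattern that a single linear principal component cannot capture.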
164

Department of Defense quality management systems and ISO 9000:2000

Lucius, Tommie J. January 2002 (PDF)
Thesis (M.S.)--Naval Postgraduate School, 2002. Thesis advisor(s): Michael Boudreau, Ira Lewis. Includes bibliographical references (p. 129-133). Also available online.
165

An investigation of the type I error rates and power of standard and alternative multivariate tests on means under homogeneous and heterogeneous covariance matrices and multivariate normality and nonnormality

Yockey, Ron David, January 2000
Thesis (Ph. D.)--University of Texas at Austin, 2000. Vita. Includes bibliographical references (leaves 316-324). Available also in a digital version from Dissertation Abstracts.
166

Generalized cumulative sum control charts

McCulloh, Ian. Pignatiello, Joseph J., January 2004
Thesis (M.S.)--Florida State University, 2004. Advisor: Dr. Joseph J. Pignatiello, Jr., Florida State University, College of Engineering, Department of Industrial Engineering. Title and description from dissertation home page (viewed June 17, 2004). Includes bibliographical references.
167

Selecting the best process variables for classification of production batches into quality levels

Anzanello, Michel Jose, January 2009
Thesis (Ph. D.)--Rutgers University, 2009. "Graduate Program in Industrial and Systems Engineering." Includes bibliographical references (p. 78-84).
168

Improve safety, health, and environmental protection through the introduction of Six Sigma

Kaliher, Thomas L. January 2003 (PDF)
Thesis--Plan B (M.S.)--University of Wisconsin--Stout, 2003. Includes bibliographical references.
169

Current issues surrounding the quality of construction documents

Kenniston, Jody Lynn. January 2003
Thesis (M.S.)--Worcester Polytechnic Institute. Keywords: quality construction documents; computers. Includes bibliographical references (p. 78-81).
170

Performance monitoring of run-to-run control systems used in semiconductor manufacturing

Prabhu, Amogh V., 1983- 31 August 2012
Monitoring and diagnosis of control systems, though widely used in the chemical processing industry, are currently lacking in the semiconductor manufacturing industry. This work provides methods for performance assessment of the most commonly used control system in this industry, namely run-to-run process control. First, an iterative solution method for calculating the best achievable performance of the widely used run-to-run Exponentially Weighted Moving Average (EWMA) controller is derived. A normalized performance index is then defined based on the best achievable performance. The effects of model mismatch in the process gain and disturbance model parameter, delays, bias changes, and nonlinearity in the process are then studied. The utility of the method under manufacturing conditions is tested by analyzing three processes from the semiconductor industry. Missing measurements due to delay are estimated using the disturbance model for the process. A minimum-norm estimation method coupled with Tikhonov regularization is developed. Simulations are then carried out to investigate disturbance model mismatch, gain mismatch, and different sampling rates. Next, the forward and backward Kalman filters are applied to obtain the missing values and compared with the previous examples. Manufacturing data from three processes are then analyzed for different sampling rates. Existing methods are compared with a new method for state estimation in high-mix manufacturing. The new method is based on a random walk model for the context states. This approach is also combined with the recursive equations of the Kalman filter. The method is applied to an industrial exposure process by extending the random walk model to an integrated moving average model, with weights used to give preference to the contexts that occur more frequently. Finally, a performance metric is derived for PID controllers when they are used to control nonlinear processes.
Techniques to identify nonlinearity in a process are introduced, and polynomial NARX models are proposed to represent a nonlinear process. A performance monitoring technique used for MIMO processes is then applied. Finally, the method is applied to an EWMA control case used before, a P/PI control case from the literature, and two cases from the semiconductor industry.
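The EWMA run-to-run loop that the assessment methods target can be sketched in a few lines. This is a minimal simulation under assumptions, not the dissertation's performance-index derivation: the process is a static gain plus a constant offset, the model gain is deliberately mismatched, and all numbers are illustrative.

```python
import random

def simulate_ewma(n_runs, lam, beta_true, beta_model, alpha, target,
                  noise_sd, seed=0):
    """Run-to-run EWMA control of y = alpha + beta_true * u + noise.

    The controller only knows beta_model (possibly mismatched) and tracks
    `target` through an EWMA estimate of the unknown offset alpha.
    """
    rng = random.Random(seed)
    a_hat = 0.0   # EWMA estimate of the process offset
    u = 0.0       # recipe (manipulated variable) for the next run
    outputs = []
    for _ in range(n_runs):
        y = alpha + beta_true * u + rng.gauss(0, noise_sd)  # run the process
        outputs.append(y)
        # EWMA offset update, then recipe for the next run:
        a_hat = lam * (y - beta_model * u) + (1 - lam) * a_hat
        u = (target - a_hat) / beta_model
    return outputs

# 20% gain mismatch (beta_true=1.2 vs. beta_model=1.0); with lam=0.3 the loop
# is stable (0 < lam * beta_true / beta_model < 2) and converges to target.
ys = simulate_ewma(n_runs=500, lam=0.3, beta_true=1.2, beta_model=1.0,
                   alpha=2.0, target=10.0, noise_sd=0.1)
mean_late = sum(ys[100:]) / len(ys[100:])
print(round(mean_late, 1))  # settles near the target of 10.0
```

On top of such a simulation, a normalized performance index of the kind described above compares the observed output variance against the best achievable variance for the given disturbance model.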
