  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
651

Product quality modeling and control based on vision inspection with an application to baking processes

Zhang, Yingchuan 14 April 2005 (has links)
Manufacturing industries are facing major challenges in terms of improving product quality and increasing throughput while keeping production costs at acceptable levels. Product-oriented processes, both legacy and new, are poorly monitored and controlled on the basis of distributed loop controllers that aim to maintain critical process variables within acceptable bounds. Thus, poor-quality product results when such processes are subjected to large disturbances: operational failures, environmental changes, and changes in loading conditions. In this research, product quality modeling and control based on a vision inspection methodology is proposed to improve product quality and increase productivity. The main contributions of this research are twofold. First, this research introduces a product quality modeling methodology that combines physics-based modeling with data-driven modeling. The quality model is the link between information coming from the inspection of product features and the specification of process control strategies, and it is essential for controlling and optimizing the process. Physics-based modeling is used to model the product temperature profile, and data-driven modeling is used to train the mapping from the product temperature profile to each quality metric. This decomposition into sub-models increases the flexibility of model development and reduces the effort needed to change the model when the quality metrics change. The second contribution is the development of a novel approach to control product quality based on vision inspection, which is developed as part of a hybrid, hierarchical architecture. The high-level control module involves scheduling of multiple plant processes, diagnostics of failure conditions in the process, and supervision of the whole process. The mid-level control module, which is the focus of the work presented here, takes advantage of baking product quality indicators and oven parameter measurements to optimize zone temperature and conveyor speed set points so that the best product quality is achieved even in the presence of disturbances. The low-level control module consists of basic control loops, each of which separately controls the parameters of one operation in the process; these loops are generally simple and easy to implement.
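As a rough illustration of the kind of two-part quality model described above, the following sketch pairs a physics-based product temperature profile with a data-driven mapping from profile features to a quality metric. The lumped-capacitance heating law, the zone and feature parameters, and the synthetic quality data are all hypothetical stand-ins, not the models actually developed in the thesis.

```python
import numpy as np

# Physics-based part: a lumped-capacitance model of the product temperature
# profile while the product travels through one oven zone (parameters hypothetical).
def temperature_profile(zone_temp, conveyor_speed, zone_length=10.0,
                        h_over_mc=0.02, t_initial=25.0, n_steps=200):
    """Return the simulated product temperature trajectory in a single zone."""
    dwell_time = zone_length / conveyor_speed      # time spent in the zone [s]
    dt = dwell_time / n_steps
    temps = np.empty(n_steps)
    t = t_initial
    for k in range(n_steps):
        t += h_over_mc * (zone_temp - t) * dt      # Newton's law of heating
        temps[k] = t
    return temps

# Data-driven part: map features of the temperature profile to a quality metric
# (e.g., a browning score) with ordinary least squares on synthetic training data.
def profile_features(temps):
    return np.array([temps.mean(), temps.max(), temps[-1]])

rng = np.random.default_rng(0)
X, y = [], []
for _ in range(50):
    zone_temp = rng.uniform(150.0, 250.0)          # zone set point [deg C]
    speed = rng.uniform(0.05, 0.20)                # conveyor speed [m/s]
    f = profile_features(temperature_profile(zone_temp, speed))
    X.append(np.concatenate(([1.0], f)))           # intercept + features
    y.append(0.005 * f[0] + 0.01 * f[1] + rng.normal(0.0, 0.05))  # fake metric
X, y = np.array(X), np.array(y)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)       # fitted quality mapping
```

In this structure, changing the quality metric only requires retraining the data-driven mapping while the temperature sub-model stays fixed, which is the flexibility argument made in the abstract.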
652

Dynamic Real-time Optimization and Control of an Integrated Plant

Tosukhowong, Thidarat 25 August 2006 (has links)
Applications of existing steady-state plant-wide optimization and single-scale fast-rate dynamic optimization strategies to an integrated plant with material recycle have been impeded by several factors. While the steady-state optimization formulation is very simple, the very long transient dynamics of an integrated plant force the optimizer's execution rate to be extremely low, yielding suboptimal performance. In contrast, performing dynamic plant-wide optimization at the same rate as the local controllers requires an exorbitant on-line computational load and may increase sensitivity to high-frequency dynamics that are irrelevant to the plant-level interactions, which are slow-scale in nature. This thesis proposes a novel multi-scale dynamic optimization and control strategy suitable for an integrated plant. The dynamic plant-wide optimizer in this framework executes at a slow rate to track the slow-scale plant-wide interactions and economics, while leaving the local controllers to handle fast changes related to the local units. Moreover, this slow execution rate imposes lower computational and modeling requirements than a fast-rate optimizer. An important issue for this method is obtaining a suitable dynamic model when first-principles models are unavailable. The difficulties in the system identification process are designing a proper input signal to excite this ill-conditioned system and handling the lack of slow-scale dynamic data when the plant experiment cannot be run for long relative to the settling time. This work presents a grey-box modeling method that incorporates steady-state information to improve model prediction accuracy. A case study of an integrated plant example is presented to address the limitations of nonlinear model predictive control (NMPC) in terms of on-line computation and its inability to handle stochastic uncertainties. Then, the approximate dynamic programming (ADP) framework is investigated. This method computes an optimal operating policy under uncertainty off-line, so the on-line multi-stage optimization can be transformed into a single-stage problem, drastically reducing the real-time computational effort. However, the existing ADP framework is not suitable for an integrated plant with high-dimensional state and action spaces. In this study, we combine several techniques with ADP to apply nonlinear optimal control to the integrated plant example and show its efficacy over NMPC.
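The grey-box idea mentioned above, using steady-state information to compensate for the shortage of slow-scale dynamic data, can be sketched as fitting a simple discrete-time model whose steady-state gain is penalized toward an independently known value. Everything below (the plant, the data, the model order, and the penalty weight) is a hypothetical illustration rather than the thesis's actual case study.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Hypothetical short dynamic experiment: the plant settles far more slowly than
# the test duration, so the data alone pin down the low-frequency behaviour poorly.
n = 120
u = np.sign(rng.standard_normal(n)) * 0.5          # crude two-level test input
a_true, b_true = 0.95, 0.08                        # unknown "true" plant
y = np.zeros(n)
for k in range(1, n):
    y[k] = a_true * y[k - 1] + b_true * u[k - 1] + rng.normal(0.0, 0.01)

# Steady-state gain known separately (e.g., from historical steady-state records);
# here it is computed from the "true" plant purely for the sake of the sketch.
K_ss = b_true / (1.0 - a_true)

def loss(theta, weight=50.0):
    a, b = theta
    y_hat = np.zeros(n)
    for k in range(1, n):
        y_hat[k] = a * y_hat[k - 1] + b * u[k - 1]
    fit = np.mean((y - y_hat) ** 2)                # dynamic prediction error
    gain_penalty = (b / (1.0 - a) - K_ss) ** 2     # steady-state information
    return fit + weight * gain_penalty

res = minimize(loss, x0=[0.5, 0.5], bounds=[(0.0, 0.999), (-5.0, 5.0)])
a_hat, b_hat = res.x
```

Without the gain penalty, a short experiment like this tends to leave the steady-state behaviour poorly identified; the penalty anchors the fitted model to the known steady state.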
653

Changing Labour Market Positions And Workplace Interactions Of Irregular Moldovan Migrants: The Case Of Textile/clothing Sector In Istanbul, Turkey

Dagdelen, Gorkem 01 August 2008 (has links) (PDF)
The new international division of labour has transformed the economic structure of Turkey from an import-substituting to an export-oriented economy. Starting in the early 1990s, many Moldovan migrants began to come to Turkey to work temporarily in the informal economy. They worked in clothing and shoe ateliers until the beginning of this century; nowadays many Moldovan migrants work in clothing shops as Russian-speaking sales assistants and in cargo firms as carriers. Based on this historical context, this study explores the changing labour market position and workplace interactions of irregular Moldovan migrants working in the textile/clothing sector in Istanbul, Turkey. First, I try to understand the mechanisms behind the changing labour market positions of irregular migrants by focusing on the factors and agents driving these dynamic processes. Second, I analyze the labour process control regimes and resistance in the workplaces where migrants work. With this aim in view, I conducted field research in Istanbul consisting of 35 in-depth and informal interviews with Moldovan migrants, Turkish employers and Turkish employees. Analysis of my findings shows, first, that although foreign workers cannot change exploitative working conditions, they can find ways of escaping them within a given context. Second, the level of exploitation in informal working conditions is determined not only by the necessities of capitalist accumulation regimes and the migration policies of the state, but also by the preferences of employers, which rest on both economic and cultural motives.
654

Controlling High Quality Manufacturing Processes: A Robustness Study Of The Lower-Sided TBE EWMA Procedure

Pehlivan, Canan 01 September 2008 (has links) (PDF)
In quality control applications, Time-Between-Events (TBE) type observations may be monitored by using Exponentially Weighted Moving Average (EWMA) control charts. A widely accepted model for TBE processes is the exponential distribution, and hence TBE EWMA charts are designed under this assumption. Nevertheless, practical applications do not always conform to the theory, and it is common that the observations do not fit the exponential model. Therefore, control charts that are robust to departures from the assumed distribution are desirable in practice. In this thesis, the robustness of lower-sided TBE EWMA charts to the assumption of exponentially distributed observations has been investigated. Weibull and lognormal distributions are considered in order to represent departures from the assumed exponential model, and a Markov chain approach is utilized for evaluating the performance of the chart. By analyzing the performance results, design settings are suggested in order to achieve robust lower-sided TBE EWMA charts.
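For reference, a lower-sided TBE EWMA chart smooths the observed times between events and signals when the smoothed value falls below a lower control limit, indicating that events are occurring too frequently. The sketch below estimates run lengths by simulation for exponential data and for a Weibull departure with the same mean; the thesis itself evaluates performance with a Markov chain approach, and all design constants here are hypothetical.

```python
import math
import numpy as np

rng = np.random.default_rng(2)

def lower_tbe_ewma_run_length(tbe_sample, theta0=1.0, lam=0.1, L=2.0):
    """Return the observation index at which the lower-sided chart signals."""
    # Asymptotic lower control limit for exponential(theta0) observations.
    lcl = theta0 - L * theta0 * math.sqrt(lam / (2.0 - lam))
    z = theta0                                  # start the EWMA at the target mean
    for i, x in enumerate(tbe_sample, start=1):
        z = (1.0 - lam) * z + lam * x
        if z < lcl:
            return i
    return len(tbe_sample)

def simulated_arl(draw, n_runs=500, run_cap=5000):
    """Crude (capped) average run length estimate from repeated simulation."""
    return np.mean([lower_tbe_ewma_run_length(draw(run_cap)) for _ in range(n_runs)])

# Exponential in-control data versus a Weibull departure with the same unit mean.
shape = 1.5
arl_exponential = simulated_arl(lambda n: rng.exponential(1.0, n))
arl_weibull = simulated_arl(lambda n: rng.weibull(shape, n) / math.gamma(1 + 1 / shape))
print(arl_exponential, arl_weibull)
```

Comparing the two run-length averages under identical chart settings is, in miniature, the robustness question the thesis studies.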
655

A Two-Sided CUSUM For First-Order Integer-Valued Autoregressive Processes Of Poisson Counts

Yontay, Petek 01 July 2011 (has links) (PDF)
Count data are often encountered in manufacturing and service industries because of the ease of data collection. These counts can be useful in process monitoring to detect shifts of a process from an in-control state to various out-of-control states. It is usually assumed that the observations are independent and identically distributed. However, in practice, observations may be autocorrelated, and this may adversely affect the performance of control charts developed under the assumption of independence. In this thesis, the cumulative sum (CUSUM) control chart for monitoring autocorrelated processes of counts is investigated. To describe the autocorrelation structure of the counts, a first-order Poisson integer-valued autoregressive model, Poisson INAR(1), is employed. Changes in the process mean in both the positive and negative directions are taken into account in designing the CUSUM chart. A trivariate Markov chain approach is utilized for evaluating the performance of the chart.
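As a rough illustration, the sketch below simulates a Poisson INAR(1) count series via binomial thinning and runs a two-sided CUSUM on it. The thesis evaluates the chart with a trivariate Markov chain, whereas this sketch only produces a crude simulated run-length estimate, and all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_poisson_inar1(n, alpha=0.5, lam=2.0, x0=4):
    """Poisson INAR(1): X_t = (alpha o X_{t-1}) + Poisson(lam), via binomial thinning."""
    x = np.empty(n, dtype=int)
    prev = x0
    for t in range(n):
        prev = rng.binomial(prev, alpha) + rng.poisson(lam)
        x[t] = prev
    return x

def two_sided_cusum_run_length(counts, k_plus=5.0, k_minus=3.0,
                               h_plus=8.0, h_minus=8.0):
    """Return the first time either the upper or the lower CUSUM signals."""
    c_plus = c_minus = 0.0
    for t, x in enumerate(counts, start=1):
        c_plus = max(0.0, c_plus + x - k_plus)     # sensitive to upward mean shifts
        c_minus = max(0.0, c_minus + k_minus - x)  # sensitive to downward mean shifts
        if c_plus > h_plus or c_minus > h_minus:
            return t
    return len(counts)

# Crude, capped in-control run-length estimate (in-control mean = lam / (1 - alpha) = 4).
run_lengths = [two_sided_cusum_run_length(simulate_poisson_inar1(2000))
               for _ in range(500)]
print(np.mean(run_lengths))
```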
656

Optimal filter design approaches to statistical process control for autocorrelated processes

Chin, Chang-Ho 01 November 2005 (has links)
Statistical Process Control (SPC), and in particular control charting, is widely used to achieve and maintain control of various processes in manufacturing. A control chart is a graphical display that plots quality characteristics versus the sample number or the time line. Interest in effective implementation of control charts for autocorrelated processes has increased in recent years. However, because of the complexities involved, few systematic design approaches have thus far been developed. Many control charting methods can be viewed as the charting of the output of a linear filter applied to the process data. In this dissertation, we generalize the concept of linear filters for control charts and propose new control charting schemes, the general linear filter (GLF) and the 2nd-order linear filter, based on the generalization. In addition, their optimal design methodologies are developed, where the filter parameters are optimally selected to minimize the out-of-control Average Run Length (ARL) while constraining the in-control ARL to some desired value. The optimal linear filters are compared with other methods in terms of ARL performance, and a number of their interesting characteristics are discussed for various types of mean shifts (step, spike, sinusoidal) and various ARMA process models (i.i.d., AR(1), ARMA(1,1)). Also, in this work, a new discretization approach for substantially reducing the computational time and memory use for the Markov chain method of calculating the ARL is proposed. Finally, a gradient-based optimization strategy for searching optimal linear filters is illustrated.
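To make the "control chart as linear filter" view concrete, the sketch below charts the output of a finite linear filter applied to AR(1) data and estimates run lengths by simulation. The weights shown are a truncated EWMA, one special case of a linear filter; the control limit and process parameters are hypothetical rather than the dissertation's optimized designs.

```python
import numpy as np

rng = np.random.default_rng(4)

def ar1(n, phi=0.5, sigma=1.0, shift=0.0):
    """AR(1) process data with an optional step shift added to the mean."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)
    return x + shift

def linear_filter_run_length(data, weights, limit):
    """Chart the output of a finite linear filter and return the run length."""
    stats = np.convolve(data, weights, mode="valid")   # filtered chart statistic
    signals = np.nonzero(np.abs(stats) > limit)[0]
    return signals[0] + len(weights) if signals.size else len(data)

# A truncated EWMA is one special case of a general linear filter.
lam = 0.2
weights = lam * (1.0 - lam) ** np.arange(30)

arl_in_control = np.mean([linear_filter_run_length(ar1(5000), weights, 0.9)
                          for _ in range(200)])
arl_step_shift = np.mean([linear_filter_run_length(ar1(5000, shift=1.0), weights, 0.9)
                          for _ in range(200)])
print(arl_in_control, arl_step_shift)
```

Optimizing over the weight vector itself, rather than fixing an EWMA form, is what the GLF and 2nd-order linear filter designs in the dissertation set out to do.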
657

A framework of statistical process control for software development

Shih, Tsung-Yo 03 August 2009 (has links)
In the era of globalization, software companies around the world must face not only competition in the domestic industry but also the challenge posed by large international companies. For this reason, domestic software companies must improve the quality of their software. Domestic government agencies and non-governmental organizations jointly promote Capability Maturity Model Integration (CMMI), hoping to improve the quality of software development processes through internationally recognized professional appraisal. To move toward a high-maturity software development process, the process should be managed quantitatively at CMMI Level 4. Frequently used statistical process control (SPC) methods include control charts, fishbone diagrams, Pareto charts and other related practices. Their goal is to maintain the stability of the overall software development process so that output performance becomes predictable. SPC was originally applied in the manufacturing industry, where it successfully improved product quality. However, some characteristics of software development, such as its human-intensive and innovative nature, increase both the variability to be controlled and the difficulty of implementation. In this study, we collate and analyze an operational framework for SPC and CMMI Level 4 through a literature review and a case study of the practices of the case company (Company A). The framework covers two perspectives: an organizational point of view and a methodological point of view. The organizational point of view includes the stages of implementing CMMI Level 4 and SPC in the software industry, as well as how to design the organizational structure. The methodological point of view includes the steps for running SPC, together with useful methods and tools, and also draws on control theory to collate the relevant control mechanisms. Finally, we illustrate how to integrate SPC into the company's system development life cycle. The framework can serve as a reference for domestic software companies that wish to implement CMMI Level 4 and SPC.
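As a small illustration of the kind of control chart such a framework might apply to software process data, the sketch below builds an individuals/moving-range (XmR) chart for per-release defect density; the figures are invented for the example and are not from the case company.

```python
import numpy as np

# Hypothetical defect densities (defects per KLOC) for successive releases.
defect_density = np.array([3.1, 2.8, 3.5, 2.9, 3.3, 4.0, 2.7, 3.2, 3.6, 5.9])

# Individuals / moving-range (XmR) chart, a common SPC chart for the low-volume,
# one-observation-at-a-time data typical of software development.
moving_range = np.abs(np.diff(defect_density))
center = defect_density.mean()
mr_bar = moving_range.mean()
ucl = center + 2.66 * mr_bar        # 2.66 converts the mean moving range to 3-sigma
lcl = max(0.0, center - 2.66 * mr_bar)

out_of_control = np.nonzero((defect_density > ucl) | (defect_density < lcl))[0]
print(f"CL={center:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}  signals at releases {out_of_control}")
```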
658

An integrated SPC/EPC system for fault diagnosis

Chang, Hsuan-Kai. January 2009 (has links)
Thesis (M.S.)--State University of New York at Binghamton, Thomas J. Watson School of Engineering and Applied Science, Department of Systems Science and Industrial Engineering, 2009. / Includes bibliographical references.
659

PILOT SCALE DEMONSTRATION AND EVALUATION OF INNOVATIVE NON-DESLIMED NON-CLASSIFIED GRAVITY-FED HM CYCLONE

Zhang, Yumo 01 January 2015 (has links)
Coal preparation plants are required in some cases to produce a high-grade product using a low specific gravity cut-point. For these situations, a second, higher-gravity separation would be desirable to generate a mid-grade product that can be utilized for electricity generation, thereby maximizing coal recovery. A study was conducted to evaluate the potential of achieving efficient separations at two different density cut-points in a single stage using a three-product dense medium cyclone. Variations in density cut-point and process efficiency values were quantified as a function of the feed medium density, feed medium-to-coal ratio, and feed pressure using a three-level experimental design program. Results indicate the ability to effectively treat coal over a particle size range from 6 mm to 0.15 mm while achieving both low- and high-density cut-points up to 1.95 relative density. Ash content decreased from 27.98% in the feed to an average of 7.77% in the clean coal product and 25.76% in the middlings product, while sulfur content was reduced from 3.87% to 2.83% in the clean coal product. The overall combustible recovery was maintained above 90% while producing clean coal products with ash and total sulfur contents as low as 5.85% and 2.68%, respectively. Organic efficiency values were consistently about 95%, and probable error values were in the range of 0.03 to 0.05, which indicates the ability to provide a separation performance equivalent to or better than traditional coal cleaning technologies.
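For readers unfamiliar with the metric, combustible recovery is the fraction of the feed's non-ash material that reports to the clean coal product. The short sketch below evaluates it using the ash contents quoted in the abstract together with hypothetical clean coal mass yields, since the actual yields are not stated here.

```python
# Combustible recovery for a coal cleaning separation:
#   recovery (%) = yield * (100 - product ash) / (100 - feed ash)
# Ash contents are taken from the abstract; the yield values are hypothetical.

feed_ash = 27.98        # percent ash in the cyclone feed
clean_ash = 7.77        # percent ash in the clean coal product

def combustible_recovery(clean_yield_pct, feed_ash_pct, product_ash_pct):
    """Percentage of the feed's combustible material reporting to clean coal."""
    return clean_yield_pct * (100.0 - product_ash_pct) / (100.0 - feed_ash_pct)

for clean_yield in (60.0, 70.0, 80.0):      # hypothetical mass yields to clean coal
    cr = combustible_recovery(clean_yield, feed_ash, clean_ash)
    print(f"yield {clean_yield:.0f}% -> combustible recovery {cr:.1f}%")
```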
660

Multivariate fault detection and visualization in the semiconductor industry

Chamness, Kevin Andrew 28 August 2008 (has links)
Not available / text
