  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
141

Dynamic Control of Serial-batch Processing Systems

Cerekci, Abdullah 14 January 2010 (has links)
This research explores how near-future information can be used to strategically control a batch processor in a serial-batch processor system setting. Specifically, control is improved by using the upstream serial processor to provide near-future arrival information to the batch processor and to meet re-sequencing requests that shorten critical products' arrival times to the batch processor. The objective of the research is to reduce the mean cycle time and mean tardiness of the products processed by the serial-batch processor system. This research first examines how the mean cycle time performance of the batch processor can be improved by an upstream re-sequencing approach. A control strategy is developed by combining a look-ahead control approach with an upstream re-sequencing approach and is then compared with benchmark strategies through simulation. The experimental results indicate that the new control strategy effectively improves the mean cycle time performance of the serial-batch processor system, especially when the number of product types is large and batch processor traffic intensity is low or medium. These conditions are often observed in typical semiconductor manufacturing environments. Next, the use of near-future information and an upstream re-sequencing approach is investigated for improving the mean tardiness performance of the serial-batch processor system. Two control strategies are devised and compared with the benchmark strategies through simulation. The experimental results show that the proposed control strategies improve the mean tardiness performance of the serial-batch processor system. Finally, the look-ahead control approaches that focus on mean cycle time and mean tardiness are embedded in a new control strategy that addresses both performance measures simultaneously.
It is demonstrated that look-ahead batching can be effectively used as a tool for controlling batch processors when multiple performance measures exist.
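The look-ahead batching idea described above can be sketched as a simple decision rule: start the batch processor immediately when it can be fully loaded, and otherwise use near-future arrival information from the upstream serial processor to decide whether waiting is worthwhile. This is a minimal illustration only, not the control strategies developed in the thesis; the function name, parameters, and threshold rule are assumptions.

```python
def should_start_batch(queue_len, batch_capacity, next_arrival_eta, max_wait):
    """Decide whether to load the batch processor now or keep waiting.

    A full batch always starts immediately.  A partial batch waits only
    if near-future information (from the upstream serial processor) says
    another product will arrive within `max_wait` time units.
    """
    if queue_len == 0:
        return False                 # nothing to process yet
    if queue_len >= batch_capacity:
        return True                  # full load: start now
    if next_arrival_eta is not None and next_arrival_eta <= max_wait:
        return False                 # a nearby arrival will improve the load
    return True                      # no useful arrival in sight: run a partial batch
```

For example, with a capacity-4 processor holding 2 products, the rule waits when the next arrival is 1.5 time units away (under a 2-unit threshold) but starts a partial batch when the next arrival is 5 units away or unknown.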
142

Multivariate Bayesian Process Control

Yin, Zhijian 01 August 2008 (has links)
Multivariate control charts are valuable tools for multivariate statistical process control (MSPC), used to monitor industrial processes and to detect abnormal process behavior. It has been shown in the literature that Bayesian control charts are optimal tools for controlling a process compared with non-Bayesian charts. To use any control chart, three control chart parameters must be specified, namely the sample size, the sampling interval, and the control limit. Traditionally, control chart design has been based on statistical performance. Recently, industrial practitioners and academic researchers have increasingly recognized the cost benefits obtained by applying economically designed control charts to quality control, equipment condition monitoring, and maintenance decision-making. The primary objective of this research is to design multivariate Bayesian control charts (MVBCH) both for quality control and for condition-based maintenance (CBM) applications. Although considerable research has been done to develop MSPC tools under the assumption that the observations are independent, little attention has been given to the development of MSPC tools for monitoring multivariate autocorrelated processes. In this research, we compare the performance of the squared prediction error (SPE) chart using a vector autoregressive moving average with exogenous variables (VARMAX) model and a partial least squares (PLS) model for a multivariate autocorrelated process. The study shows that the use of SPE control charts based on the VARMAX model allows rapid detection of process disturbances while reducing false alarms. Next, the economic and economic-statistical design of an MVBCH for quality control is developed, using the control limit policy proved optimal by Makis (2007). The computational results illustrate that the MVBCH performs considerably better than the MEWMA chart, especially for smaller mean shifts.
Sensitivity analyses further explore the impact of a misspecified out-of-control mean on the actual average cost. Finally, the design of an MVBCH for CBM applications is considered using the same control limit policy structure and including an observable failure state. Optimization models for the economic and economic-statistical design of the MVBCH for a three-state CBM model are developed, and comparison results show that the MVBCH performs better than the recently developed CBM Chi-square chart.
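The SPE statistic at the heart of the charts compared above measures how far a new observation falls outside the latent-variable subspace fitted to in-control data. The sketch below is an assumption of this edit rather than the thesis's VARMAX- or PLS-based implementation: it uses a plain PCA projection to show the mechanics — project each observation onto the retained components, and the SPE is the squared norm of the residual.

```python
import numpy as np

def spe_statistics(X_train, X_new, n_components=2):
    """SPE (squared prediction error) of each row of X_new relative to a
    PCA model fitted on in-control training data X_train."""
    mu = X_train.mean(axis=0)
    _, _, Vt = np.linalg.svd(X_train - mu, full_matrices=False)
    P = Vt[:n_components].T                      # loadings, shape (n_vars, n_components)
    R = (X_new - mu) - (X_new - mu) @ P @ P.T    # residual outside the model subspace
    return np.sum(R ** 2, axis=1)

def spe_control_limit(X_train, n_components=2, pct=99.0):
    """Empirical control limit: a high percentile of the training SPE values."""
    return np.percentile(spe_statistics(X_train, X_train, n_components), pct)
```

An observation whose SPE exceeds the limit signals a disturbance that the latent model cannot explain; a parametric limit (e.g. the Jackson-Mudholkar approximation) would replace the empirical percentile in practice.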
144

PID Control. Servo/regulation performance and robustness issues

Arrieta Orozco, Orlando 12 November 2010 (has links)
No description available.
145

Retrofitting Analysis for Improving Benefits of A/O WWTPs Considering Process Control Aspects

Cunha Machado, Vinicius 24 February 2012 (has links)
A methodology was developed for retrofitting existing anoxic/oxic (A/O) wastewater treatment plants (WWTPs), originally designed to remove only organic matter (COD) and nitrogen (N), so that they perform enhanced biological phosphorus removal (EBPR) and thus remove COD, N, and phosphorus (P) simultaneously, taking process control aspects into account and seeking the best operating performance. The proposed methodology exhaustively searches for a process model, using existing plant data to determine the current kinetic parameters. The plant model is calibrated using a methodology based on the Fisher information matrix (FIM). Using the plant model structure, new plant configurations are proposed, and a set of criteria is used to identify which alternative is best. The criteria include effluent quality, robustness of the process control structure, operating costs, and investment costs for purchasing equipment and changing the plant layout. The feasibility of phosphorus-accumulating organism (PAO) growth and the effect of these species on the existing process control structure are also studied.
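The Fisher-information-based calibration mentioned above can be illustrated with a small sketch. Under a least-squares calibration with i.i.d. Gaussian measurement noise, the FIM is built from the sensitivity matrix of model outputs with respect to the kinetic parameters, and scalar criteria such as D-optimality summarize identifiability. This is a generic illustration under those stated assumptions, not the thesis's actual calibration code; the function names are illustrative.

```python
import numpy as np

def fisher_information(J, sigma=1.0):
    """FIM for a least-squares fit with i.i.d. measurement noise of
    standard deviation `sigma`.  J is the (n_obs, n_params) sensitivity
    matrix dy/dtheta evaluated at the current parameter estimate."""
    return (J.T @ J) / sigma ** 2

def d_criterion(fim):
    """D-optimality criterion: a larger determinant of the FIM indicates
    better joint identifiability of the parameter set."""
    return float(np.linalg.det(fim))
```

A near-singular FIM (determinant close to zero) warns that some kinetic parameters cannot be estimated independently from the available plant data.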
146

Simultaneous Design and Control of Chemical Plants: A Robust Modelling Approach

Ricardez Sandoval, Luis Alberto January 2008 (has links)
This research work presents a new methodology for the simultaneous design and control of chemical processes. One of the most computationally demanding tasks in the integration of process control and process design is the search for worst-case scenarios that result in maximal output variability or in process variables being at their constraint limits. The key idea in the current work is to find these worst scenarios by using tools borrowed from robust control theory. To apply these tools, the closed-loop dynamic behaviour of the process to be designed is represented as a robust model. Accordingly, the process is mathematically described by a nominal linear model with uncertain model parameters that vary within identified ranges of values. These robust models, obtained from closed-loop identification, are used in the present method to test the robust stability of the process and to estimate bounds on the worst deviations in process variables in response to external disturbances. The first approach proposed to integrate process design and process control made use of robust tools based on the Quadratic Lyapunov Function (QLF). These tests require the identification of an uncertain state-space model that is used to evaluate the process's asymptotic stability and to estimate a bound (γ) on the root-mean-square (RMS) gain of the model output variability. This bound is used to assess the worst-case process variability and to evaluate bounds on the deviations in process variables that are to be kept within constraints. Then, these robustness tests are embedded within an optimization problem that searches for the optimal design and controller tuning parameters minimizing a user-specified cost function. Since the value of γ is a bound on one standard deviation of the model output variability, larger multiples of this value, e.g. 2γ and 3γ, were used to provide more realistic bounds on the worst deviations in process variables.
This methodology (γ-based) was applied to the simultaneous design and control of a mixing tank process. Although this approach resulted in conservative designs, it posed a nonlinear constrained optimization problem that required less computational effort than a Dynamic Programming approach, which had been the main method previously reported in the literature. While the γ-based robust performance criterion provides a root-mean-square measure of the variability, it does not provide information on the worst possible deviation. In order to search for the worst deviation, the present work proposed a new robust variability measure based on Structured Singular Value (SSV) analysis, also known as μ-analysis. The calculation of this measure also returns the critical time-dependent profile of the disturbance that generates the maximum model output error. This robust measure is based on robust finite impulse response (FIR) closed-loop models that are directly identified from simulations of the full nonlinear dynamic model of the process. As in the γ-based approach, the simultaneous design and control of the mixing tank problem was considered using this new μ-based methodology. Comparisons between the γ-based and μ-based strategies were discussed. Also, the computational time required to assess the worst-case process variability by the proposed μ-based method was compared to that required by a Dynamic Programming approach. Similarly, the expected computational burden required by this new μ-based robust variability measure to estimate the worst-case variability for large-scale processes was assessed. The results show that this new robust variability tool is computationally efficient and can potentially be implemented to achieve the simultaneous design and control of chemical plants. Finally, the Structured Singular Value-based (μ-based) methodology was used to perform the simultaneous design and control of the Tennessee Eastman (TE) process.
Although this chemical process has been widely studied in the Process Systems Engineering (PSE) area, the integration of design and control of this process had not been previously studied. The problem is challenging since it is open-loop unstable and exhibits highly nonlinear dynamic behaviour. To assess the contributions of different sections of the TE plant to the overall costs, two optimization scenarios were considered. The first scenario considered only the reactor section of the TE process, whereas the second scenario analyzed the complete TE plant. To study the interactions between design and control in the reactor section of the plant, the effect of different parameters on the resulting design and control schemes was analyzed. For this scenario, an alternative calculation of the variability was considered, whereby the variability was obtained from numerical simulations of the worst disturbance instead of using the analytical μ-based bound. Comparisons between the analytical-bound-based strategy and the simulation-based strategy were discussed. Additionally, a comparison of the computational effort required by the present solution strategy and that required by a Dynamic Programming-based approach was conducted. Subsequently, the topic of parameter uncertainty was investigated. Specifically, uncertainty in the reaction rate coefficient was considered in the analysis of the TE problem. Accordingly, the optimization problem was expanded to account for a set of different values of the reaction rate constant. Due to the complexity associated with the second scenario, the effect of uncertainty in the reaction constant was studied only for the first scenario, corresponding to the optimization of the reactor section. The results obtained from this research project show that Dynamic Programming requires a CPU time that is almost two orders of magnitude larger than that required by the methodology proposed here.
Likewise, the consideration of uncertainty in a physical parameter within the analysis, such as the reaction rate constant in the Tennessee Eastman problem, was shown to dramatically increase the computational load when compared to the case in which there is no process parametric uncertainty in the analysis. In general, the integration of design and control within the analysis resulted in a plant that is more economically attractive than that specified by solely optimizing the controllers but leaving the design of the different units fixed. This result is particularly relevant for this research work since it justifies the need for conducting simultaneous process design and control of chemical processes. Although the application of the robust tools resulted in conservative designs, the method has been shown to be an efficient computational tool for simultaneous design and control of chemical plants.
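For the FIR closed-loop models used above, a simple analogue of the worst-case variability computation is the peak-gain (l1-norm) bound: for y(k) = Σ h_i d(k−i) with |d| ≤ d_max, the largest output deviation is d_max · Σ|h_i|, attained by the critical disturbance profile d(k−i) = d_max · sign(h_i). The sketch below computes this bound and the profile; it is a simplified stand-in for the thesis's μ-based measure, not that measure itself.

```python
import numpy as np

def worst_case_deviation(h, d_max=1.0):
    """Peak output deviation of the FIR model y(k) = sum_i h[i]*d(k-i)
    over all disturbances with |d| <= d_max, together with the critical
    disturbance profile that attains it (d(k-i) = d_max * sign(h[i]))."""
    h = np.asarray(h, dtype=float)
    worst_profile = d_max * np.sign(h)   # critical time-dependent disturbance
    bound = d_max * np.abs(h).sum()      # induced peak gain: the l1 norm of h
    return bound, worst_profile
```

Note that `np.sign(0.0)` is 0, which is harmless here: a zero FIR coefficient contributes nothing to the output regardless of the disturbance value at that lag.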
149

Reducing groupthink problem using activity control in mobile collaborative learning environments

Chen, Sih-ying 28 July 2008 (has links)
The learner-centered concept has become an important trend in education, and collaborative learning is one realization of it. Many studies have shown that collaborative learning outperforms traditional competitive and individual learning. However, practical issues remain when applying collaborative learning, and groupthink is one of them. Although researchers have developed many mechanisms to address the practical issues of collaborative learning, groupthink cannot be eliminated effectively. Because of groupthink, learning groups may jump to conclusions before a full knowledge-sharing and knowledge-construction process has been carried out. This runs against the spirit of social constructionism, the theory underlying collaborative learning, and reduces its benefits. Learning process control is an effective mechanism for reducing the groupthink effect, but it often fails to work as expected because of inappropriate guidance from instructors or the differing leading styles of group leaders. Therefore, this study develops a system that supports the learning process control mechanism on mobile devices, to ensure the mechanism is properly executed. The results show that self-censorship, one symptom of groupthink, is reduced significantly when the learning process control mechanism is executed on mobile devices rather than on traditional worksheets. Moreover, in groups that used mobile device support to execute learning process control, all four symptoms of groupthink were greatly reduced compared with groups without any learning process control, and learners showed better learning attitudes and higher levels of interaction. This study therefore demonstrates that groupthink can be reduced by executing the learning process control mechanism on mobile devices.
Finally, some future research topics are proposed based on the research results.
150

Design and analysis of multivariable predictive control applied to an oil-water-gas separator: a polynomial approach

Nunes, Giovani Cavalcanti, January 2001 (has links) (PDF)
Thesis (Ph.D.), University of Florida, 2001. Title from first page of PDF file. Document formatted into pages; contains viii, 118 p.; also contains graphics. Vita. Includes bibliographical references (p. 115-117).
