1

Real-time process control and simulation for chemical mix facility

Liu, Pi-Shien, 1960- January 1988 (has links)
The purpose of this study is to design a real-time control and simulation system for a chemical mix facility. A simulation circuit board and simulation software on an IBM personal computer emulated the real-time chemical mix facility, and a second personal computer controlled the plant. The parallel port of the IBM PC served as the communication path between the controlled and controlling systems. Results show that the simulation can assist the design of the actual system.
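As a rough illustration of the control loop described above (not the thesis's implementation), the sketch below steps a toy mixing-tank model and a proportional controller in lock-step; a plain function call stands in for the parallel-port link between the two PCs, and all gains and constants are assumptions.

```python
# Illustrative sketch (assumed gains and constants, not the thesis's code): a toy
# mixing-tank model and a proportional controller stepped in lock-step; a plain
# function call stands in for the parallel-port link between the two PCs.

def simulate(setpoint=0.5, dt=1.0, steps=60):
    conc = 0.0           # simulated tank concentration ("plant PC" state)
    tau = 10.0           # assumed mixing time constant, seconds
    kp = 2.0             # assumed proportional gain
    history = []
    for step in range(steps):
        error = setpoint - conc                    # "controller PC" side
        inflow = max(0.0, min(1.0, kp * error))    # clamp command to [0, 1]
        conc += dt / tau * (inflow - conc)         # "plant PC" side: mixing dynamics
        history.append((step * dt, conc, inflow))
    return history

if __name__ == "__main__":
    for t, c, u in simulate()[::10]:
        print(f"t={t:4.0f}s  conc={c:.3f}  command={u:.3f}")
```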
2

Development towards intelligent design for assembly

Hsu, Hung-Yao January 2001 (has links)
This thesis addresses research towards the development of an intelligent design for assembly evaluation system (IDFAES) based on design for assembly (DFA) principles. The research project aimed to enhance the capability of existing DFA methodologies in order to support activities such as redesign, design modification and assembly planning during the product development cycle.
3

Collaborative Response to Disruption Propagation (CRDP)

Phuc V Nguyen (8779382) 01 May 2020 (has links)
Disruptive events during recent decades have highlighted the vulnerabilities of complex systems of systems to disruption propagation: disruptions that start in one part of a system and propagate to other parts. Examples include fire spreading in building complexes and forests; plant/crop diseases in agricultural production systems; propagating malware in computer networks and cyber-physical systems; and disruptions in supply networks. The impacts of disruption propagation are devastating, with fire causing an annual US$23 billion loss in the US alone, plant/crop diseases reducing agricultural productivity by 20% to 40% annually, and computer malware causing up to US$2.3 billion loss per event (as a conservative estimate). These response to disruption propagation (RDP) problems are challenging due to the involvement of different problem aspects and their complex dynamics. To better design and control the responses to disruption propagation, a general framework and problem-solving guideline for RDP problems is necessary.

To address this challenge, this research develops the Collaborative Response to Disruption Propagation (CRDP) unifying framework to classify, categorize, and characterize the different aspects of RDP problems. The CRDP framework allows analogical reasoning across different problem contexts, such as the examples mentioned above. Three main components of the investigated RDP problems are identified and characterized: (1) the client system as the victims; (2) the response mechanisms as the rescuers/protectors; and (3) the disruption propagation as the aggressors/attackers. This allows further characterization of the complex interactions between the components, which augments the design and control decisions for the response mechanisms to better respond to the disruptions. The new Covering Lines of Collaboration (CLOC) principle, consisting of three guidelines, is developed to analyze the system state and guide the response decisions. The first CLOC guideline recommends network modeling of potential disruption propagation directions, creating a complex network for better situation awareness and analysis. The second guideline recommends analyzing the propagation-restraining effects of the response mechanisms and exploiting this interaction when optimizing response decisions. The third guideline recommends developing collaboration protocols between the response decisions to maximize the coverage of the response against disruption propagation.

The CRDP framework and the CLOC principle are validated with three RDP case studies: (1) detection of unknown disruptions; (2) strategic prevention of unexpected disruptions; and (3) teaming and coordination of repair agents against recurring disruptions. Formulations, analytics, and protocols specific to each case are developed. TIE/CRDP, a new version of the Teamwork Integration Evaluator (TIE) software, is developed to simulate and evaluate the complex interactions and dynamics of the CRDP components, the response decision protocols, and their performance. Experiment results indicate that advanced CLOC-based decisions significantly outperform the baseline and less advanced protocols in all three cases, with performance superiority of 9.7%-32.8% in case 1; 31.1%-56.6% in case 2; and 2.1%-12.1% for teaming protocols and at least 50% for team coordination protocols in case 3.
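A minimal sketch of the idea behind the first CLOC guideline (illustrative only; the network, sources, and protected nodes are assumptions, not taken from the thesis): potential propagation directions are modeled as a directed graph, and the spread of a disruption is compared with and without response coverage.

```python
# Illustrative sketch only (not from the thesis): modeling potential disruption
# propagation directions as a directed graph and estimating how many nodes a
# disruption reaches when some nodes are protected by response mechanisms.
from collections import deque

def reachable(adjacency, sources, protected):
    """Breadth-first spread of a disruption; protected nodes block propagation."""
    seen = set()
    queue = deque(s for s in sources if s not in protected)
    while queue:
        node = queue.popleft()
        if node in seen:
            continue
        seen.add(node)
        for nxt in adjacency.get(node, ()):
            if nxt not in protected and nxt not in seen:
                queue.append(nxt)
    return seen

# Toy network: edges point in the direction a disruption could spread.
network = {"a": ["b", "c"], "b": ["d"], "c": ["d", "e"], "d": ["f"], "e": ["f"]}
print(len(reachable(network, sources={"a"}, protected=set())))       # unprotected spread
print(len(reachable(network, sources={"a"}, protected={"c", "d"})))  # with coverage
```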
4

Real-Time Monitoring of Powder Mass Flowrates for MPC/PID Control of a Continuous Direct Compaction Tablet Manufacturing Process

Yan-Shu Huang (9175667) 30 July 2020 (has links)
To continue the shift from batch operations to continuous operations for a wider range of products, advances in real-time process management (RTPM) are necessary. The key requirements for effective RTPM are reliable real-time data on the critical process parameters (CPPs) and critical quality attributes (CQAs) of the materials being processed, and robust control strategies for disturbance rejection and setpoint tracking.

Real-time measurements are necessary for capturing process dynamics and implementing feedback control. Compared to batch processing, the mass flow rate is an additional important CPP in continuous manufacturing: it can be used to control the composition and content uniformity of drug products and serves as an indicator of whether the process is in a state of control. This is the rationale for investigating real-time measurement of the mass flow of particulate streams. Process analytical technology (PAT) tools are required to measure particulate flows in downstream unit operations, since loss-in-weight (LIW) feeders only provide the initial upstream flow rates. A novel capacitance-based sensor, the ECVT sensor, is investigated in this study and demonstrates the ability to effectively measure powder mass flow rates in downstream equipment.

Robust control strategies can be used to respond to variations and disturbances in input material properties and process parameters, so that the CQAs of materials and products are maintained and the amount of off-spec production is reduced. A hierarchical control system (Level 0 equipment built-in control, Level 1 PAT-based PID control, and Level 2 optimization-based model predictive control) was applied in the pilot plant at Purdue University, and it was demonstrated that active process control allows more robust continuous operation under different risk scenarios than a more rigid open-loop operation within a predefined design space. With the aid of mass flow sensing, the control framework becomes more robust in mitigating the effects of upstream disturbances on product quality. For example, excursions in the mass flow from an upstream unit operation, which could force a shutdown of the tablet press and/or produce off-spec tablets, can be prevented by proper control and monitoring of the powder flow rate entering the tablet press hopper.

In this study, the impact of mass flow sensing on the control performance of a direct compaction line is investigated using flowsheet modeling implemented in MATLAB/Simulink, examining the control performance under different risk scenarios and the effects of data sampling (sampling time, measurement precision). Following the simulation work, pilot plant studies are reported in which the mass flow sensor is integrated into the tableting line at the exit of the feeding-and-blending system, and system performance data are collected to verify the effects of mass flow sensing on the performance of the overall plant-wide supervisory control.
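As a hedged illustration of the Level 1 layer of such a hierarchy (not the pilot-plant code; tuning, units, and the feeder model are assumptions), the sketch below runs a discrete-time PID loop that holds a powder mass flow rate at its setpoint:

```python
# Minimal sketch (assumed tuning and units, not the pilot-plant implementation):
# a discrete-time PID loop of the kind used at Level 1 of the hierarchy above,
# holding the powder mass flow rate entering the tablet press at its setpoint.
def pid_step(error, state, kp=1.2, ki=0.1, kd=0.05, dt=1.0):
    state["integral"] += error * dt
    derivative = (error - state["prev_error"]) / dt
    state["prev_error"] = error
    return kp * error + ki * state["integral"] + kd * derivative

state = {"integral": 0.0, "prev_error": 0.0}
setpoint, flow, tau, dt = 10.0, 8.0, 5.0, 1.0        # kg/h target, measured flow
for k in range(15):
    command = setpoint + pid_step(setpoint - flow, state, dt=dt)  # feedforward + PID trim
    flow += dt / tau * (command - flow)              # toy first-order feeder response
    print(f"step {k:2d}: mass flow = {flow:5.2f} kg/h")
```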
5

HEIGHT PROFILE MODELING AND CONTROL OF INKJET 3D PRINTING

Yumeng Wu (13960689) 14 October 2022 (has links)
Among all additive manufacturing processes, material jetting, or inkjet 3D printing, builds the product much like traditional inkjet printing, either by drop-on-demand or continuous printing. Aside from the advantages it shares with other additive manufacturing methods, it can achieve higher resolution, and combined with its ability to accept a wide range of functional inks, inkjet 3D printing is predominantly used in pharmaceutical and biomedical applications.

A height profile model is necessary to better estimate the geometry of a printed product. Numerical height profile models have been documented that can describe the inkjet printing process from the moment a droplet hits the substrate until it is fully cured. Although they estimate height profiles relatively accurately, these models generally take a long time to compute; a simplified model that achieves sufficient accuracy while reducing computational complexity is needed for real-time process control. In this work, a layer-to-layer height propagation model that balances computational complexity and model accuracy is proposed and experimentally validated. The model consists of two sub-models: one dedicated to multi-layer line printing and another more broadly applicable to multi-layer 2D patterns. Both predict the height profile of drops through separate volume and area layer-to-layer propagation, based on material flow and volume conservation. The models are experimentally validated on an inkjet 3D printing system equipped with a heated piezoelectric dispenser head made by Microdrop.

There are notable similarities between inkjet 3D printing and inkjet image printing, which has been studied extensively to improve color printing quality. Image processing techniques are used there to convert nearly continuous levels of color intensity to a binary printing map while satisfying the human visual system. Leveraging such image processing techniques to improve the quality of inkjet 3D printed products can therefore be both effective and efficient. A framework is proposed to adapt image processing techniques for inkjet 3D printing; the standard error diffusion method is chosen to demonstrate the framework, and the adaptation is experimentally validated. The adapted error diffusion method improves printing quality in terms of geometric integrity with low demand on computational power.

Model predictive control has been widely used for process control in various industries, and with a carefully designed cost function it can be an effective tool for improving inkjet 3D printing. While many researchers have used model predictive control to indirectly improve the functional aspects of printed products, geometry control is often overlooked, possibly due to the lack of high-quality height profile models suitable for real-time control. Height profile control of inkjet 3D printing can be formulated as a constrained nonlinear model predictive control problem: the input to the printing system is always constrained, since the droplet volume is bounded and cannot be adjusted continuously due to printhead limitations. A specific cost function is proposed to better account for the geometry of both the final printed product and the intermediate layers. The cost function is further adjusted to reduce memory usage for larger print geometries by introducing sparse matrices and scalar cost weights. Two patterns with different parameter settings are simulated using the model predictive controller, and the simulated results show a consistent improvement over open-loop prints. Experimental validation is also performed on a bi-level pattern and a P pattern, the same patterns printed with the adapted error diffusion method; the model predictive controlled printing outperforms open-loop printing.

In summary, a set of layer-to-layer height propagation models for inkjet 3D printing is proposed and experimentally validated; a framework adapting error diffusion to inkjet 3D printing is proposed and validated experimentally; and model predictive control with a carefully designed, memory-aware cost function is shown, both in simulation and experimentally, to improve the geometric integrity of inkjet 3D printing.
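For reference, the sketch below shows standard (Floyd-Steinberg) error diffusion, the image-processing method the thesis adapts; the per-layer drop-density target and grid size are assumptions, and the printing-specific adaptations described in the thesis are not reproduced here.

```python
# Sketch of standard (Floyd-Steinberg) error diffusion, the method the thesis
# adapts for inkjet 3D printing. Here a continuous per-layer drop-density map is
# quantized to a binary "fire / don't fire" map; the array values are assumptions.
import numpy as np

def error_diffuse(target):
    """Return a binary drop map whose local average tracks the target in [0, 1]."""
    field = target.astype(float).copy()
    drops = np.zeros_like(field, dtype=int)
    rows, cols = field.shape
    for r in range(rows):
        for c in range(cols):
            drops[r, c] = 1 if field[r, c] >= 0.5 else 0
            err = field[r, c] - drops[r, c]          # push the quantization error forward
            if c + 1 < cols:
                field[r, c + 1] += err * 7 / 16
            if r + 1 < rows and c > 0:
                field[r + 1, c - 1] += err * 3 / 16
            if r + 1 < rows:
                field[r + 1, c] += err * 5 / 16
            if r + 1 < rows and c + 1 < cols:
                field[r + 1, c + 1] += err * 1 / 16
    return drops

target = np.full((8, 8), 0.3)        # request roughly 30% drop coverage in this layer
print(error_diffuse(target).mean())  # close to 0.3
```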
6

Human-in-the-loop of Cyber Physical Agricultural Robotic Systems

Maitreya Sreeram (9706730) 15 December 2020 (has links)
The onset of Industry 4.0 has provided considerable benefits to Intelligent Cyber-Physical Systems (ICPS), with technologies such as the Internet of Things, wireless sensing, cognitive computing, and artificial intelligence improving automation and control. However, with increasing automation, the "human" element in industrial systems is often overlooked for the sake of standardization. While automation aims to redirect the workload of humans to standardized and programmable entities, humans possess qualities such as cognitive awareness, perception, and intuition which cannot be automated (or programmatically replicated) but can provide automated systems with much-needed robustness and sustainability, especially in unstructured and dynamic environments. Incorporating tangible human skills and knowledge within industrial environments is the essential function of "Human-in-the-loop" (HITL) systems, a term for systems powerfully augmented by different qualities of human agents. The primary challenge, however, lies in the realistic modelling and application of these qualities: an accurate human model must be developed, integrated, and tested within different cyber-physical workflows to 1) validate the assumed advantages and investments, and 2) ensure optimized collaboration between entities.

Agricultural Robotic Systems (ARS) are an example of such cyber-physical systems (CPS) which, in order to reduce reliance on traditional human-intensive approaches, leverage sensor networks, autonomous robotics, and vision systems for the early detection of diseases in greenhouse plants. Complete elimination of humans from such environments can prove sub-optimal, given that greenhouses present a host of dynamic conditions and interactions which cannot be explicitly defined or managed automatically. Supported by efficient algorithms for sampling, routing, and search, HITL augmentation of ARS can provide improved detection capabilities, system performance, and stability, while also reducing the workload of humans compared to traditional methods.

This research therefore studies the modelling and integration of humans into the loop of ARS, using simulation techniques and employing intelligent protocols for optimized interactions. Human qualities are modelled in human "classes" within an event-based, discrete-time simulation developed in Python. A logic controller based on collaborative intelligence (HUB-CI) efficiently dictates workflow logic, owing to the multi-agent and multi-algorithm nature of the system. Two integration hierarchies are simulated to study different types of HITL integration: sequential and shared integration. System performance metrics such as costs, number of tasks, and classification accuracy are measured and compared for different collaboration protocols within each hierarchy, to verify the impact of the chosen sampling and search algorithms. The experiments show statistically significant advantages of the HUB-CI-based protocol over traditional protocols in terms of collaborative task performance and disease detectability, justifying the added investment of including HITL. The results also discuss the competitive factors between the two integrations, laying out their relative advantages and disadvantages and the scope for further research. Improving human modelling and expanding the range of human activities within the loop can help improve the practicality and accuracy of the simulation in replicating an HITL-ARS. Finally, the research discusses the development of user-interface software based on ARS methodologies to test the system in the real world.
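A minimal sketch of the kind of event-based, discrete-time simulation described above (illustrative only; the agent classes, accuracies, task times, and the re-check rule are assumptions, not the thesis's HUB-CI protocols):

```python
# Illustrative sketch only: an event-based, discrete-time loop with a human
# "class" alongside a robot agent, in the spirit of the HITL-ARS simulation
# described above. Agent names, rates, and accuracies are assumptions.
import random

class Robot:
    accuracy, time_per_task = 0.80, 1     # robot: fast but less accurate
    def inspect(self, diseased):
        return diseased if random.random() < self.accuracy else not diseased

class Human(Robot):
    accuracy, time_per_task = 0.95, 4     # human: slower, more accurate

def run(n_plants=200, seed=0):
    random.seed(seed)
    plants = [random.random() < 0.1 for _ in range(n_plants)]   # ~10% diseased
    robot, human, clock, correct = Robot(), Human(), 0, 0
    for diseased in plants:
        verdict = robot.inspect(diseased)
        clock += robot.time_per_task
        if verdict:                        # assumed shared rule: human re-checks alarms
            verdict = human.inspect(diseased)
            clock += human.time_per_task
        correct += (verdict == diseased)
    print(f"accuracy={correct / n_plants:.2%}, simulated time={clock}")

run()
```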
7

SCANS Framework: Simulation of CUAS Networks and Sensors

Austin Riegsecker (8561289) 15 December 2020 (has links)
Counter Unmanned Aerial System (CUAS) security systems have unrealistic performance expectations, hyped by marketing and idealistic testing environments. By developing an agent-based model to simulate these systems, an average performance metric can be obtained, providing more representative values of true system performance.

Due to high cost, excessive risk, and exponentially large parameter possibilities, it is unrealistic to test a CUAS system for optimal performance in the real world. Agent-based simulation can provide the necessary variability at a low cost and allow for numerous parametric possibilities, providing actionable output from the CUAS system.

This study describes and documents the Simulation of CUAS Networks and Sensors (SCANS) Framework, a novel attempt at developing a flexible modeling framework for CUAS systems based on device parameters. The core of the framework rests on sensor and communication device agents. The sensors, including acoustic, radar, passive radio frequency (RF), and camera agents, use input parameters, sensor specifications, and UAS specifications to calculate values such as the sound pressure level, received signal strength, and maximum viewable distance. The communication devices employ a nearest-neighbor routing protocol to pass messages through the system, which are then logged by a command-and-control agent.

The framework is flexible enough to model nearly any CUAS system and is designed to be easily adjusted. It is capable of reporting true positives, true negatives, and false negatives in terms of UAS detection. For testing purposes, the SCANS Framework was deployed in AnyLogic, and models were developed based on existing, published empirical studies of sensors and UAS detection.
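As a hedged illustration of the sensor-agent calculations mentioned above (not the SCANS code; source level, transmit power, and detection thresholds are assumptions), the sketch below computes an acoustic level via spherical spreading and an RF received power via free-space path loss:

```python
# Sketch of the kind of first-principles values a SCANS sensor agent might compute
# (not the framework's actual code): acoustic level via spherical spreading and
# RF received power via free-space path loss. All numeric values are assumptions.
import math

def spl_at_distance(spl_ref_db, d_ref_m, d_m):
    """Sound pressure level after spherical spreading: -20*log10(d/d_ref)."""
    return spl_ref_db - 20 * math.log10(d_m / d_ref_m)

def rssi_dbm(tx_dbm, freq_hz, d_m, gains_db=0.0):
    """Received power using the free-space path loss model."""
    fspl = 20 * math.log10(d_m) + 20 * math.log10(freq_hz) - 147.55
    return tx_dbm + gains_db - fspl

# Hypothetical small UAS: 80 dB at 1 m, 2.4 GHz control link at 20 dBm.
d = 300.0  # metres from the sensor
print(f"SPL  at {d} m: {spl_at_distance(80, 1, d):5.1f} dB   (assumed detection floor ~40 dB)")
print(f"RSSI at {d} m: {rssi_dbm(20, 2.4e9, d):5.1f} dBm (assumed detection floor ~-90 dBm)")
```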
8

OPTIMIZATION TECHNIQUES FOR PHARMACEUTICAL MANUFACTURING AND DESIGN SPACE ANALYSIS

Daniel Joseph Laky (13120485) 21 July 2022 (has links)
In this dissertation, numerical analysis frameworks and software tools for the digital design of process systems are developed, focused on the pharmaceutical manufacturing space. Batch processing represents the traditional and still predominant pathway to manufacture pharmaceuticals in both the drug substance and drug product spaces. Drug substance processes start with raw materials or precursors to produce an active pharmaceutical ingredient (API) through synthesis and purification; drug product processes take this pure API in powder form, add excipients, and process the powder into consumer doses such as capsules or tablets.

Continuous manufacturing has allowed many other chemical industries to take advantage of real-time process management through process control, process optimization, and real-time detection of off-spec material. The possibility of reducing the total cleaning time of units and encouraging green chemistry through solvent reduction or recycling also makes continuous manufacturing an attractive alternative to batch manufacturing. However, to fully understand and take advantage of real-time process management, digital tools are required, both as soft sensors during process control and during process design and optimization. Since the shift from batch to continuous manufacturing will proceed in stages, processes will likely combine batch and continuous unit operations in the same process, which we call hybrid pharmaceutical manufacturing routes. Even though these processes will soon become common in the industry, digital tools that support comparison of batch, hybrid, and continuous manufacturing routes in the pharmaceutical space are lacking, especially for hybrid routes. For this reason, PharmaPy, an open-source tool for pharmaceutical process development, was created for rapid in-silico design of hybrid pharmaceutical processes.

Throughout this work, the focus is on analyzing alternative operating modes within the drug substance manufacturing context. First, the mathematical models for PharmaPy's synthesis, crystallization, and filtration units are discussed. Then, the simulation capabilities of PharmaPy are highlighted, showcasing dynamic simulation of both fully continuous and hybrid processes. The technical focus of the work as a whole, however, is on optimization techniques for pharmaceutical process design. Thus, several derivative-free optimization frameworks for simulation-optimization were constructed, with PharmaPy performing the simulations of the pharmaceutical processes. The timeline of the work originally began with derivative-based methods to solve mixed-integer programs (MIPs) for water network sampling and security, as well as nonlinear programs (NLPs) and some mixed-integer nonlinear programs (MINLPs) for design space and feasibility analysis. A method for process design was therefore implemented that combines the ease of implementation of a process simulator (PharmaPy) with the computational performance of derivative-based optimization. Recent developments in Pyomo through the PyNumero package allow callbacks to an input-output, or black-box, model while using Ipopt as a derivative-based solver through the cyipopt interface. Using this approach, it was found that using a PharmaPy simulation as a black box within a derivative-based solver resulted in quicker solve times than traditional derivative-free optimization strategies, and offers a much quicker implementation path than a simultaneous, equation-oriented algebraic definition of the problem.

Uncertainty exists in virtually all process systems. Traditionally, uncertainty is analyzed through sampling approaches such as Monte Carlo simulation, but these approaches quickly become computational obstacles as problem scale increases. In the 1980s, chemical plant design under uncertainty through flexibility analysis became an option for explicitly considering model uncertainty using mathematical programming. Such formulations provide computational obstacles of their own, however, as most process models produce challenging MINLPs under the flexibility analysis framework. For pharmaceutical processes specifically, recent initiatives by the FDA have piqued interest in flexibility analysis because of the so-called design space: the region over which critical quality attributes (CQAs) can be guaranteed for a set of interactions between the inputs and process parameters. Since uncertainty is intrinsic to such operations, industry is interested in guaranteeing that CQAs hold with a set confidence level over a given operating region. In this work, the probabilistic design space defined by these confidence levels is presented to demonstrate the computational advantages of a fully model-based flexibility analysis framework over a Monte Carlo sampling approach. The results show that the flexibility analysis framework decreased design space identification time by more than two orders of magnitude.

Given the implementation difficulty of new digital tools for both students and professionals, educational material was developed for PharmaPy and presented as part of a pharmaceutical API process development course at Purdue. Surveyed afterward, many of the students found the framework approachable through the use of Jupyter notebooks and would consider using PharmaPy for pharmaceutical modeling and Python for data analysis in the future.

Through software development and the development of numerical analysis frameworks, digital design of pharmaceutical processes has expanded and become more approachable. The incorporation of rigorous simulations under process uncertainty promotes the use of digital tools in regulatory filings and reduces unnecessary process development costs through model-based design. Examples of these improvements are evident in the development of PharmaPy, a simulation-optimization framework using PharmaPy, and flexibility analysis tools. These tools provided a computational benefit of one to two orders of magnitude compared to methods used in practice and, in some cases, reduced the modeling time required to determine optimal operating conditions or the design space of a pharmaceutical manufacturing process.
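A hedged sketch of the black-box simulation-optimization idea described above: the thesis uses Pyomo/PyNumero callbacks with Ipopt via cyipopt, whereas this stand-in uses SciPy with finite-difference gradients, and `run_simulation` is a hypothetical placeholder rather than PharmaPy's API.

```python
# Hedged sketch: treating a process simulation as a black box inside a
# derivative-based optimizer. The thesis uses Pyomo/PyNumero callbacks with
# Ipopt via cyipopt; this stand-in uses SciPy with finite-difference gradients,
# and run_simulation is a made-up placeholder, not PharmaPy's API.
import numpy as np
from scipy.optimize import minimize

def run_simulation(x):
    """Placeholder black-box 'simulation': returns (yield_fraction, impurity)."""
    temp, time = x                               # reactor temperature (K), batch time (min)
    yield_frac = 1.0 - np.exp(-0.05 * time) * (1.0 + 0.002 * (temp - 320.0) ** 2)
    impurity = 0.01 * time + 0.0005 * max(temp - 330.0, 0.0) ** 2
    return yield_frac, impurity

def objective(x):
    yield_frac, impurity = run_simulation(x)     # one full simulation per call
    return -(yield_frac - impurity)              # maximize impurity-penalized yield

result = minimize(objective, x0=np.array([320.0, 30.0]),
                  bounds=[(300.0, 350.0), (5.0, 60.0)],
                  method="L-BFGS-B")             # gradients via finite differences
print("optimal (temp, time):", result.x, "objective:", -result.fun)
```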
9

Process Intensification of Chemical Systems Towards a Sustainable Future

Zewei Chen (13161915) 27 July 2022 (has links)
Cutting greenhouse gas emissions to as close to zero as possible, or "net-zero", may be the biggest sustainability goal to be achieved in the next 30 years. While chemical engineering evolved against the backdrop of an abundant supply of fossil resources for chemical production and energy, renewable energy resources such as solar and wind will find more usage in the future. This thesis develops new concepts, methods, and algorithms to identify and synthesize process schemes that address multiple aspects of sustainable chemical and energy systems. Shale gas can serve as both an energy resource and a chemical feedstock during the transition to a sustainable economy, and has the potential to be a carbon source in the long term. The past two decades have seen increasing natural gas flaring and venting due to the lack of transformation or transportation infrastructure in emerging shale gas producing regions. To reduce carbon emissions and the wastage of shale resources, an innovative process hierarchy is identified for the valorization of natural gas liquids from shale gas at medium to small scale near the wellhead. This paradigm shift fundamentally changes the sequencing of the various separation and reaction steps and results in dramatically simplified and intensified process flowsheets; the resulting processes could achieve over 20% lower capital cost with higher product recovery. Historically, heat energy has been supplied to chemical plants by burning fossil resources, but with the emphasis on greenhouse gas reduction, renewable energy will find more usage, and renewable electricity from photovoltaics and wind has now become competitive with electricity from fossil resources. A major challenge for chemical engineering processes is therefore how to use renewable electricity efficiently within a chemical plant and eliminate any carbon dioxide release. We introduce several decarbonization flowsheets for a process that first converts natural gas liquids (NGLs) mainly to ethylene in an energy-intensive dehydrogenation reactor and then converts the ethylene into value-added, easy-to-transport liquid fuels.

Molecular separations are needed across many industries, including the oil and gas, food, pharmaceutical, and chemical industries, and 40%-60% of a chemical plant's energy and capital cost is tied to separation processes. For widespread industrial use of membrane-based processes that recover high-purity products from gaseous and liquid mixtures, models that allow membrane cascades to be operated at their optimal modes are desirable for sustainable separation systems; such models also enable proper comparison of membrane performance against other competing separation technologies. However, a model of this kind for multicomponent fluid separation has been missing from the literature. We have developed an MINLP global optimization algorithm that guarantees identification of the minimum power consumption of multicomponent membrane cascades. The proposed optimization algorithm is implemented in GAMS and is demonstrated to solve up to 4-component, 5-stage membrane cascades via the BARON solver, which is significantly more advantageous than state-of-the-art approaches. The model is currently being extended to optimize total cost, including capital. Such a model holds promise for the development and implementation of energy-efficient separation plants with the least carbon footprint. This thesis also addresses other important separation topics, including dividing wall columns and water desalination.
