151
Job shop smart manufacturing scheduling by deep reinforcement learning for Industry 4.0. Serrano Ruiz, Julio César. 24 January 2025.
Thesis by compendium / [EN] The Industry 4.0 (I4.0) paradigm relies, to a large extent, on the potential of information and communication technologies (ICT) to improve the competitiveness and sustainability of industries. The smart manufacturing scheduling (SMS) concept arises from and draws inspiration from this potential. As a digital transformation strategy, SMS aims to optimise industrial processes through the application of technologies such as the digital twin (DT), the zero-defect manufacturing (ZDM) management model and deep reinforcement learning (DRL), with the ultimate purpose of guiding operations scheduling processes towards real-time adaptive automation and reduced disturbances in production systems.
SMS is based on four design principles of the I4.0 spectrum: automation, autonomy, real-time capability and interoperability. Building on these key principles, SMS combines the capabilities of DT technology to simulate, analyse and predict; of the ZDM model to prevent disturbances in production planning and control systems; and of the DRL modelling approach to improve real-time decision making. This joint approach orients operations scheduling processes towards greater efficiency and, with it, towards a better-performing and more resilient production system.
This research firstly undertakes a comprehensive review of the state of the art on SMS. Taking the review as a reference, it then proposes a conceptual model of SMS as a digital transformation strategy in the context of the job shop scheduling process. Finally, it proposes a DRL-based model to address the implementation of the key elements of the conceptual model: the job shop DT and the scheduling agent. The algorithms that make up this model have been programmed in Python and validated against several of the best-known heuristic priority rules. The development of the model and algorithms is an academic and managerial contribution to the production planning and control area. / This thesis was developed with the support of the Research Centre on Production Management and Engineering (CIGIP) of the Universitat Politècnica de València and received funding from: the European Union H2020 programme under grant agreement No. 825631, “Zero Defect Manufacturing Platform (ZDMP)”; the European Union H2020 programme under grant agreement No. 872548, “Fostering DIHs for Embedding Interoperability in Cyber-Physical Systems of European SMEs (DIH4CPS)”; the European Union H2020 programme under grant agreement No. 958205, “Industrial Data Services for Quality Control in Smart Manufacturing (i4Q)”; the European Union Horizon Europe programme under grant agreement No. 101057294, “AI Driven Industrial Equipment Product Life Cycle Boosting Agility, Sustainability and Resilience” (AIDEAS); the Spanish Ministry of Science, Innovation and Universities under grant agreement RTI2018-101344-B-I00, “Optimisation of zero-defects production technologies enabling supply chains 4.0 (CADS4.0)”; the Valencian Regional Government grant “Industrial Production and Logistics optimization in Industry 4.0” (i4OPT) (Ref. PROMETEO/2021/065), in turn funded from grant RTI2018-101344-B-I00 by MCIN/AEI/10.13039/501100011033 and by “ERDF A way of making Europe”; and the grant PDC2022-133957-I00, “Validation of transferable results of optimisation of zero-defect enabling production technologies for supply chain 4.0” (CADS4.0-II), funded by MCIN/AEI/10.13039/501100011033 and by the European Union NextGenerationEU/PRTR. / Serrano Ruiz, JC. (2024). Job shop smart manufacturing scheduling by deep reinforcement learning for Industry 4.0 [Doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/202871 / Compendium
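The thesis above benchmarks a DRL-based scheduler against well-known heuristic priority rules. As a purely illustrative sketch (not the thesis's actual code, and with made-up job data), the snippet below compares two classic dispatching rules, FIFO and SPT, on a single machine; rules of this kind are the typical baseline for a learned scheduling agent.

```python
# Illustrative sketch: two classic priority dispatching rules, FIFO and
# SPT (shortest processing time first), evaluated by total flow time.

def total_flow_time(processing_times):
    """Sum of job completion times when jobs run in the given order."""
    t, total = 0, 0
    for p in processing_times:
        t += p          # completion time of this job
        total += t      # accumulate flow time
    return total

jobs = [5, 2, 8, 1]     # hypothetical processing times, in arrival order

fifo = total_flow_time(jobs)           # FIFO: keep arrival order
spt = total_flow_time(sorted(jobs))    # SPT: shortest job first

print(fifo, spt)  # SPT minimises total flow time on a single machine
```

A DRL agent is judged useful when it matches or beats such rules on harder, multi-machine job shop instances where no single rule dominates.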
152
Product Requirements Management for Digital Product Passports: A Case Study on Object-Centric Information Modeling & Application of Requirements / Produktkravhantering för Digitala Produktpass: En Fallstudie om Objektcentrerad Informationsmodellering & Applicering av Krav. Nettelbladt, Max; Stojanovski, Kevin. January 2024.
This thesis provides insights into the feasibility of managing product requirements in a concept leveraging an object-centric information model. The aim of this concept is to provide effective management of product requirements in the context of the rising volume and complexity of requirements. Firstly, the benefits and challenges of the current document-centric way of managing requirements and of the novel object-centric concept for managing requirements are mapped and presented. Secondly, the organizational prerequisites necessary for adopting the object-centric product requirement applicability concept are explored, and insights are generated into how this transition can be facilitated with knowledge management. Thirdly, the accuracy of this novel concept is evaluated in terms of applying the correct requirements onto products, no more and no less. The methods in this thesis are of a mixed nature and combine qualitative and quantitative data sets. The quantitative data was collected from the analysis of the requirement applicability concept and from previous tests of the same concept on different products at Company X. The analysis of quantitative data for concept accuracy revolved around interpreting the nature of negative and positive delta values following an automatic application of requirements in the novel object-centric concept. The former of these delta values relates to relevant requirements missing from products, whereas the latter relates to non-relevant requirements being applied following the automatic application. The qualitative data was collected mainly through a comprehensive literature review, semi-structured interviews, a workshop, meetings, Company X documents and observations. The analysis of qualitative data was done via an adaptation of two frameworks: The Multi-Dimensional Framework for Digital Transformation by Hanelt et al., 2021, and The Process of Organizational Knowledge Creation by Nonaka, 1994.
These adapted frameworks were leveraged to generate insights into the necessary organizational prerequisites for adopting the object-centric product requirement applicability concept. While benefits and challenges exist for both ways of managing requirements, these are not weighted, and thus no definitive conclusion on best practice is drawn. Instead, these findings provide organizations with a reference point for evaluating each approach in relation to their own circumstances. Some of the prerequisites identified as critical for adopting an object-centric product requirement applicability model are a digital library of requirement specifications that are convertible to an object-centric data format, and a suitable PLM strategy, chosen by the top management teams, that aids the development of the concept. As for the knowledge management needed to transition to the new concept, the findings show that standardized knowledge creation and knowledge exchange are necessary, as well as the development of centralized knowledge databases. Lastly, the quantitative findings indicate that the applicability concept is accurate, but that there is a need for additional iterations with model improvements whereby the negative delta can be eliminated and the positive delta held at an acceptable level.
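The negative and positive deltas described above reduce to set differences between the requirements a system applied and the requirements actually relevant to a product. A minimal sketch (requirement IDs and data are invented for illustration):

```python
# Hypothetical sketch of the accuracy measure: after automatic
# application of requirements, the negative delta is the set of relevant
# requirements that were missed, and the positive delta is the set of
# non-relevant requirements that were applied anyway.

def requirement_deltas(applied, relevant):
    negative = relevant - applied   # relevant but missing from the product
    positive = applied - relevant   # applied but not actually relevant
    return negative, positive

relevant = {"REQ-1", "REQ-2", "REQ-3"}   # assumed ground truth
applied = {"REQ-2", "REQ-3", "REQ-9"}    # assumed automatic application

neg, pos = requirement_deltas(applied, relevant)
print(sorted(neg), sorted(pos))  # ['REQ-1'] ['REQ-9']
```

An accurate applicability concept drives the negative delta to the empty set while keeping the positive delta acceptably small, mirroring the thesis's conclusion.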
153
Návrh mezioperační dopravy ve výrobním podniku podle principů Průmyslu 4.0 / Design of inter-operational transport in a manufacturing company according to the Industry 4.0 concept. Mravec, Roman. January 2021.
Based on the description and definition of the technologies and processes falling within the vision of the fourth industrial revolution, with the aim of creating intelligent factories, this diploma thesis deals with the principles of the Industry 4.0 concept in Hilti's production plant, with a focus on the transport and supply of production equipment. The aim of the work is to create a comprehensive proposal that takes into account all the necessary aspects associated with upgrading the existing state of inter-operational transport in a particular production line to fully automated, flexible and autonomous transport of materials and products in the context of Industry 4.0. A prerequisite for creating the design is the connection of automated guided vehicles (AGVs) serving individual transport orders. The selection of the vehicle was made taking into account the safety of movement, the method of charging, and the system and network integrity of existing and proposed technologies and components. The intention is not only to automate the inter-operational service, but also, on the basis of the created automation concept, to achieve the ability to autonomously manage the flow of material and products. A mathematical calculation of capacity planning in the production line helped to determine the total load and the number of vehicles needed for the continuous fulfilment of transport requirements. The result of the design part also includes the design of specific transport routes and the transport conditions that AGVs must comply with in order to maintain a high level of safety. Transparency and a constant overview of transported products are provided by the presented scheme for the identification of production batches, an Auto-ID system. The financial efficiency of the whole project elaborated in the diploma thesis is evaluated as reaching payback four years after the implementation of the proposal, due to high labor costs.
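The fleet-sizing calculation mentioned above typically divides the hourly transport workload by the productive capacity of one vehicle and rounds up. A minimal sketch with assumed parameters (not Hilti's actual figures):

```python
# Illustrative capacity-planning sketch: how many AGVs are needed to
# cover an hourly transport workload. All numbers are assumptions.

import math

def agvs_needed(trips_per_hour, minutes_per_trip, availability=0.85):
    """Round fleet size up so the fleet covers the hourly demand."""
    workload_min = trips_per_hour * minutes_per_trip   # driving minutes demanded per hour
    capacity_min = 60 * availability                   # productive minutes per AGV per hour
    return math.ceil(workload_min / capacity_min)

# 12 transport orders per hour, 9 minutes each, 85 % vehicle availability
print(agvs_needed(trips_per_hour=12, minutes_per_trip=9))  # -> 3
```

Real capacity planning would additionally account for charging windows, traffic blocking on shared routes, and demand peaks, which push the result upward.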
154
Návrh a implementace řídícího programu pro CNC obráběcí stroj prostřednictvím B&R Automation / Design and implementation of a control program for a CNC machine via B&R Automation. Vavrík, Michal. January 2020.
The aim of the diploma thesis is the conversion of a conventional milling machine into a CNC milling machine using a programmable logic controller. The theoretical part of the thesis contains a description of machine tools and Industry 4.0, an overview of the companies Optimum Maschinen and B&R Automation, including their products, and a description of the G-code programming language. At the beginning of the practical part, the selected hardware, its wiring and the tuning of the motors are described. The practical part then explains the method of motor control and the processing of CNC programs in G-code. The following chapters discuss the creation of industrial visualizations for machine control and of a digital twin for testing purposes. The conclusion evaluates the results and indicates possibilities for the future expansion of the machine and its integration into an automated cell in the sense of Industry 4.0.
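The first step in processing a CNC program is splitting each G-code block into its address letters and numeric words. As a hedged sketch (not the actual B&R implementation, which runs on a PLC), the parsing idea looks like this:

```python
# Illustrative sketch of the first stage of G-code processing: split a
# block into {address: value} pairs. Real interpreters also handle
# comments, modal state, and multi-digit addresses.

def parse_gcode_line(line):
    """Parse a simple G-code block, e.g. 'G01 X10.5 Y-3 F200'."""
    words = {}
    for token in line.split():
        address, value = token[0].upper(), token[1:]
        words[address] = float(value)   # numeric word for this address
    return words

print(parse_gcode_line("G01 X10.5 Y-3 F200"))
# {'G': 1.0, 'X': 10.5, 'Y': -3.0, 'F': 200.0}
```

From such parsed words, a motion controller derives the interpolation mode (G01 = linear feed), the target coordinates, and the feed rate for the axis drives.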
155
DT-DNA: Devising a DNA Paradigm for Modeling Health Digital Twins. Badawi, Hawazin Faiz. 19 March 2021.
The potential of digital twin (DT) technology outside the industrial field has been recognized by researchers, who have promoted the vision of applying DT technology beyond manufacturing to purposes such as enhancing human well-being and improving quality of life (QoL). The expanded definition of DTs, which incorporates living as well as nonliving physical entities, was a key motivation behind the model introduced in this thesis for building health digital twins of citizens. In contrast with the DTs developed in more industrial fields, this type of digital twin modeling necessitates protecting each citizen's unique identity while also representing features common to all citizens in a unified way. In nature, DNA is an example of a model that is both unified, being common to all humans, and unique, distinguishing each human as an individual. DNA's architecture is what inspired us to propose a digital twin DNA (DT-DNA) model as the basis for building health DTs of citizens. A review of the literature shows that no unified model for citizens' health has been developed that can act as a base for building digital twins of citizens while also protecting their unique identity; we aim to fill this gap in this research. Accordingly, in this thesis we propose a DT-DNA model, which is specifically designed to protect the unique identity of each citizen's digital twin, similar to what DNA does for each human. We also propose a DT-DNA-based framework to build standardized health digital twins of citizens on the micro, meso and macro levels using two ISO standards: ISO/IEEE 11073 (X73) and ISO 37120. To achieve our goal, we started by analyzing the biological DNA model and the factors shaping health in smart cities. The purpose of the first is to highlight the DNA model features that provide the building blocks for our DT-DNA model; the purpose of the latter is to determine the main bases of our DT-DNA model of health DTs.
Based on the analysis results, we propose DT-DNA to model health DTs for citizens. In keeping with our DNA analogy, we have identified four bases, A, T, G and C, for our unified and unique DT-DNA model. The A base in the proposed model represents a citizen's anthropometric data when we build the DT-DNA on an individual level, and represents the city's regulatory authorities when we build the DT-DNA on the community and city levels. The T base represents the different tasks included in the provided health data that are required to model citizens' health DT-DNA on different levels. The G base represents the geographic and temporal information of the city where the citizen exists at the time of data collection. The C base represents the context at the time of data collection. As a proof of concept, we present our initial work on building health DTs for citizens in four case studies. The first two case studies are dedicated to health DTs at the micro level, the third to health DTs at the meso level and the fourth to health DTs at the macro level. In addition, we developed an algorithm to compare cities in terms of their community fitness and health service status. The four case studies provide promising results in terms of the applicability of the proposed DT-DNA model and framework in handling the health data of citizens, communities and cities, collected through various sources, and presenting it in a standardized, unique model.
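The four-base structure described above can be made concrete with a toy record type. This is purely illustrative of the model's shape; the field contents and names are assumptions, not the thesis's actual data schema:

```python
# Toy encoding of the four DT-DNA bases (A, T, G, C) described above.
# Field semantics follow the thesis; the string formats are invented.

from dataclasses import dataclass

@dataclass
class DTDNARecord:
    a: str  # Anthropometric data (micro level) or regulatory authority (meso/macro)
    t: str  # Task included in the provided health data
    g: str  # Geographic and temporal information at data collection
    c: str  # Context at the time of data collection

record = DTDNARecord(
    a="height=175cm;weight=70kg",
    t="heart-rate-monitoring",
    g="Ottawa;2021-03-19T10:00",
    c="resting",
)
print(record.t)  # heart-rate-monitoring
```

Two citizens' records share this unified structure while their field values stay unique to each individual, which is the DNA analogy the model rests on.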
156
INTEGRATION OF PRODUCT LIFECYCLE BEHAVIOR INTO COMPONENT DESIGN, MANUFACTURING AND PERFORMANCE ANALYSIS TO REALIZE A DIGITAL TWIN REPRESENTATION THROUGH A MODEL-BASED FEATURE INFORMATION NETWORK. Saikiran Gopalakrishnan (12442764). 22 April 2022.
<p>There has been growing interest within the aerospace industry in shifting towards a digital twin approach for the reliable assessment of individual components during the product lifecycle, across the design, manufacturing, and in-service maintenance, repair & overhaul (MRO) stages. The transition towards digital twins relies on the continuous updating of product lifecycle datasets and the interoperable exchange of data applicable to components, thereby permitting engineers to utilize current state information to make more informed downstream decisions. In this thesis, we primarily develop a framework to store, track, update, and retrieve product lifecycle data applicable to a serialized component, its features, and individual locations. </p>
<p>From a structural integrity standpoint, the fatigue performance of a component is inherently tied to the component geometry, its material state, and applied loading conditions. The manufacturing process controls the underlying material microstructure, which in turn governs the mechanical properties and ultimately the performance. The processing also controls the residual stress distributions within the component volume, which influences the durability and damage tolerance of the component. Hence, we have demonstrated multiple use cases for fatigue life assessment of critical aerospace components, by using the developed framework for efficiently tracking and retrieving (i) the current geometric state, (ii) the material microstructure state, and (iii) residual stress distributions.</p>
<p>Model-based definitions (MBDs) present opportunities to capture both geometric and non-geometric data using 3D computer-aided design (CAD) models, with the overarching aim to disseminate product information across different stages of the lifecycle. MBDs can potentially eliminate error-prone information exchange associated with traditional paper-based drawings and improve the fidelity of component details, captured using 3D CAD models. However, current CAD capabilities limit associating the material information with the component’s shape definition. Furthermore, the material attributes of interest, viz., material microstructures and residual stress distributions, can vary across the component volume. To this end, in the first part of the thesis, we implement a CAD-based tool to store and retrieve metadata using point objects within a CAD model, thereby creating associations to spatial locations within the component. The tool is illustrated for storage and retrieval of bulk residual stresses developed during the manufacturing of a turbine disk component, acquired from process modeling and characterization. Further, variations in residual stress distribution owing to process model uncertainties have been captured as separate instances of the disk’s CAD models to represent part-to-part variability as an analogy to track individual serialized components for digital twins. The propagation of varying residual stresses from these CAD models within the damage tolerance analysis performed at critical locations in the disk has been demonstrated. The combination of geometric and non-geometric data inside the MBD, via storage of spatial and feature varying information, presents opportunities to create digital replica or digital twin(s) of actual component(s) with location-specific material state information.</p>
<p>To fully realize a digital twin description of components, it is crucial to dynamically update the information tied to a component as it evolves across the lifecycle, and subsequently to track and retrieve current state information. Hence, in the second part of the thesis, we propose a dynamic data linking approach to include material information within the MBDs. As opposed to storing material datasets directly within the CAD model, as in the previous approach, we externally store and update the material datasets and create data linkages between the material datasets and features within the CAD models. To this end, we develop a model-based feature information network (MFIN), a software-agnostic framework for linking, updating, searching, and retrieving relevant information across a product’s lifecycle. The use case of a damage tolerance analysis for a compressor bladed-disk (blisk) is demonstrated, wherein Ti-6Al-4V blade(s) are linear friction welded to the Ti-6Al-4V disk, comprising well-defined regions exhibiting grain refinement and high residual stresses. By capturing the location-specific microstructural information and residual stress fields at the weld regions, this information could be accessed within the MFIN and used for downstream damage tolerance analysis. The introduction of the MFIN framework facilitates access to dynamically evolving as well as location-specific data for use within physics-based models.</p>
<p>In the third part of the thesis, we extend the MFIN framework to enable a physics-based, microstructure-sensitive and location-specific fatigue life analysis of a component. Traditionally, aerospace components are treated as monolithic structures during lifing, wherein the microstructural information at individual locations is not necessarily considered. The resulting fatigue life estimates are conservative and associated with large uncertainty bounds, especially in components with gradient microstructures or distinct location-specific microstructures, thereby leading to underuse of the component’s capabilities. To improve the precision of the fatigue estimates, a location-specific lifing framework is enabled via the MFIN for tracking and retrieval of microstructural information at distinct locations for subsequent use within a crystal plasticity-based fatigue life prediction model. A use case of lifing a dual-microstructure heat-treated LSHR turbine disk component is demonstrated at two locations, near the bore (fine grains) and near the rim (coarse grains). We employ the framework to access (a) the grain size statistics and (b) the macroscopic strain fields to inform precise boundary conditions for the crystal plasticity finite-element analysis. The illustrated approach to conducting a location-specific predictive analysis of components presents opportunities for tailoring the manufacturing process and resulting microstructures to meet the component’s targeted requirements.</p>
<p>For reliably conducting the structural integrity analysis of a component, it is crucial to utilize its precise geometric description. Component geometries deviate from the nominal design geometries after manufacturing or after service. Traditionally, however, stress analyses are based on nominal part geometries during the assessment of these components. In the last part of the thesis, we expand the MFIN framework to dynamically capture deviations in the part geometry via physical measurements, creating a new instance of the CAD model and the associated structural analysis. This automated workflow enables engineers to make better-informed decisions when assessing (i) as-manufactured part geometries that fall outside of specification requirements during the materials review board or (ii) in-service damage to parts during the MRO stages of the lifecycle. We demonstrate a use case assessing the structural integrity of a turbofan blade that had experienced foreign object damage (FOD) during service. The as-designed geometry was updated based on coordinate measurements of the damaged blade surfaces, by applying a NURBS surface fit, and subsequently utilized for downstream finite-element stress analysis. The ramifications of the FOD on the local stresses within the part are illustrated, providing critical information to engineers for their MRO decisions. The automated flow of information from geometric inspection into structural analysis, enabled by the MFIN, presents opportunities for effectively assessing products by utilizing their current geometries and improving decision-making during the product lifecycle.</p>
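The core idea of linking lifecycle data to spatial locations within a part can be sketched in a few lines. This is an assumption-laden toy (the class and method names are invented, not the MFIN API): store metadata records keyed by 3D points and retrieve the record nearest a query location, the way the framework associates microstructure or residual stress data with positions in a component.

```python
# Illustrative sketch of location-specific data linkage: metadata records
# attached to 3D points of a component, retrieved by nearest location.
# Names and data are assumptions, not the actual MFIN implementation.

import math

class FeatureInfoStore:
    def __init__(self):
        self._records = []   # list of ((x, y, z), metadata) pairs

    def add(self, point, metadata):
        self._records.append((point, metadata))

    def nearest(self, query):
        """Return the metadata stored closest to the query point."""
        return min(self._records, key=lambda r: math.dist(r[0], query))[1]

store = FeatureInfoStore()
store.add((0.0, 0.0, 0.0), {"region": "bore", "grain": "fine"})
store.add((10.0, 0.0, 0.0), {"region": "rim", "grain": "coarse"})
print(store.nearest((9.0, 1.0, 0.0))["region"])  # rim
```

A downstream analysis (e.g. a location-specific fatigue model) would query such a store at its integration points instead of assuming one monolithic material state for the whole part.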
157
Production 4.0 of Ring Mill 4, Ovako AB. Hassan, Muhammad. January 2020.
Cyber-physical system (CPS), or digital twin, approaches are becoming popular in the Industry 4.0 revolution. A CPS not only allows the online status of equipment to be viewed, but also allows the health of a tool to be predicted. Based on real-time sensor data, it aims to detect anomalies in industrial operation and to anticipate future failures, which leads towards smart maintenance. A CPS can contribute to a sustainable environment as well as sustainable production, owing to its real-time analysis of production. In this thesis, we analyzed the behavior of a tool of Ringvalsverk 4 at Ovako together with its twin model (known as a digital twin) over a series of data. Initially, the data contained unwanted signals, which were cleaned in the data processing phase, and only the before-production signal was used to identify the tool's model. Matlab's System Identification Toolbox was used to identify the system model; the identified model was also validated and analyzed in terms of stability, and then used in the CPS. The digital twin model was then run, and its output analyzed together with the tool's output to detect when the tool starts to deviate from normal behavior.
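The final comparison step described above reduces to flagging samples where the measured tool output drifts from the twin model's prediction by more than a tolerance. A minimal sketch in that spirit (the threshold and data are assumptions, not the thesis's values):

```python
# Illustrative anomaly-flagging sketch: compare the twin model's
# predicted output with the tool's measured output sample by sample and
# report the indices where the deviation exceeds a tolerance.

def deviating_samples(twin_output, tool_output, tolerance=0.5):
    """Indices where |measurement - prediction| exceeds the tolerance."""
    return [
        i for i, (pred, meas) in enumerate(zip(twin_output, tool_output))
        if abs(meas - pred) > tolerance
    ]

twin = [1.0, 1.1, 1.2, 1.3]   # hypothetical model predictions
tool = [1.0, 1.2, 2.0, 1.3]   # hypothetical sensor measurements

print(deviating_samples(twin, tool))  # [2]
```

In practice the threshold would be set from the identified model's residual statistics during healthy operation, so that normal noise is not flagged as tool degradation.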
158
Improving supply chain visibility within logistics by implementing a Digital Twin: A case study at Scania Logistics / Att förbättra synlighet inom logistikkedjor genom att implementera en Digital Tvilling: En fallstudie på Scania Logistics. Blomkvist, Ylva; Ullemar Loenbom, Leo. January 2020.
As organisations adapt to the rigorous demands set by global markets, the supply chains that constitute their logistics networks become increasingly complex. This often has a detrimental effect on supply chain visibility within the organisation, which may in turn negatively affect the organisation's core business. This paper aims to determine how organisations can benefit, in terms of improved logistical supply chain visibility, from implementing a Digital Twin: an all-encompassing virtual representation of the physical assets that constitute the logistics system. Furthermore, the challenges related to implementation and the steps necessary to overcome them are examined. The results of the study are that Digital Twins may prove beneficial to organisations by improving the analytics, diagnostics, predictions and descriptions of physical assets. However, these benefits come with notable challenges: managing implementation and maintenance costs, ensuring proper information modelling, adopting new technology and leading the organisation through the changes that an implementation would entail. In conclusion, a Digital Twin is a powerful tool suitable for organisations where the benefits outweigh the challenges of the initial implementation; careful consideration must therefore be given to ensuring that the investment is worthwhile. Further research is required to determine the most efficient way of introducing a Digital Twin to a logistical supply chain. / As organisations adapt to the rigorous demands set by the global market, the complexity of their logistics networks also increases. This often has a negative effect on supply chain visibility within the organisation, which in turn can negatively affect the organisation's core business.
The aim of this study is to determine the benefits organisations can achieve in improving visibility within their logistics chains by implementing a Digital Twin: an all-encompassing virtual representation of the physical assets that make up the logistics chain. The results of the study are that Digital Twins can be beneficial to organisations in improving the analysis, diagnostics, forecasting and description of physical assets. The implementation does, however, entail challenges: managing implementation and operating costs, designing the information modelling, adopting new technology, and providing leadership through the organisational changes an implementation would involve. In summary, a Digital Twin is a tool suited to organisations in which the benefits outweigh the challenges that come with implementation. The decision to implement should therefore be made only after careful consideration. Further research is needed to determine the most effective method of introducing a Digital Twin into a logistics chain.
|
159 |
Digital Twin Development and Advanced Process Control for Continuous Pharmaceutical Manufacturing. Yan-Shu Huang (9175667), 25 July 2023 (has links)
<p>To apply Industry 4.0 technologies and accelerate the modernization of continuous pharmaceutical manufacturing, digital twin (DT) and advanced process control (APC) strategies are indispensable. The DT serves as a virtual representation that mirrors the behavior of the physical process system, enabling real-time monitoring and predictive capabilities. This, in turn, makes real-time release testing (RTRT) feasible and enhances drug product development and manufacturing efficiency by reducing the need for extensive sampling and testing. Moreover, APC strategies are required to address variations in raw material properties and process uncertainties while ensuring that the desired critical quality attributes (CQAs) of in-process materials and final products are maintained. When deviations from quality targets are detected, APC must provide optimal real-time corrective actions, offering better control performance than traditional open-loop control. The progress in DT and APC supports the shift from the Quality-by-Test (QbT) paradigm to those of Quality-by-Design (QbD) and Quality-by-Control (QbC), which emphasize the importance of process knowledge and real-time information in ensuring product quality.</p>
<p><br></p>
<p>This study focuses on four key elements and their applications in a continuous dry granulation tableting process comprising feeding, blending, roll compaction, ribbon milling, and tableting unit operations. Firstly, the necessity of a digital infrastructure for data collection and integration is emphasized. An ISA-95-based hierarchical automation framework is implemented for continuous pharmaceutical manufacturing, with each level serving specific purposes related to production, sensing, process control, manufacturing operations, and business planning. Secondly, the investigation of process analytical technology (PAT) tools for real-time measurement is highlighted as a prerequisite for effective real-time process management. For instance, measurement of the mass flow rate, a critical process parameter (CPP) in continuous manufacturing, was previously limited to loss-in-weight (LIW) feeders. To overcome this limitation, a novel capacitance-based mass flow sensor, the ECVT sensor, has been integrated into the continuous direct compaction process to capture real-time powder flow rates downstream of the LIW feeders. Additionally, the use of near-infrared (NIR) sensors for real-time measurement of ribbon solid fraction in dry granulation processes is explored; proper spectral selection and pre-processing techniques are employed to transform the spectra into useful real-time information. Thirdly, the development of quantitative models linking CPPs to CQAs is addressed, enabling effective product design and process control. Mechanistic and hybrid models are employed to describe the continuous direct compaction (DC) and dry granulation (DG) processes. Finally, applying APC strategies becomes feasible with the aid of real-time measurements and model predictions. Real-time optimization techniques combine measurements and model predictions to infer unmeasured states and mitigate the impact of measurement noise.
In this work, the moving horizon estimation-based nonlinear model predictive control (MHE-NMPC) framework is utilized. It leverages MHE for parameter updates and state estimation, enabling the model to adapt using data from a past time window, while NMPC ensures satisfactory setpoint tracking and disturbance rejection by minimizing the error between model predictions and the setpoint over a future time window. The MHE-NMPC framework has been implemented in the tableting process and demonstrated satisfactory control performance even in the presence of plant-model mismatch. In addition, MHE enables a sensor fusion framework in which at-line and online measurements can be integrated, provided the past time window is sufficiently long. This sensor fusion framework proves beneficial in extending the role of at-line measurements from validation alone to real-time decision-making.</p>
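The MHE-NMPC idea of two sliding windows (a past window for estimation, a future window for control) can be illustrated on a scalar surrogate plant, with the estimation step reduced to least squares over the past window. Everything below, including the plant dynamics and the deliberately mismatched initial gain, is a toy assumption, not the thesis model:

```python
import numpy as np
from scipy.optimize import minimize

a_true, b_true = 0.8, 0.5   # "plant" dynamics x[k+1] = a*x[k] + b*u[k]
b_hat = 0.3                 # deliberately wrong initial model gain
setpoint, horizon, window = 1.0, 5, 10
x, u_hist, x_hist = 0.0, [], []

for k in range(40):
    # "MHE" step (reduced to least squares): refit b over the past window.
    if len(u_hist) >= window:
        du = np.array(u_hist[-window:])
        dx = np.array(x_hist[-window:])
        xp = np.array(([0.0] + x_hist)[-window - 1:-1])  # predecessors
        b_hat = float(np.linalg.lstsq(du[:, None], dx - a_true * xp,
                                      rcond=None)[0][0])

    # "NMPC" step: optimize the next inputs, apply only the first one.
    def cost(u_seq, x0=x):
        c, xk = 0.0, x0
        for uk in u_seq:
            xk = a_true * xk + b_hat * uk          # model prediction
            c += (xk - setpoint) ** 2 + 0.01 * uk ** 2
        return c

    u = float(minimize(cost, np.zeros(horizon)).x[0])
    x = a_true * x + b_true * u                    # true plant step
    u_hist.append(u)
    x_hist.append(x)

print(round(x, 3))  # settles close to the setpoint of 1.0
```

The mismatch is corrected as soon as the past window fills, after which the controller tracks the setpoint with only the small offset introduced by the input penalty.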
|
160 |
DESIGN AND DEVELOPMENT OF A REAL-TIME CYBER-PHYSICAL TESTBED FOR CYBERSECURITY RESEARCH. Vasileios Theos (16615761), 03 August 2023 (has links)
<p>Modern reactors promise capabilities not previously possible, including integration with the smart grid, remote monitoring, reduced operation and maintenance costs, and more efficient operation. Modern reactors are designed for installation in remote areas and integration with the electric smart grid, which requires secure, undisturbed remote control and the implementation of two-way communications and advanced digital technologies. However, two-way communications between the reactor facility, the enterprise network, and the grid would require continuous transmission of operational data. This necessitates a deep understanding of cybersecurity and the development of a robust cybersecurity management plan for all reactor communication networks. Currently, there is a limited number of testbeds, mostly virtual, for performing cybersecurity research and for investigating and demonstrating cybersecurity implementations in a nuclear environment. To fill this gap, the goal of this thesis is the development of a real-time cyber-physical testbed with real operational and information technology data that allows cybersecurity research in a representative nuclear environment. In this thesis, a prototypic cyber-physical testbed was designed, built, tested, and installed in PUR-1. The cyber-physical testbed consists of an Auxiliary Moderator Displacement Rod (AMDR) that experimentally simulates a regulating rod, several sensors, and digital controllers mirroring the operation of Purdue University Reactor One (PUR-1). The testbed is monitored and controlled remotely from the Remote Monitoring and Simulation Station (RMSS), located in another building with no line of sight to the reactor room. The design, construction, and testing of the cyber-physical testbed are presented along with its capabilities and limitations.
The cyber-physical testbed's network architecture enables the performance of simulated cyberattacks, including false data injection and denial of service. Using the RMSS setup, information collected from the cyber-physical testbed is compared with real-time operational PUR-1 data in order to evaluate system response under simulated cyber events. Furthermore, a physics-based model is developed and benchmarked to simulate physical phenomena in the PUR-1 reactor pool and to provide information about reactor parameters that cannot be collected from the reactor instrumentation system.</p>
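A minimal sketch of the kind of false-data-injection experiment such a testbed enables: a twin model predicts the next reading from the reported history, and a reading that strays too far from the prediction raises an alarm. The dynamics, noise level, attack, and threshold below are all invented for illustration and bear no relation to PUR-1's actual parameters:

```python
import numpy as np

rng = np.random.default_rng(2)
n, tau, target = 200, 20.0, 10.0

# True process: first-order approach of "power" toward a target value.
true_power = np.empty(n)
true_power[0] = 1.0
for k in range(1, n):
    true_power[k] = true_power[k - 1] + (target - true_power[k - 1]) / tau

reported = true_power + 0.01 * rng.normal(size=n)  # honest sensor stream
reported[120:] += 0.5        # FDI: attacker adds a constant bias

# Twin-based detector: one-step prediction from the reported history,
# alarm when the prediction residual exceeds a fixed threshold.
pred = reported[:-1] + (target - reported[:-1]) / tau
residual = np.abs(reported[1:] - pred)
alarm_at = 1 + int(np.argmax(residual > 0.1))
print(alarm_at)  # first flagged sample; the injected bias starts at 120
```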
|