  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
71

Um projeto de gestão e engenharia de conhecimento para controle da produção : um estudo de caso na fabricação de papel / A knowledge engineering and management project : a study in paper production

Dummer, Edson January 2003 (has links)
Knowledge management seeks to create, acquire, share and employ the knowledge assets produced or dispersed within an organisation, so that the organisation retains control over an important component of its product or service. Knowledge engineering provides the tools to identify knowledge sources and helps to elicit and model that knowledge when it resides in people, so that it can be used independently of particular individuals. This work presents a case study of a knowledge management project for the production area of a paper manufacturer whose production line runs as a continuous process. The production model is particularly interesting because the process is continuously monitored and occasionally repaired, yet the effect of corrective actions is felt only at the end of the production process; wrong or inefficient actions cause significant product losses, with a strong impact on the company's returns. The goal of this study is to identify the knowledge applied in detecting pattern deviations in the production process and in the repair actions during paper production. From a study of the process and its knowledge assets, a model of the production area was proposed that maps the knowledge assets, the processes, and the agents that use them. This model was rationalised to optimise the use of knowledge and to homogenise the problem-solving process according to the company's desired standard. The CommonKADS methodology guided the development of the model. The work demonstrates that the monitoring and repair task relies on tacit, unstructured knowledge alongside the explicit knowledge acquired through training and organisational guidance; as a result, the quality of a solution depends on which agent happens to be monitoring at the time. Through interviews, protocol analysis, observation and repertory grid techniques, this knowledge was made explicit and formalised, and then shared as norms with all agents, achieving a desirable homogeneity in the production process and reducing production waste. The CommonKADS suite of organisation, task, communication and agent models was evaluated as a supporting tool for identifying and formalising the organisation's knowledge assets. An additional model, OM-6, was proposed as an extension of the organisation model to evaluate correction procedures. The resulting model is the first step towards a knowledge system that automates the monitoring and repair task in the paper production process.
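The repertory-grid elicitation mentioned in the abstract can be illustrated with a minimal sketch. All constructs, elements and ratings below are invented for illustration and are not taken from the thesis:

```python
import numpy as np

# Hypothetical repertory grid: rows are elicited constructs (bipolar scales
# an operator uses to distinguish process situations), columns are elements
# (example production deviations). Ratings run from 1 to 5.
constructs = ["fast drainage / slow drainage",
              "uniform sheet / streaky sheet",
              "stable basis weight / drifting basis weight"]
elements = ["wet streaks", "basis-weight drift", "holes", "normal run"]
grid = np.array([
    [4, 2, 5, 1],
    [5, 2, 4, 1],
    [2, 5, 2, 1],
], dtype=float)

# Correlating construct rows reveals which distinctions the expert applies
# together -- a first step toward making tacit know-how explicit.
corr = np.corrcoef(grid)
print(np.round(corr[0, 1], 2))  # prints 0.9: constructs 1 and 2 covary
```

Highly correlated construct rows suggest the expert applies those distinctions together, a cue for merging them or probing them further in follow-up interviews.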
72

Membrane Bioreactor-based Wastewater Treatment Plant Energy Consumption: Environmental Data Science Modeling and Analysis

Cheng, Tuoyuan 10 1900 (has links)
Wastewater Treatment Plants (WWTPs) are sophisticated systems that must sustain qualified long-term performance regardless of temporally volatile volumes or compositions of the incoming wastewater. Membrane filtration in Membrane Bioreactors (MBRs) reduces the WWTP footprint and produces effluent of proper quality. The energy or electric power consumption of WWTPs, mainly from aeration equipment and pumping, is directly linked to greenhouse gas emissions and economic cost. Biological treatment requires oxygen from aeration to perform aerobic decomposition of aquatic pollutants, while pumping consumes energy to overcome friction in the channels, piping systems, and membrane filtration. In this thesis, we developed monitoring and forecasting models for full-scale WWTP Influent Conditions (ICs) to facilitate energy consumption budgeting and to raise early alarms for latent abnormal events. Accurate and efficient forecasts of ICs can avoid unexpected system disruption, maintain steady product quality, support efficient downstream processes, improve reliability, and save energy. We carried out a numerical study of bioreactor microbial ecology for MBR microbial communities to identify indicator species and typical working conditions that assist in reactor status confirmation and support energy consumption budgeting. To quantify membrane fouling and cleaning effects at various scales, we proposed quantitative methods based on Matérn covariances to analyze biofouling layer thickness and roughness obtained from Optical Coherence Tomography (OCT) images of gravity-driven MBRs under various working conditions. Such methods help practitioners design suitable data-driven operation or replacement cycles and lead to quantified WWTP monitoring and energy saving.
In future research, we would investigate data from other full-scale water or wastewater treatment processes with higher sampling frequency and apply kernel machine learning techniques for global process monitoring. The forecasting models would be incorporated into optimization scenarios to support data-driven decision-making. Samples from more MBRs would be considered to gather information on microbial community structures and the corresponding oxygen-energy consumption under various working conditions. We would also investigate the relationship between pressure drop and spatial roughness measures, adopting anisotropic Matérn covariance metrics to quantify directional effects under various operating and cleaning conditions.
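The Matérn-covariance idea behind the roughness analysis can be sketched in a few lines. The closed form below is the standard smoothness ν = 1.5 case; the parameter values and the transect scenario are illustrative, not taken from the thesis:

```python
import numpy as np

def matern_cov(d, sigma2=1.0, rho=5.0):
    """Matern covariance with smoothness nu = 1.5 (closed form)."""
    a = np.sqrt(3.0) * np.asarray(d, dtype=float) / rho
    return sigma2 * (1.0 + a) * np.exp(-a)

# Hypothetical 1-D transect of biofouling-layer thickness (e.g. extracted
# from an OCT B-scan): the empirical covariance at increasing pixel lags
# can be matched against the Matern model to estimate a roughness scale rho.
lags = np.arange(0, 20)
model = matern_cov(lags, sigma2=4.0, rho=5.0)
print(round(float(model[0]), 2))        # prints 4.0: the variance at lag 0
assert model[0] > model[5] > model[15]  # covariance decays with distance
```

Fitting σ² and ρ to empirical covariances of thickness maps gives scale parameters that can be compared across membranes and cleaning conditions.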
73

On-line shear and extensional rheometry of polymer melts in the extrusion process.

Kelly, Adrian L. January 1997 (has links)
A novel on-line capillary rheometer (OLR) was used to examine the shear and extensional characteristics of polyolefin melts during twin screw extrusion (TSE). Comparisons with off-line rheometry were made using a twin-bore capillary rheometer, and a modular in-line slit die rheometer (ILR) provided in-line comparisons. Both capillary rheometers were controlled via PCs running dedicated software, and the extrusion line and ILR were fully instrumented, allowing real-time process monitoring by IBM-compatible PCs via data acquisition hardware and software. The prototype OLR was developed by redesigning several key features, including an instrumented transfer section and a capillary die block which facilitated the use of various die geometries. Shear and extensional on-line rheometry of three polyethylenes (linear and branched) and four molecular weight grades of polypropylene was examined, and a direct comparison with off-line capillary rheometry showed good correlation. The effect of a high loading of filler on two of the polyethylenes was investigated. In-line shear stress and entry pressure measurements showed reasonable correlation with on-line rheometry. A study of entry flows in the OLR using capillary dies approaching orifice showed that non-linearities occurred at very low capillary length-to-diameter (L:D) ratios, and this was repeatable using off-line rheometry. Predicted zero-length entry pressures (P0) were used to estimate apparent extensional viscosity using a number of standard models. Melt instability and capillary wall slip were also investigated using on-line rheometry. Melt pressure and temperature in the twin screw extruder and OLR were monitored at various process conditions to examine the ability of the OLR to condition melt during testing, and the effect of OLR testing on extrusion conditions. Pressure variations in the extruder, OLR and off-line rheometer were compared in order to quantify process noise.
The effect of OLR testing on melt rheology and polymer molecular weight was examined using off-line rheometry and gel permeation chromatography (GPC). / Rosand Precision Ltd. and Raychem Ltd.
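The apparent shear quantities a capillary rheometer reports follow from standard relations (wall shear stress τ = ΔP·D/4L, apparent Newtonian wall shear rate γ̇ = 32Q/πD³). The sketch below uses invented operating numbers, not data from the thesis:

```python
import math

def apparent_shear_stress(dP, D, L):
    """Apparent wall shear stress in a capillary die: tau = dP * D / (4 L)."""
    return dP * D / (4.0 * L)

def apparent_shear_rate(Q, D):
    """Apparent (Newtonian) wall shear rate: gamma_dot = 32 Q / (pi D^3)."""
    return 32.0 * Q / (math.pi * D ** 3)

# Illustrative numbers only: a 1 mm diameter, 16 mm long die,
# an 8 MPa pressure drop, and a 1e-7 m^3/s volumetric flow rate.
dP, D, L, Q = 8.0e6, 1.0e-3, 16.0e-3, 1.0e-7
tau = apparent_shear_stress(dP, D, L)   # Pa
gamma = apparent_shear_rate(Q, D)       # 1/s
eta = tau / gamma                       # apparent shear viscosity, Pa.s
print(round(tau), round(gamma, 1))      # prints 125000 1018.6
```

In practice the entry-pressure (Bagley-type) correction using the extrapolated zero-length pressure P0 would be subtracted from ΔP before computing τ; the P0 term itself feeds the extensional-viscosity estimates the abstract mentions.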
74

Spatio-temporal Analysis for Semantic Monitoring of Agricultural Logistics

Deeken, Henning 18 October 2022 (has links)
Managing agricultural processes with significant logistics sub-processes is a challenge because coordinating a distributed fleet in a dynamic environment is difficult without proper oversight in terms of qualitative and quantitative process information. Digital assistance systems are thought to aid agricultural practitioners by providing process-related information and thus support operational decision-making or even control the logistic flow (semi-)automatically. However, their development is currently stifled by a lack of monitoring capabilities during process execution. This thesis concerns the topic of online process monitoring for ongoing agricultural logistic processes. It discusses how to extract process knowledge from the telemetry of agricultural machines by applying spatio-semantic reasoning techniques. Our method combines spatial analysis for identifying spatial relationships between machines and their environment with semantic inference to derive formal process knowledge through ontological and rule-based reasoning. To test our concepts, we implemented a domain-agnostic semantic mapping framework and applied it in the context of forage maize harvesting. We present custom-made ontological models and rules to represent agricultural environments and to reason about machine actors and their process states. Based on our prototype, we demonstrate how to implement automated process and service tracking in near-real-time. Finally, we discuss the role of online process analytics systems in the context of other agricultural assistance systems for farm and fleet management.
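The combination of spatial analysis and rule-based inference described above can be sketched minimally. The predicate, threshold and state names below are illustrative only, not the thesis framework's API:

```python
import math

def distance(a, b):
    """Euclidean distance between two machine positions (x, y)."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def infer_state(harvester, trailer, near_threshold=10.0):
    """Symbolic rule lifted from a numeric spatial relation:
    harvester NEAR trailer -> assume an 'overloading' process state."""
    if distance(harvester, trailer) <= near_threshold:
        return "overloading"
    return "harvesting"

print(infer_state((0.0, 0.0), (6.0, 8.0)))    # distance 10 -> overloading
print(infer_state((0.0, 0.0), (60.0, 80.0)))  # distance 100 -> harvesting
```

A real system would evaluate such rules over streaming telemetry against ontological models of fields, roads and machine roles, but the two-step pattern (compute spatial relation, then apply a symbolic rule) is the same.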
75

Fast error detection method for additive manufacturing process monitoring using structured light three dimensional imaging technique

Jack Matthew Girard (17584095) 19 January 2024 (has links)
Monitoring of additive manufacturing (AM) processes saves time and materials by detecting and addressing errors as they occur. When monitoring is fast and efficient, each monitored build can be completed in less time, improving overall economics and allowing the user to accept more AM requests with the same number of machines. Among existing AM process monitoring solutions, it is very challenging for any approach to analyze full-resolution sensor data that yields three-dimensional (3D) topological information for closed-loop real-time applications, and equally challenging to offer plug-and-play operation once the AM hardware and sensor subsystems are configured. This thesis presents a novel method to speed up error detection in an AM process by minimizing the 3D reconstruction and comparison required. A structured light 3D imaging technique is developed that has a native pixel-by-pixel mapping between the captured two-dimensional (2D) absolute phase image and the reconstructed 3D point cloud. This mapping allows error detection to be performed in the 2D absolute phase image domain prior to 3D point cloud generation, which drastically reduces complexity and computational time. For each layer of an AM process, an artificial threshold phase image is generated and compared to the measured absolute phase image to identify error regions. Compared to an existing AM error detection method based on 3D reconstruction and point cloud processing, experimental results from a material extrusion (MEX) AM process demonstrate that the proposed method has comparable error detection capabilities while significantly increasing detection speed; the speed improvement factor follows a power-law relationship with the percentage of erroneous pixels in the captured 2D image.
The proposed method was also successfully used to implement closed-loop error correction, demonstrating a potential process monitoring application.
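The phase-domain comparison can be sketched in a few lines. The synthetic images and the simple "below threshold means missing material" convention are invented for illustration:

```python
import numpy as np

# Sketch of 2-D phase-domain error detection: a synthetic threshold phase
# image encodes the minimum expected height of the current layer; pixels
# whose measured absolute phase falls below it are flagged as errors
# before any 3-D reconstruction is attempted.
threshold_phase = np.full((4, 4), 2.0)
measured_phase = np.full((4, 4), 2.5)
measured_phase[1:3, 1:3] = 1.0        # simulated missing material

error_mask = measured_phase < threshold_phase
print(int(error_mask.sum()))          # prints 4: four erroneous pixels
```

Because the phase image maps pixel-for-pixel to the point cloud, only the flagged pixels (if any) ever need 3D treatment, which is where the speed-up comes from.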
76

Process Fingerprinting of Microneedle Manufacturing Using Conventional and Ultrasonic Micro-injection Moulding

Gulcur, Mert January 2019 (has links)
This research investigates the development and application of process fingerprinting for the manufacture of microneedle arrays for drug delivery by conventional and ultrasonic micro-injection moulding. The process fingerprinting method covers in-depth analysis, interrogation and selection of process data features, and the correlation of these features with product fingerprints defined by the micro-scale geometrical outcomes of the microneedle arrays. The method was developed using data collected from extensive sensor instrumentation fitted to the conventional and ultrasonic micromoulding machines. A machine-vision-based apparatus for microneedle product evaluation is also presented. The micromachining capabilities of different processes are assessed, with state-of-the-art laser machining used to manufacture the microneedle tooling. Using process fingerprinting procedures, the conventional and ultrasonic micromoulding processes have been characterised thoroughly, and the aspects of each process that affect part quality have been identified for microneedle manufacturing. Polymer structure was found to be of paramount importance in achieving sufficient microneedle replication: an amorphous polymer proved more suitable for conventional moulding, whereas semi-crystalline materials performed better in ultrasonic micromoulding. In-line process data captured for conventional and ultrasonic moulding provided detailed insight into machine dynamics. Linear correlations between process fingerprints and the micro-replication efficiency of the microneedles are presented for both micromoulding technologies. The in-line process monitoring and product quality evaluation procedures presented in this work pave the way for zero-defect micromanufacturing of miniature products towards Industry 4.0.
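A process fingerprint of this kind reduces to feature extraction plus linear correlation. The traces and efficiency values below are invented for illustration, not measurements from the thesis:

```python
import numpy as np

# Hypothetical "process fingerprint": reduce each moulding cycle's
# cavity-pressure trace to scalar features, then correlate a feature with
# the measured microneedle replication efficiency across cycles.
traces = np.array([
    [0, 40, 80, 60, 20],
    [0, 50, 95, 70, 25],
    [0, 45, 88, 65, 22],
    [0, 55, 99, 75, 30],
], dtype=float)
replication_pct = np.array([70.0, 88.0, 80.0, 95.0])

peak_pressure = traces.max(axis=1)    # one fingerprint feature per cycle
r = np.corrcoef(peak_pressure, replication_pct)[0, 1]
print(r > 0.99)                       # prints True: strong linear link
```

A feature that correlates this strongly with replication efficiency can stand in for slow off-line metrology during production, which is the practical point of fingerprinting.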
77

Prognostics and Health Management of Engineering Systems Using Minimal Sensing Techniques

Davari Ardakani, Hossein 09 September 2016 (has links)
No description available.
78

A PROCESS MONITORING EVALUATION OF A NURSE-LED REMOTE AUTOMATED MONITORING AND VIRTUAL CARE INTERVENTION

Ouellette, Carley January 2020 (has links)
Clinical trials involving digital health technologies are complex and challenging deployments. The SMArTVIEW trial (n=800), currently underway, combines remote automated patient monitoring (RAM) in hospital with virtual hospital-to-home nursing support, up to 30 days post-discharge, for patients who have undergone cardiac and major vascular surgery. Cardiac and vascular surgery patients are at risk of postoperative complications as well as hospital readmission; SMArTVIEW aims to reduce hospital readmissions and emergency department visits. The purpose of this work was to conduct a process monitoring evaluation of the first 100 patients enrolled, in order to examine the implementation, mechanisms, context, and specialized nursing role of the SMArTVIEW intervention. Six data sources were used to examine patient recruitment, daily nursing intervention workflows, RAM technology compliance, technical troubleshooting, patient education, and virtual nursing care. A content analysis was used to identify nursing advice, recommendations, and corrective actions for patients requiring intervention while recovering at home. Fifty patients were allocated to the SMArTVIEW intervention; of these, 34 engaged in all intervention components, both in hospital and at home. In-hospital RAM technology generated 194 notifications, drawing nurses to the bedside for patient reassessment. Forty-two daily nurse reports and 926 virtual nursing care records were audited to determine technology implementation issues and nursing actions to support patient recovery at home. Process monitoring uncovered strengths and limitations in the initial days of intervention deployment. Strengths included the functionality of the RAM technology, which facilitated nurse compliance with required workflows, as well as a high degree of patient engagement in the program. SMArTVIEW nurses addressed multiple health concerns for patients, resulting in 1,865 nursing actions over the 30-day intervention course.
Patient withdrawals and lack of standardized communication practices were areas requiring improvement. Results were used to refine and standardize intervention workflows in order to scale the intervention for deployment at a second site (United Kingdom). / Thesis / Master of Science (MSc)
79

Analysis and Evaluation of Social Network Anomaly Detection

Zhao, Meng John 27 October 2017 (has links)
As social networks become more prevalent, there is significant interest in studying network data, with the focus often on detecting anomalous events. This area of research is referred to as social network surveillance or social network change detection. While a variety of methods have been proposed for different monitoring situations, two important issues have yet to be fully addressed in the network surveillance literature: performance assessment using simulated data to evaluate the statistical performance of a particular method, and the study of aggregated data in social network surveillance. The research presented tackles these issues in two parts: an evaluation of a popular anomaly detection method, and an investigation of the effects of different aggregation levels on network anomaly detection. / Ph. D.
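A toy illustration of the statistical-surveillance idea (not one of the methods evaluated in the dissertation): monitor a scalar graph summary with a control chart and flag graphs that exceed the limit.

```python
import random

random.seed(7)

# Monitor the edge count of Erdos-Renyi-style random graphs with a
# Shewhart-type control chart, then inject a much denser "anomalous"
# graph and check that it signals.
n = 20
m = n * (n - 1) // 2          # number of potential edges

def edge_count(prob):
    """Sample a random graph; count how many potential edges appear."""
    return sum(random.random() < prob for _ in range(m))

baseline = [edge_count(0.1) for _ in range(200)]
mean = sum(baseline) / len(baseline)
sd = (sum((x - mean) ** 2 for x in baseline) / (len(baseline) - 1)) ** 0.5
ucl = mean + 3 * sd           # upper control limit of the chart

surge = edge_count(0.6)       # communication surge: much denser graph
print(surge > ucl)            # prints True: the surge exceeds the limit
```

Simulation studies of exactly this shape (known in-control model, injected change, measured signal behaviour) are what the dissertation argues the surveillance literature needs more of.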
80

Multiscale process monitoring with singular spectrum analysis

Krishnannair, Syamala 12 1900 (has links)
Thesis (MScEng (Process Engineering))--University of Stellenbosch, 2010. / Thesis presented in partial fulfilment of the requirements for the degree of Master of Science in Engineering (Extractive Metallurgy) in the Department of Process Engineering at the University of Stellenbosch / ENGLISH ABSTRACT: Multivariate statistical process control (MSPC) approaches are now widely used for performance monitoring, fault detection and diagnosis in chemical processes. Conventional MSPC approaches are based on latent variable projection methods such as principal component analysis and partial least squares. These methods are suitable for handling linearly correlated data sets with minimal autocorrelation in the variables. Industrial plant data invariably violate these conditions, and several extensions to conventional MSPC methodologies have been proposed to account for these limitations. In practice, process data usually contain contributions at multiple scales, because different events occur at different localisations in time and frequency. To account for this multiscale nature, monitoring techniques that decompose the observed data at different scales are necessary; otherwise, the use of standard MSPC methodologies may lead to unreliable results, with false alarms and significant loss of information. In this thesis a multiscale methodology based on singular spectrum analysis (SSA) is proposed. SSA is a linear method that extracts information from short, noisy time series by decomposing the data into deterministic and stochastic components without prior knowledge of the dynamics affecting the time series. These components can be classified as independent additive time series: slowly varying trends, periodic series and aperiodic noise. SSA performs this decomposition by projecting the original time series onto a data-adaptive vector basis obtained from the series itself via principal component analysis (PCA). The proposed method treats each process variable as a time series, and the autocorrelation between variables is explicitly accounted for. The data-adaptive nature of SSA makes the method more flexible than other spectral techniques that use fixed basis functions. Application of the technique is demonstrated on simulated data, industrial data and the Tennessee Eastman Challenge process, and a comparative analysis is given using the simulated and Tennessee Eastman processes. In most cases the proposed method detects process changes and faults of different magnitudes more accurately than classical statistical process control (SPC) based on latent variable methods, as well as wavelet-based multiscale SPC.
