101

An Application Framework for Monitoring Care Processes

Baarah, Aladdin 17 December 2013 (has links)
Care process monitoring is important in healthcare domains to provide precise and detailed analytics on patients, providers, and resources participating in a care process and their status. These analytics are used to keep track of whether the quality-of-care goals set by healthcare organizations are satisfied and to ensure that legislative and organizational guidelines are followed. The complexity of care process monitoring can vary depending on whether the care process takes place in a hospital or out in the community, and on the complexity of the information technology infrastructure that is in place to support the care process. A Care Process Monitoring Application (CPMA) is a software application which collects and integrates data from various sources while a care process is being provided, in order to report performance metrics that measure how well the performance goals and guidelines for the care process are being met. In our research, we have studied how CPMAs are built in order to improve the quality of their engineering. The significant challenge in this context is how to engineer a CPMA so that the engineering process is repeatable, produces a CPMA of consistently high quality, and requires less time and effort with less complexity. This thesis proposes an application framework for care process monitoring that collects and integrates events from event sources, maintains the individual and aggregate states of the care process, and populates a metrics data mart to support performance reporting. Our contributions are the following: a state-based application meta-model of care process monitoring, a care process monitoring architectural pattern, and finally, a behavior-driven development methodology for CPMAs based on our meta-model and architectural pattern. Our results are validated through three different case studies in which we collaborated with two different healthcare organizations, and with their clinicians and researchers, to build and deploy CPMAs for two different care processes (one hospital-based, the other community-based).
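The state-based monitoring idea described above can be illustrated with a small sketch: events from sources update an individual patient state, an aggregate state, and a metrics store. This is a hypothetical illustration, not the framework from the thesis; the event types, state names, and metric row layout are invented for the example.

```python
from collections import Counter
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CareEvent:
    patient_id: str
    event_type: str          # e.g. "triage_completed" (hypothetical)
    timestamp: datetime

# Assumed mapping from event types to the state they move a patient into.
TRANSITIONS = {
    "referral_received": "referred",
    "triage_completed": "triaged",
    "treatment_started": "in_treatment",
    "patient_discharged": "discharged",
}

class CareProcessMonitor:
    """Maintains individual and aggregate state and feeds a metrics store."""

    def __init__(self):
        self.patient_state = {}      # individual state per patient
        self.aggregate = Counter()   # aggregate state: patients per state
        self.metrics_rows = []       # stand-in for a metrics data mart

    def handle(self, event: CareEvent):
        new_state = TRANSITIONS.get(event.event_type)
        if new_state is None:
            return  # event not relevant to this care process model
        old_state = self.patient_state.get(event.patient_id)
        if old_state:
            self.aggregate[old_state] -= 1
        self.patient_state[event.patient_id] = new_state
        self.aggregate[new_state] += 1
        # One fact row per state change, ready for performance reporting.
        self.metrics_rows.append(
            {"patient": event.patient_id, "state": new_state,
             "time": event.timestamp.isoformat()}
        )

monitor = CareProcessMonitor()
monitor.handle(CareEvent("p-001", "referral_received", datetime.now()))
monitor.handle(CareEvent("p-001", "triage_completed", datetime.now()))
print(monitor.aggregate)   # Counter({'triaged': 1, 'referred': 0})
```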
102

Modélisation et suivi du procédé par infusion de résine sur une nouvelle génération de renforts structuraux pour l’aéronautique / Modeling and in situ monitoring of the resin infusion process, using a new generation of reinforcement, for aeronautic primary parts applications

Blais, Maxime 16 March 2016 (has links)
This study deals with the manufacturing by infusion of a new high-performance fibrous reinforcement solution developed for LCM (Liquid Composite Molding) composite manufacturing processes. Combining simulations with experiments, the aim of this work is to understand and control the process parameters in order to develop representative, and ultimately predictive, modelling and simulation tools. A dedicated experimental plan and protocols, combined with in-situ monitoring technologies, were used to identify the physics and length scales that challenge the process simulation. The numerical strategy considered in this work is based on a strong coupling between Stokes and Darcy models in a large-strain framework, with interfaces represented and managed by a Level-Set method. At the process scale, the fluid mechanics problem describes the resin flow through the distribution medium (Stokes model) and then through the fibrous preform, treated as an equivalent homogeneous porous medium (Darcy model). A key feature of our approach is the fluid-solid interaction, which couples the fluid/porous flow with a non-linear solid mechanics problem: a Terzaghi model represents the action of the fluid on the reinforcement through its hydrostatic pressure, together with explicit relations expressing permeability as a function of porosity, so that the fibre volume fraction and the permeability of the medium are updated in the different sub-problems. Numerical simulations are presented and compared to the experimental characterizations. The results point out, first of all, the need for substantial work on the comprehension, definition and characterization of permeability. For the specific reinforcements of this study, the very representation and homogenization of complex local flows through the single notion of permeability is open to question.
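The governing relations named above can be written out in standard textbook form to fix ideas. These are generic expressions assumed for illustration; the exact constitutive laws, sign conventions, and the permeability-porosity closure used in the thesis may differ.

```latex
% Darcy flow in the preform (equivalent homogeneous porous medium):
%   v : averaged resin velocity,  K : permeability tensor,
%   mu : resin viscosity,         p : resin pressure
\mathbf{v} = -\frac{\mathbf{K}}{\mu}\,\nabla p

% Stokes flow in the distribution medium (inertia neglected):
\mu\,\Delta\mathbf{u} - \nabla p = \mathbf{0}, \qquad \nabla\!\cdot\!\mathbf{u} = 0

% Terzaghi-type fluid-solid coupling: the total stress applied to the stack
% is shared between the effective stress in the fibre bed and the resin
% hydrostatic pressure (soil-mechanics sign convention assumed):
\boldsymbol{\sigma}_{\text{total}} = \boldsymbol{\sigma}_{\text{eff}} - p\,\mathbf{I}

% One common (assumed) permeability-porosity closure of Carman-Kozeny type,
% with fibre radius r_f, porosity \phi and empirical constant k_c:
K = \frac{r_f^{2}}{4\,k_c}\,\frac{\phi^{3}}{(1-\phi)^{2}}
```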
103

Detecção de situações anormais em caldeiras de recuperação química. / Detection of abnormal situations in chemical recovery boilers.

Gustavo Matheus de Almeida 12 September 2006 (has links)
The greatest challenge faced by the area of process monitoring in chemical industries still resides in the fault detection task, which requires the development of reliable systems. A system can be said to be reliable if it is able to detect abnormal situations early and, at the same time, to minimize the generation of false alarms. Once a reliable system is available, it can be employed to help plant operators in the decision-making process. The aim of this study is to present a methodology, based on the Hidden Markov Model (HMM) technique, for detecting abnormal situations in chemical recovery boilers. The most successful applications of HMMs are in the area of speech recognition. Some of their advantages are probabilistic reasoning, explicit modeling, and identification from process history data. This study discusses two applications. The first is a benchmark of a multiple-effect evaporation system in a sugar factory. An HMM representative of normal operation was identified in order to detect five abnormal situations at the actuator responsible for controlling the syrup flow to the first evaporator. Detection of the three abrupt situations was immediate, since the HMM was capable of detecting the abrupt changes in the signal of the monitored variable as soon as they occurred. Regarding the two incipient situations, detection occurred at an early stage: the value of the vector f (which represents the strength of an abnormal event over time) at the moment of detection was close to zero, equal to 2.8% and 2.1%, respectively. The second case study deals with the application of HMMs in a chemical recovery boiler at a pulp mill in Brazil. The aim is to monitor the accumulation of ash deposits on the equipment of the convective heat-transfer section through pressure-drop measurements, one of the main challenges in increasing the operational efficiency of this equipment. An HMM characteristic of high pressure drop was identified, and its capacity to report the current state, and consequently the trend of the system, similarly to a predictor, was verified. The usefulness of defining control limits was also demonstrated, so as to inform the operator of the distance between the current state of the system and the pressure-drop alarm levels.
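A minimal sketch of the detection idea described above, using the third-party `hmmlearn` package: an HMM is identified from normal-operation data, and new segments are flagged when their likelihood under that model falls below a threshold. It is not the author's implementation; the model size, threshold rule, window length, and synthetic data are assumptions for illustration.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM  # third-party package, assumed available

# 1) Identify an HMM representative of normal operation.
#    `normal_data` stands in for historical measurements of the monitored
#    variable under normal conditions, shaped (n_samples, n_features).
rng = np.random.default_rng(0)
normal_data = rng.normal(loc=0.0, scale=1.0, size=(2000, 1))

model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=100,
                    random_state=0)
model.fit(normal_data)

# 2) Detection: score sliding windows of new data against the normal-
#    operation model; a per-sample log-likelihood below a threshold set from
#    normal data (here a low percentile) raises an alarm.
window = 50
baseline = np.array([
    model.score(normal_data[i:i + window]) / window
    for i in range(0, len(normal_data) - window, window)
])
threshold = np.percentile(baseline, 1)   # assumed alarm threshold

def is_abnormal(segment: np.ndarray) -> bool:
    """True if the segment is unlikely under the normal-operation HMM."""
    return model.score(segment) / len(segment) < threshold

# Example: a segment with a shifted mean, simulating an abrupt fault.
faulty = rng.normal(loc=4.0, scale=1.0, size=(window, 1))
print(is_abnormal(faulty))   # expected: True
```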
104

Bioprocess software sensors development facing modelling and model uncertainties / Développement de capteurs logiciels pour les bioprocédés face aux incertitudes de modélisation et de modèle

Hulhoven, Xavier 07 December 2006 (has links)
The exponential development of biotechnology has led to a quasi-unlimited number of potential products, ranging from biopolymers to vaccines. Cell culture has therefore evolved from simply growing cells outside their natural environment to using them to produce molecules that they do not naturally produce. This rapid development could not continue without new control and supervision tools as well as a good understanding of the process, which in turn requires a wider range of process measurements and better access to them. In this context, software sensors show numerous potentialities. The objective of a software sensor is to provide an estimate of the system state variables, particularly those that cannot be obtained through in-situ hardware sensors or that require laborious and expensive analyses. This work attempts to reconcile the increasing complexity and diversity of bioprocesses with the time scale of process development, and favours a systematic modelling methodology, its flexibility and its speed of development. In the field of state observation, an important modelling constraint is the one induced by the selection of the state to estimate and of the available measurements. Another important constraint is the model quality. The central axis of this work is to provide solutions that reduce the weight of these constraints on software sensor development. To this end, we propose four solutions to four main questions that may arise; the first two concern modelling uncertainties. 1. "How can a software sensor be developed using measurements easily available on a pilot-scale bioreactor?" The proposed solution is a static software sensor using an artificial neural network. Following this modelling methodology, we developed static software sensors for the biomass and ethanol concentrations in a pilot-scale S. cerevisiae cell culture, using measurements of the titrating base quantity, the agitation rate and CO2. / Doctorate in agronomic sciences and biological engineering / info:eu-repo/semantics/nonPublished
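A minimal sketch of what such a static, neural-network software sensor can look like, using scikit-learn: a feed-forward network maps easy online measurements to hard-to-measure state variables. It is an illustration under assumed inputs (titrating base quantity, agitation rate, CO2) and outputs (biomass and ethanol concentrations), with synthetic data, not the estimator developed in the thesis.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for historical culture data: each row is
# [titrating base quantity, agitation rate, CO2 measurement], and the
# targets are [biomass concentration, ethanol concentration].
rng = np.random.default_rng(1)
X = rng.uniform(size=(500, 3))
y = np.column_stack([
    2.0 * X[:, 0] + 0.5 * X[:, 2],        # invented biomass relation
    1.0 * X[:, 2] - 0.3 * X[:, 1] + 0.3,  # invented ethanol relation
]) + rng.normal(scale=0.02, size=(500, 2))

# Static software sensor: scale the inputs, then regress with a small MLP.
sensor = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0),
)
sensor.fit(X, y)

# At run time, the trained sensor turns the available online measurements
# into estimates of the biomass and ethanol concentrations.
new_measurement = np.array([[0.4, 0.6, 0.7]])
biomass_est, ethanol_est = sensor.predict(new_measurement)[0]
print(biomass_est, ethanol_est)
```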
105

An Application Framework for Monitoring Care Processes

Baarah, Aladdin January 2014 (has links)
Care process monitoring is important in healthcare domains to provide precise and detailed analytics on patients, providers, and resources participating in a care process and their status. These analytics are used to keep track of whether the quality-of-care goals set by healthcare organizations are satisfied and to ensure that legislative and organizational guidelines are followed. The complexity of care process monitoring can vary depending on whether the care process takes place in a hospital or out in the community, and on the complexity of the information technology infrastructure that is in place to support the care process. A Care Process Monitoring Application (CPMA) is a software application which collects and integrates data from various sources while a care process is being provided, in order to report performance metrics that measure how well the performance goals and guidelines for the care process are being met. In our research, we have studied how CPMAs are built in order to improve the quality of their engineering. The significant challenge in this context is how to engineer a CPMA so that the engineering process is repeatable, produces a CPMA of consistently high quality, and requires less time and effort with less complexity. This thesis proposes an application framework for care process monitoring that collects and integrates events from event sources, maintains the individual and aggregate states of the care process, and populates a metrics data mart to support performance reporting. Our contributions are the following: a state-based application meta-model of care process monitoring, a care process monitoring architectural pattern, and finally, a behavior-driven development methodology for CPMAs based on our meta-model and architectural pattern. Our results are validated through three different case studies in which we collaborated with two different healthcare organizations, and with their clinicians and researchers, to build and deploy CPMAs for two different care processes (one hospital-based, the other community-based).
106

Online-Überwachung der Blechbearbeitung von Bipolarplatten / Online monitoring of sheet-metal processing of bipolar plates

Müller, Jan, Praedicow, Michael 25 November 2019 (has links)
Fuel cells are manufactured by stacking precise bipolar plates with a complex structure of flow channels for liquids and conduits for gas transfer. The sheet-metal blanks required for such plates are produced by stamping and forming. In this process, not only the setup of the press and tool (e.g. stroke rate, ram adjustment) but also their condition influence the quality of the plates (flatness, material stresses/crack formation, burr formation). Although modern presses today are essentially based on servo-electric drives, the force is transmitted to the ram primarily via several drive pressure points. On a given press, specific cutting and forming tools produce specific force, torque and tilt curves for defined press settings and material parameters. The monitoring solution developed at the IWU is able to record these quantities as a function of the stroke. Using a "fingerprint" generated while press and tool are in good condition, deviations measured during the running process are detected and analysed in real time. This is particularly important when processing very thin sheets, since even the smallest machine-related changes can lead to costly failures of the tool and the product.
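A minimal sketch of the "fingerprint" idea described above: a reference force-over-stroke curve recorded in known-good condition, with a tolerance band used to flag deviations in a running process. The curve shapes, band width, and sampling are invented for illustration; the actual IWU monitoring solution is not detailed in the abstract.

```python
import numpy as np

# Reference "fingerprint": press force sampled over one stroke while press
# and tool are in known-good condition (values are invented).
stroke_angle = np.linspace(0.0, 360.0, 361)            # degrees of one stroke
fingerprint = 100.0 * np.exp(-((stroke_angle - 180.0) / 40.0) ** 2)

# Tolerance band around the fingerprint (assumed: derived from the scatter
# of several good strokes; here a fixed band for simplicity).
band = 3.0

def check_stroke(force_curve: np.ndarray) -> list[int]:
    """Return the sample indices at which the measured force curve leaves
    the tolerance band around the fingerprint."""
    deviation = np.abs(force_curve - fingerprint)
    return list(np.flatnonzero(deviation > band))

# Simulated measurement: a small tool-related force shift late in the stroke.
measured = fingerprint.copy()
measured[220:240] += 8.0
violations = check_stroke(measured)
print(f"{len(violations)} samples outside the tolerance band")
```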
107

Design of side-sensitive double sampling control schemes for monitoring the location parameter

Motsepa, Collen Mabilubilu 06 1900 (has links)
The double sampling procedure is adapted from a branch of statistics called acceptance sampling. The first Shewhart-type double sampling monitoring scheme was introduced in the statistical process monitoring (SPM) field in 1974. The double sampling monitoring scheme has been proven to effectively decrease the sampling effort and, at the same time, to decrease the time to detect potential out-of-control situations when monitoring the location, the variability, or the location and variability jointly, using univariate or multivariate techniques. Consequently, an overview is conducted to give a full account of all 76 publications on double sampling monitoring schemes that exist in the SPM literature; these are categorized and summarized so that any research gaps can easily be identified. Next, based on the knowledge gained from the literature review about the existing designs for monitoring the process mean, a new type of double sampling design is proposed. The new charting-region design leads to a class of control charts called side-sensitive double sampling (SSDS) monitoring schemes. In this study, the SSDS scheme is implemented to monitor the process mean both when the underlying process parameters are known and when they are unknown. A variety of run-length properties (i.e., the 5th, 25th, 50th, 75th and 95th percentiles, the average run-length (ARL), the standard deviation of the run-length (SDRL), the average sample size (ASS) and the average extra quadratic loss (AEQL)) are used to design and implement the new SSDS scheme. Comparisons with other established monitoring schemes (with known and unknown parameters) indicate that the proposed SSDS scheme has a better overall performance. Illustrative examples are also given to facilitate the real-life implementation of the proposed SSDS schemes. Finally, a list of possible future research ideas is given in the hope that it will stimulate further research on simple as well as complex double sampling schemes (especially ones using the newly proposed SSDS design) for monitoring a variety of quality characteristics. / Statistics / M. Sc. (Statistics)
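The run-length metrics mentioned above (ARL, SDRL, ASS) can be estimated by Monte Carlo simulation. The sketch below does this for a generic, non-side-sensitive double sampling X-bar scheme with assumed sample sizes and control limits; the side-sensitive charting regions proposed in the dissertation are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed design parameters of a generic double sampling X-bar scheme
# (not the SSDS design from the dissertation).
n1, n2 = 5, 10          # first and second sample sizes
L1, L = 1.0, 3.5        # inner and outer limits for the first-stage statistic
L2 = 2.8                # limit for the combined second-stage statistic
MU0, SIGMA = 0.0, 1.0   # in-control mean and known standard deviation

def run_length(shift: float) -> tuple[int, int]:
    """Simulate one run; return (run length, total observations used)."""
    t, n_obs = 0, 0
    while True:
        t += 1
        x1 = rng.normal(MU0 + shift * SIGMA, SIGMA, n1)
        n_obs += n1
        z1 = np.sqrt(n1) * (x1.mean() - MU0) / SIGMA
        if abs(z1) <= L1:
            continue                      # in control, next sampling time
        if abs(z1) > L:
            return t, n_obs               # immediate out-of-control signal
        # Warning region: take the second sample and combine both samples.
        x2 = rng.normal(MU0 + shift * SIGMA, SIGMA, n2)
        n_obs += n2
        z12 = np.sqrt(n1 + n2) * (np.concatenate([x1, x2]).mean() - MU0) / SIGMA
        if abs(z12) > L2:
            return t, n_obs               # out-of-control signal at stage two

def summarize(shift: float, reps: int = 2000) -> None:
    results = np.array([run_length(shift) for _ in range(reps)])
    rl, n_used = results[:, 0], results[:, 1]
    print(f"shift={shift}: ARL={rl.mean():.1f}, SDRL={rl.std(ddof=1):.1f}, "
          f"ASS={(n_used / rl).mean():.2f}")

summarize(0.0)   # in-control performance
summarize(1.0)   # performance under a one-sigma shift in the mean
```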
108

3D shape measurements with a single interferometric sensor for in-situ lathe monitoring

Kuschmierz, R., Huang, Y., Czarske, J., Metschke, S., Löffler, F., Fischer, A. 29 August 2019 (has links)
Temperature drifts, tool deterioration, unknown vibrations and spindle play are major effects which decrease the achievable precision of computer numerically controlled (CNC) lathes and lead to shape deviations between the processed workpieces. Since currently no measurement system exists for fast, precise, in-situ 3D shape monitoring with keyhole access, much effort has to be made to simulate and compensate these effects. We therefore introduce an optical interferometric sensor for absolute 3D shape measurements, which was integrated into a working lathe. In accordance with the spindle rotation speed, a measurement rate of 2,500 Hz was achieved. In-situ absolute shape, surface profile and vibration measurements are presented. While thermal drifts of the sensor led to errors of several µm for the absolute shape, reference measurements with a coordinate measuring machine show that the surface profile could be measured with an uncertainty below one micron. Additionally, a spindle play of 0.8 µm was measured with the sensor.
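To relate the stated measurement rate to the angular sampling of the rotating workpiece, a small worked calculation is given below. The spindle speed of 600 rpm is an assumed example value, not a figure reported in the abstract.

```latex
% Samples per work-piece revolution for measurement rate f_m and spindle
% speed n (assumed: n = 600 rpm):
N_{\text{rev}} = \frac{f_m}{n/60}
               = \frac{2500\ \text{Hz}}{600\ \text{rpm}/60}
               = 250\ \text{samples per revolution}
% i.e. roughly one shape sample every 360^\circ / 250 \approx 1.4^\circ.
```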
109

Optimisation of methane production from anaerobically digested cow slurry using mixing regime and hydraulic retention time

Hughes, Kevin Lewis William January 2015 (has links)
Anaerobic digestion (AD) is regarded as a sustainable technology that could help the UK Government meet internationally agreed GHG emission targets by 2050. However, the mature status of the technology is based on expensive systems that rely on high-energy feedstocks to be profitable. Meanwhile, the natural biodegradation of cow slurry is a recognised contributor to climate change because of the large volumes produced, despite its relatively low CH4 potential. Economic mixing is essential to the cost-effectiveness of farm AD, but the techniques applied are not always appropriate, as slurry is a shear-thinning, thixotropic Herschel-Bulkley fluid and therefore challenging to mix. The apparent viscosity of slurry and the shear stress induced were most influenced by solids content (exponential change) followed by temperature (linear). Most shear thinning occurred before a rising shear rate of 20 s-1 was reached, with the fluid behaving as near-Newtonian above this rate. Thixotropic recovery occurred within 1 hour of resting. Rheological values were also much higher than previously reported. The highest CH4 production occurred in the first 10 days of the batch process across a range of mixing regimes with different shear rates and rest periods. During fed-batch operations, changing the shear rate had a minimal effect on CH4 production at a 30-day hydraulic retention time (HRT), whereas shorter rest periods increased production. The specific CH4 production rate was highest when feeding and mixing coincided. However, when the HRT was reduced (and the organic loading rate increased), the CH4 produced by all mixed regimes increased significantly, with the highest values achieved using high-intensity mixing rested for short periods. Lower HRTs also require smaller digesters. Parasitic mixing energy invariably had the most influence on net energy production. Signs of instability were evident after 20 days at the low HRT. Significant microbial adaptation was also observed as the experiments progressed. The research outcomes demonstrate that mixing regime and HRT can be managed to maximise net energy production whilst reducing capital expenditure.
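For reference, the Herschel-Bulkley model referred to above is commonly written as below. These are the standard textbook forms; no parameter values from the thesis are reproduced here.

```latex
% Herschel-Bulkley model: shear stress \tau as a function of shear rate \dot{\gamma}
% \tau_0 : yield stress,  K : consistency index,  n : flow behaviour index
\tau = \tau_0 + K\,\dot{\gamma}^{\,n}, \qquad n < 1 \ \text{for a shear-thinning fluid}

% Corresponding apparent viscosity, which falls as the shear rate rises:
\eta_{\text{app}} = \frac{\tau}{\dot{\gamma}}
                  = \frac{\tau_0}{\dot{\gamma}} + K\,\dot{\gamma}^{\,n-1}
```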
110

Novel beam shaping and computer vision methods for laser beam welding

Mi, Yongcui January 2021 (has links)
Laser beam welding has been widely applied in different industrial sectors due to its unique advantages. However, there are still challenges, such as beam positioning in T-joint welding and gap bridging in butt joint welding, especially in the case of varying gap width along a joint. It is expected that enabling more advanced control of a welding system, and obtaining more in-depth process knowledge, could help to solve these issues. The aim of this work is to address such welding issues with a laser beam shaping technology using a novel deformable mirror together with computer vision methods, and also to increase knowledge about the benefits and limitations of this approach. Beam shaping in this work was realized by a novel deformable mirror system integrated into industrial processing optics. Together with a wavefront sensor, a controlled adaptive beam shaping system was formed with a response time of 10 ms. The process was monitored by a coaxial camera with selected filters and passive or active illumination. Conduction-mode autogenous bead-on-plate and butt joint welding experiments were used to understand the effect of beam shaping on the melt pool geometry. Circular Gaussian and elliptical Gaussian shapes, elongated transverse to and along the welding direction, were studied. In-process melt pool images and cross-section micrographs of the weld seams/beads were analyzed. The results showed that the melt pool geometry can be significantly modified by beam shaping using the deformable mirror. T-joint welding with different beam offset deviations relative to the center of the joint line was conducted to study the potential of using machine learning to track the process state. The results showed that machine learning can reach sufficient detection and estimation performance, which could also be used for on-line control. In addition, in-process, multidimensional data were accurately acquired using computer vision methods. These data reveal weaknesses in a current thermo-fluid simulation model, which in turn can help to better understand and control laser beam welding. The results obtained in this work show great potential for using the proposed methods to solve relevant challenges in laser beam welding. / Two submitted articles belong to this licentiate thesis and are not shown here.
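As a point of reference for the circular and elliptical Gaussian shapes mentioned above, a generic elliptical Gaussian irradiance model is written out below. It is a standard textbook form, not the specific intensity profiles produced by the deformable mirror in this work.

```latex
% Elliptical Gaussian irradiance distribution with total power P and 1/e^2
% radii w_x (transverse to the weld) and w_y (along the welding direction);
% w_x = w_y recovers the circular Gaussian case.
I(x, y) = \frac{2P}{\pi\,w_x w_y}\,
          \exp\!\left(-\frac{2x^{2}}{w_x^{2}} - \frac{2y^{2}}{w_y^{2}}\right)
```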
