51

Integrated System Model Reliability Evaluation and Prediction for Electrical Power Systems: Graph Trace Analysis Based Solutions

Cheng, Danling 14 October 2009 (has links)
A new approach to the evaluation of the reliability of electrical systems is presented, in which Graph Trace Analysis is applied to integrated system models and reliability analysis. The analysis zones are extended beyond the traditional power system functional zones. The systems are modeled using containers with iterators, where the iterators manage graph edges and are used to process through the topology of the graph. The analysis provides a means of computationally handling dependent outages and cascading failures. The effects of adverse weather, time-varying loads, equipment age, installation environment, and operating conditions are considered. Sequential Monte Carlo simulation is used to evaluate the reliability changes for different system configurations, including distributed generation and transmission lines. Historical weather records and loading are used to update the component failure rates on the fly. Simulation results are compared against historical reliability field measurements. Given a large and complex plant to operate, a real-time understanding of the networks and their situational reliability is important for operational decision support. This dissertation also introduces the use of an Integrated System Model to help operators minimize real-time problems. A real-time simulation architecture is described, which predicts where problems may occur, how serious they may be, and what their possible root causes are. / Ph. D.
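As a rough illustration of the sequential Monte Carlo reliability evaluation described above, the sketch below samples time-sequential failure and repair cycles for a few components whose failure rates are scaled up during adverse weather, and averages the resulting outage hours over many simulated years. The component names, rates, repair times, and weather multiplier are hypothetical placeholders, not values from the dissertation, and the Graph Trace Analysis machinery itself is not reproduced here.

```python
import random

# Hypothetical per-component data: base failure rate (occurrences/yr), repair time (h).
COMPONENTS = {"line_1": (0.10, 4.0), "line_2": (0.15, 6.0), "xfmr_1": (0.02, 12.0)}
HOURS_PER_YEAR = 8760

def weather_multiplier(hour):
    """Toy stand-in for historical weather records: storms raise failure rates."""
    return 10.0 if (hour % 2000) < 20 else 1.0   # ~20 stormy hours per 2000

def simulate_year(seed):
    """One sequential Monte Carlo sample year; returns total component outage hours."""
    rng = random.Random(seed)
    outage_hours = 0.0
    for name, (lam, repair_h) in COMPONENTS.items():
        hour = 0.0
        while hour < HOURS_PER_YEAR:
            # Simplification: the failure rate is frozen at its value at the start
            # of each interval rather than varying continuously with the weather.
            lam_h = lam / HOURS_PER_YEAR * weather_multiplier(hour)
            hour += rng.expovariate(lam_h)       # time to next failure (exponential)
            if hour < HOURS_PER_YEAR:
                outage_hours += repair_h
                hour += repair_h
    return outage_hours

years = 2000
avg_unavail = sum(simulate_year(s) for s in range(years)) / years
print(f"Expected outage hours/year (all components): {avg_unavail:.2f}")
```

In the dissertation's setting, the same sampling loop would be driven by the container/iterator graph traversal and by actual weather and loading records rather than the toy multiplier used here.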
52

Modelling of in-situ real-time monitoring of catalysed biodiesel production from sunflower oil using Fourier transform infrared

Mwenge, Pascal Kilunji 10 1900 (has links)
M. Tech. (Department of Chemical Engineering, Faculty of Engineering and Technology), Vaal University of Technology. / Twenty-first-century industrialisation and worldwide population growth have led to a high demand for energy. Fossil fuels are the leading contributors to global energy supply, and demand for them is correspondingly high. The depletion of global fossil fuel reserves and the air pollution these fuels cause are of concern, so eco-friendly, renewable fuels such as biodiesel are among the leading alternatives. Chromatography and spectroscopy are the most widely used analytical methods and have proven reliable, but they are time-consuming and costly, require qualified personnel and extensive sample preparation, and do not provide in-situ real-time monitoring. Fourier Transform Infrared (FTIR) spectroscopy has mainly been used for qualitative analysis of biodiesel, and little work has been reported on real-time monitoring. This study focused on modelling the in-situ real-time monitoring of biodiesel production from sunflower oil using FTIR. The first part of the study investigated the effect of catalyst ratio and methanol-to-oil ratio on biodiesel production using a central composite design (CCD). Biodiesel was produced by transesterification with sodium hydroxide as a homogeneous catalyst. A laboratory-scale reactor was used, consisting of a flat-bottom flask fitted with a reflux condenser and a hot plate with temperature, timer and stirring-rate control. Key parameters (time, temperature and mixing rate) were kept constant at 60 minutes, 60 °C and 600 RPM, respectively. The results showed that biodiesel yield depends on both catalyst ratio and methanol-to-oil ratio; the highest yield of 50.65% was obtained at a catalyst ratio of 0.5 wt% and a methanol-to-oil mole ratio of 10.5. Analysis of variance of the biodiesel yield gave an R² value of 0.8387, and a quadratic mathematical model was developed to predict the yield within the specified parameter ranges. The same set-up was used to produce biodiesel from waste margarine with potassium hydroxide (KOH) as a homogeneous catalyst. The effects of four reaction parameters were studied: methanol-to-oil ratio (3:1 to 15:1), catalyst ratio (0.3 to 1.5 wt%), temperature (30 to 70 °C) and time (20 to 80 minutes). The highest yield of 91.13% was obtained at a reaction temperature of 60 °C, a 9:1 methanol-to-oil molar ratio, a 0.9 wt% catalyst ratio and 60 minutes. The key biodiesel fuel properties were found to be within ASTM (American Society for Testing and Materials) specifications. It was concluded that waste margarine can be used as a low-cost feedstock for biodiesel production. The core of the study was performed using an EasyMax Mettler Toledo reactor equipped with a DiComp (diamond) probe. Quantitative monitoring of biodiesel production was performed by building a multivariate calibration model using the iC Quant module of the iC IR 7.0 software. Fourteen samples of known concentration, taken in duplicate, were used for model calibration and cross-validation; data were pre-processed using mean centring, variance scaling, square-root spectrum math and solvent subtraction. These pre-processing steps improved the performance indices from 7.98 to 0.0096 (RMSEC), 11.2 to 3.41 (RMSECV), 6.32 to 2.72 (RMSEP) and 0.9416 to 0.9999 (cumulative R²).
The R² values of 1 (training), 0.9918 (test) and 0.9946 (cross-validation) indicated the goodness of fit of the model. The model was tested against a univariate model; small discrepancies were observed at low concentrations, due to unmodelled intermediates, but the two agreed closely at concentrations above 18%. The software removed much of the complexity of Partial Least Squares (PLS) chemometrics. It was concluded that the model could be used to monitor the transesterification of sunflower oil at industrial and laboratory scale. Using this model with a batch reactor set-up (the EasyMax Mettler Toledo reactor), experiments were designed and monitored with the iControl software, and the results were recorded and quantified with the iC IR software based on the calibrated biodiesel monitoring model. Biodiesel production was then optimised over three key parameters (methanol-to-oil ratio, catalyst ratio and temperature) while keeping time at 60 minutes and the mixing rate at 150 RPM. The highest yield of 97.85% was obtained at 60 °C, a 0.85 wt% catalyst ratio and a methanol-to-oil mole ratio of 10.5. Analysis of variance gave values of 0.9847, 0.9674 and 0.8749 for R², adjusted R² and predicted R², respectively, and a quadratic model was developed to predict biodiesel conversion within the specified parameter ranges. Using the Arrhenius equation, the activation energy (Ea) and frequency factor were found to be 41.279 kJ·mol⁻¹ and 1.08 × 10⁻⁴ M⁻¹·s⁻¹, respectively, and the proposed kinetic model was pseudo-first-order. It was concluded that the model obtained can be used for monitoring biodiesel production at industrial and laboratory scale.
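To make the kinetic treatment above concrete, the sketch below shows how an activation energy and frequency factor can be extracted from an Arrhenius plot of pseudo-first-order rate constants. The temperature/rate-constant pairs are made-up placeholders, not the study's data.

```python
import math

# Hypothetical (temperature K, pseudo-first-order rate constant k in 1/min) pairs;
# in practice each k would come from regressing ln(1 - conversion) against time.
data = [(303.15, 0.012), (318.15, 0.025), (333.15, 0.048)]

# Arrhenius: k = A * exp(-Ea / (R*T))  =>  ln k = ln A - (Ea/R) * (1/T)
R = 8.314  # J/(mol*K)
xs = [1.0 / T for T, _ in data]
ys = [math.log(k) for _, k in data]

# Least-squares slope/intercept of ln k versus 1/T.
n = len(data)
x_bar, y_bar = sum(xs) / n, sum(ys) / n
slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
        sum((x - x_bar) ** 2 for x in xs)
intercept = y_bar - slope * x_bar

Ea = -slope * R / 1000.0        # activation energy, kJ/mol
A = math.exp(intercept)         # frequency factor, same units as k
print(f"Ea ≈ {Ea:.1f} kJ/mol, A ≈ {A:.3g} min^-1")
```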
53

Estimação de demanda trifásica em tempo real para sistemas de distribuição radiais / Real time three-phase load estimation for radial distribution feeders

Avelino, Luana Locatelli 28 June 2018 (has links)
State estimators are essential tools for the real-time monitoring of electric power systems, mainly because they enable the execution of basic functions related to system security. However, conventional state estimators developed for transmission systems cannot provide good estimates of the network state in distribution systems, because of the many particularities of these systems and the small number of real-time measurements available in them. In this context, the Real-Time Three-Phase Load Estimator (EDTTR) developed in this work, based on load estimation methods, aims to provide more accurate monitoring of the primary network of radial distribution feeders by accounting for the intrinsic characteristics of the network model. The EDTTR consists of two main stages. The first is the off-line estimation of the distribution transformer demands through a load aggregation process that uses consumer information as input data. The second stage runs in real time: the off-line estimates obtained in Stage 1 are recursively adjusted based on the few measurements available in real time.
This procedure is executed by a computationally efficient backward/forward sweep load-flow algorithm supported by a data structure called Node-Depth Encoding, which stores the network topology. The EDTTR also handles distribution transformers connected in Delta-Yn, improving the representation of the primary network; this yields a more faithful load model, helps overcome convergence problems, and contributes to the quality of the estimation process. A test platform was also developed to analyse the impact of factors that degrade the load estimation process, such as the misclassification of consumers, the poor representativeness of the load class assigned to certain consumers, non-technical losses, gross errors, and communication failures in the feeder meters. The EDTTR was validated in simulations using a real feeder from the city of Ribeirão Preto, based on data provided by the utility CPFL Paulista.
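For illustration, the sketch below implements a minimal single-phase backward/forward sweep load flow on a tiny radial feeder. The topology, impedances and loads are invented placeholders, and the actual EDTTR works on a three-phase model supported by Node-Depth Encoding rather than the plain dictionaries used here.

```python
# Minimal single-phase backward/forward sweep load flow on a small radial feeder.
# Buses are numbered so each bus's parent has a smaller index (bus 0 = substation).
parent = {1: 0, 2: 1, 3: 2}                                       # bus -> upstream bus
z = {1: 0.02 + 0.04j, 2: 0.03 + 0.05j, 3: 0.03 + 0.05j}           # branch impedance (pu)
s_load = {1: 0.02 + 0.01j, 2: 0.015 + 0.005j, 3: 0.01 + 0.004j}   # constant-power loads (pu)
v = {0: 1.0 + 0j, 1: 1.0 + 0j, 2: 1.0 + 0j, 3: 1.0 + 0j}

for _ in range(30):                                   # fixed-point iteration
    # Backward sweep: load currents, then accumulate branch currents toward the source.
    i_branch = {b: (s_load[b] / v[b]).conjugate() for b in parent}
    for b in sorted(parent, reverse=True):            # children before parents
        p = parent[b]
        if p in i_branch:
            i_branch[p] += i_branch[b]
    # Forward sweep: update voltages from the substation outward.
    for b in sorted(parent):                          # parents before children
        v[b] = v[parent[b]] - z[b] * i_branch[b]

for b in sorted(v):
    print(f"bus {b}: |V| = {abs(v[b]):.4f} pu")
```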
54

Desenvolvimento e implementação de um sistema de monitoramento em tempo real da tensão da rede com acesso remoto / Development and implementation of a real-time grid voltage monitoring system with remote access

Colnago, Guilherme Piazentini 05 October 2009 (has links)
Until recently in Brazil, power quality was associated basically with interruptions of the energy supply and with certain special industrial loads. In recent years, however, under the direction of the sector's regulatory agency, ANEEL (Agência Nacional de Energia Elétrica), and together with specialists, the power quality area began to receive significant attention, was legislated, and acquired its initial regulations. The area of Power Quality thus came to formally exist and to cover a broader set of grid phenomena and events. Motivated by this recent regulation, this work presents the design of a power quality meter. One of the goals of the meter is low cost, making it viable for large-scale use. The meter is an electronic system that digitally processes the grid voltage signals and extracts data related to power quality; these data are stored locally and later accessed remotely and sent to a database for analysis.
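As an example of the kind of processing such a meter performs, the sketch below computes per-cycle RMS voltage from sampled data and flags sags and swells against IEEE 1159-style thresholds. The sampling rate, nominal voltage and synthetic signal are assumptions for illustration, not the thesis design.

```python
import math

F_NOMINAL = 60           # Hz (Brazilian grid)
FS = 128 * F_NOMINAL     # assumed 128 samples per cycle
V_NOM_RMS = 127.0        # illustrative nominal phase voltage (V)

def per_cycle_rms(samples, samples_per_cycle=128):
    """Split the record into whole cycles and return the RMS of each cycle."""
    rms = []
    for start in range(0, len(samples) - samples_per_cycle + 1, samples_per_cycle):
        cyc = samples[start:start + samples_per_cycle]
        rms.append(math.sqrt(sum(x * x for x in cyc) / len(cyc)))
    return rms

def classify(rms_values, v_nom=V_NOM_RMS):
    """Flag sags (<0.9 pu) and swells (>1.1 pu), IEEE 1159-style thresholds."""
    events = []
    for i, r in enumerate(rms_values):
        pu = r / v_nom
        if pu < 0.9:
            events.append((i, "sag", round(pu, 3)))
        elif pu > 1.1:
            events.append((i, "swell", round(pu, 3)))
    return events

# Synthetic test: nominal sine wave with a 40% sag during cycles 5-8.
signal = []
for n in range(FS):                      # one second of data
    cycle = n // 128
    amp = V_NOM_RMS * math.sqrt(2) * (0.6 if 5 <= cycle <= 8 else 1.0)
    signal.append(amp * math.sin(2 * math.pi * F_NOMINAL * n / FS))

print(classify(per_cycle_rms(signal)))   # reports sag events for cycles 5..8
```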
56

Real time detection of airborne fungal spores and investigations into their dynamics in indoor air

Kanaani, Hussein January 2009 (has links)
Concern regarding the health effects of indoor air quality has grown in recent years, due to the increased prevalence of many diseases, as well as the fact that many people now spend most of their time indoors. While numerous studies have reported on the dynamics of aerosols indoors, the dynamics of bioaerosols in indoor environments are still poorly understood and very few studies have focused on fungal spore dynamics in indoor environments. Consequently, this work investigated the dynamics of fungal spores in indoor air, including fungal spore release and deposition, as well as investigating the mechanisms involved in the fungal spore fragmentation process. In relation to the investigation of fungal spore dynamics, it was found that the deposition rates of the bioaerosols (fungal propagules) were in the same range as the deposition rates of nonbiological particles and that they were a function of their aerodynamic diameters. It was also found that fungal particle deposition rates increased with increasing ventilation rates. These results (which are reported for the first time) are important for developing an understanding of the dynamics of fungal spores in the air. In relation to the process of fungal spore fragmentation, important information was generated concerning the airborne dynamics of the spores, as well as the part/s of the fungi which undergo fragmentation. The results obtained from these investigations into the dynamics of fungal propagules in indoor air significantly advance knowledge about the fate of fungal propagules in indoor air, as well as their deposition in the respiratory tract. The need to develop an advanced, real-time method for monitoring bioaerosols has become increasingly important in recent years, particularly as a result of the increased threat from biological weapons and bioterrorism. However, to date, the Ultraviolet Aerodynamic Particle Sizer (UVAPS, Model 3312, TSI, St Paul, MN) is the only commercially available instrument capable of monitoring and measuring viable airborne micro-organisms in real-time. Therefore (for the first time), this work also investigated the ability of the UVAPS to measure and characterise fungal spores in indoor air. The UVAPS was found to be sufficiently sensitive for detecting and measuring fungal propagules. Based on fungal spore size distributions, together with fluorescent percentages and intensities, it was also found to be capable of discriminating between two fungal spore species, under controlled laboratory conditions. In the field, however, it would not be possible to use the UVAPS to differentiate between different fungal spore species because the different micro-organisms present in the air may not only vary in age, but may have also been subjected to different environmental conditions. In addition, while the real-time UVAPS was found to be a good tool for the investigation of fungal particles under controlled conditions, it was not found to be selective for bioaerosols only (as per design specifications). In conclusion, the UVAPS is not recommended for use in the direct measurement of airborne viable bioaerosols in the field, including fungal particles, and further investigations into the nature of the micro-organisms, the UVAPS itself and/or its use in conjunction with other conventional biosamplers, are necessary in order to obtain more realistic results. 
Overall, the results obtained from this work on airborne fungal particle dynamics will contribute towards improving the detection capabilities of the UVAPS, so that it is capable of selectively monitoring and measuring bioaerosols, for which it was originally designed. This work will assist in finding and/or improving other technologies capable of the real-time monitoring of bioaerosols. The knowledge obtained from this work will also be of benefit in various other bioaerosol applications, such as understanding the transport of bioaerosols indoors.
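The deposition rates discussed above are commonly obtained in chamber studies by fitting the exponential decay of particle concentration and subtracting the air-exchange rate. The sketch below illustrates that generic calculation with made-up data; it is not necessarily the exact procedure used in this work.

```python
import math

# Hypothetical chamber decay data: time (h) vs. spore concentration (particles/L).
times = [0.0, 0.5, 1.0, 1.5, 2.0]
conc = [1200.0, 860.0, 615.0, 440.0, 318.0]
AIR_EXCHANGE_RATE = 0.5   # h^-1, assumed measured separately (e.g. tracer-gas decay)

# Total loss rate k_total from ln C(t) = ln C0 - k_total * t (least-squares slope).
xs, ys = times, [math.log(c) for c in conc]
n = len(xs)
xb, yb = sum(xs) / n, sum(ys) / n
k_total = -sum((x - xb) * (y - yb) for x, y in zip(xs, ys)) / \
          sum((x - xb) ** 2 for x in xs)

deposition_rate = k_total - AIR_EXCHANGE_RATE   # h^-1, losses to surfaces only
print(f"total decay {k_total:.2f} 1/h, deposition rate {deposition_rate:.2f} 1/h")
```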
57

Preventing Data Poisoning Attacks in Federated Machine Learning by an Encrypted Verification Key

Mahdee, Jodayree 06 1900 (has links)
Federated learning has gained attention recently for its ability to protect data privacy and distribute computing loads [1]. It overcomes the limitations of traditional machine learning algorithms by allowing computers to train on remote data inputs and build models while keeping participant privacy intact. Traditional machine learning offered a solution by enabling computers to learn patterns and make decisions from data without explicit programming. It opened up new possibilities for automating tasks, recognizing patterns, and making predictions. With the exponential growth of data and advances in computational power, machine learning has become a powerful tool in various domains, driving innovations in fields such as image recognition, natural language processing, autonomous vehicles, and personalized recommendations. In traditional machine learning, data is usually transferred to a central server, raising concerns about privacy and security: centralizing data exposes sensitive information, making it vulnerable to breaches or unauthorized access. Centralized machine learning also assumes that all data is available at a central location, which is not always practical or feasible; some data may be distributed across different locations, owned by different entities, or subject to legal or privacy restrictions. In addition, training a global model in traditional machine learning involves frequent communication between the central server and participating devices, and this communication overhead can be substantial, particularly when dealing with large-scale datasets or resource-constrained devices. / Recent studies have uncovered security issues with most federated learning models. One common false assumption in federated learning is that participants are not attackers and would not use polluted data. This vulnerability enables attackers to train their local models on polluted data and then send the polluted updates to the training server for aggregation, potentially poisoning the overall model. In such a setting, it is challenging for an edge server to thoroughly inspect the data used for model training or to supervise every edge device. This study evaluates the vulnerabilities present in federated learning, explores the various types of attacks that can occur, and presents a robust prevention scheme to address these vulnerabilities. The proposed scheme enables federated learning servers to monitor participants actively in real time and to identify infected participants by introducing an encrypted verification scheme. The thesis outlines the protocol design of this prevention scheme and presents experimental results that demonstrate its effectiveness. / Thesis / Doctor of Philosophy (PhD) / Federated learning models face significant security challenges and can be vulnerable to attacks. For instance, federated learning models assume participants are not attackers and will not manipulate the data. In reality, however, attackers can compromise the data of remote participants by inserting fake data or altering existing data, which can result in polluted training results being sent to the server. For instance, if the sample data is an animal image, attackers can modify it to contaminate the training data. This thesis introduces a robust preventive approach to counter data pollution attacks in real time. It incorporates an encrypted verification scheme into the federated learning model, preventing poisoning attacks without the need for attack-specific detection programming.
The main contribution is a detection and prevention mechanism that allows the training server to supervise training in real time and stop data modifications in each client's storage before and between training rounds. With this scheme, the training server can identify modifications in real time and remove infected remote participants.
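The abstract does not detail the verification protocol, so the sketch below is only a generic illustration of the underlying idea: a keyed digest committed over a client's local data before a training round and re-checked afterwards lets the server detect mid-round data modifications. All names and data are placeholders, not the thesis's scheme.

```python
import hmac, hashlib, os

def data_digest(key: bytes, samples: list[bytes]) -> str:
    """Keyed digest over the client's local dataset (order-sensitive)."""
    mac = hmac.new(key, digestmod=hashlib.sha256)
    for s in samples:
        mac.update(hashlib.sha256(s).digest())
    return mac.hexdigest()

# --- client side -------------------------------------------------------------
client_key = os.urandom(32)                  # in practice provisioned by the server
local_data = [b"image_001", b"image_002"]    # placeholder training records
commit_before = data_digest(client_key, local_data)

# ... training round runs; an attacker silently alters one record ...
local_data[1] = b"poisoned_image_002"
commit_after = data_digest(client_key, local_data)

# --- server-side check -------------------------------------------------------
if not hmac.compare_digest(commit_before, commit_after):
    print("Data changed between rounds: exclude this participant's update.")
```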
58

Identifiering av anomalier i COSMIC genom analys av loggar / Identification of anomalies in COSMIC through log analysis

Al-egli, Muntaher, Zeidan Nasser, Adham January 2015 (has links)
Logs are an important part of any system; they provide insight into what is happening. Analyzing logs and extracting essential information is one of the biggest trends in the IT industry: the information in logs is a valuable resource that can be used to detect anomalies and handle them before they affect the user. In this thesis we cover the basics of information retrieval and analyze exception printouts in logs from COSMIC to investigate whether it is feasible to detect anomalies using retrospective data. The thesis also gives an insight into the possibility of visualizing log data and offering a powerful search engine. For that purpose we examine three well-known applications that address the issues of centralized logging: Elasticsearch, Logstash and Kibana. In summary, our results show that it is possible to detect anomalies by applying statistical methods to both retrospective and real-time data.
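The abstract does not specify which statistical method was applied, so as a generic illustration the sketch below flags hours whose exception count exceeds a rolling mean plus three standard deviations; the counts are synthetic.

```python
from collections import deque

def detect_anomalies(hourly_counts, window=24, k=3.0):
    """Flag hours whose exception count exceeds rolling mean + k * std."""
    history = deque(maxlen=window)
    anomalies = []
    for hour, count in enumerate(hourly_counts):
        if len(history) == window:
            mean = sum(history) / window
            var = sum((c - mean) ** 2 for c in history) / window
            if count > mean + k * var ** 0.5:
                anomalies.append((hour, count))
        history.append(count)
    return anomalies

# Synthetic hourly exception counts: a stable baseline with a burst at hour 30.
counts = [12, 15, 11, 14, 13, 12, 16, 14] * 4
counts[30] = 95
print(detect_anomalies(counts))   # -> [(30, 95)]
```

The same rule can run on retrospective data (a full log archive) or incrementally on a real-time stream, which mirrors the thesis's distinction between the two.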
59

Développement d’une plateforme immunobiologique microstructurée intégrée à un microscope plasmonique pour le diagnostic de l’inflammation en temps réel / Development of microstructured immunobiological platform integrated to a novel plasmonic microscope for real-time monitoring of inflammatory reactions

Muldur, Sinan 13 December 2016 (has links)
Current state-of-the-art techniques collectively provide the information required for a thorough analysis of the cell, but they rely on different instruments and different analytical platforms. Cell-on-a-chip devices that allow real-time analysis of living cells are therefore an important tool for many biomedical research applications, such as toxicology and drug discovery. Monitoring in real time both the physical and the chemical response of live cells to specific external stimuli, using live-cell imaging, can provide a better understanding of the mechanisms and signalling pathways involved in the toxicological response. The development of such multi-analytical devices for biological analysis relies essentially on the ability to produce advanced functional surfaces that enable the controlled interaction and organisation of cells and of other entities such as antibodies and nanoparticles. A large technological effort is therefore devoted to techniques for creating functional patterns on surfaces that are often inert by nature.
In this thesis, we propose two simple and direct micro- and nano-fabrication techniques enabling the creation of cellular and sensing patterns on a non-adhesive, cell-repellent, plasma-deposited poly(ethylene oxide) ("PEO-like") coating. The first approach immobilises a microarray of extracellular matrix (ECM) molecules (cell-adhesive proteins such as fibronectin) on the PEO-like surface by physisorption, using microspotting or microcontact printing techniques. The second approach creates adhesive patterns of gold nanoparticles (Au NPs) on the surface using similar printing techniques. The immobilisation of Au NPs on the PEO-like coating does not require any prior chemical modification and is achieved by a straightforward and irreversible self-assembly technique. These gold-nanostructured surfaces were tested for protein bio-recognition analysis and as a cell culture platform. Ultimately, the platform was integrated with a novel plasmonic microscope which enabled, in preliminary experiments, the label-free real-time monitoring and visualisation of the motility, attachment and detachment of a single cell, as well as the specific and sensitive detection of test proteins in a cell-free environment.
60

Optimisation multicritère des itinéraires pour transport des marchandises dangereuses en employant une évaluation en logique floue du risque et la simulation du trafic à base d'agents / Multi-criteria route optimization for dangerous goods transport using fuzzy risk assessment and agent-based traffic simulation

Laarabi, Mohamed Haitam 15 December 2014 (has links)
Every day, thousands of trucks transport hundreds of thousands of tons of dangerous goods by various modalities, both within and across nations. The term "dangerous", however, indicates an intrinsic hazard of the products transported, which can manifest itself in an accident leading to the release of a hazardous substance (radioactive, flammable, explosive, etc.). In such a situation, the consequences can be lethal to humans and other living organisms and can damage the environment and public or private property. The importance of dangerous goods lies in the considerable economic benefits they generate: one cannot deny the contribution of the transport of fossil-fuel-derived products, which represent more than 60% of the dangerous goods transported in Europe. Eni, the leading Italian petrochemical company, operates a fleet of about 1,500 trucks every day, performing numerous trips from loading terminals to filling stations. The distribution of petroleum products is a high-risk activity, and an accident during transport can lead to serious consequences. Aware of what is at stake, the Eni R&M - Secondary Logistics division, historically based at the Genoa headquarters, has been collaborating since 2002 with the DIBRIS department of the University of Genoa and the CRC at Mines ParisTech to study possible safety improvements in the transport of dangerous goods, particularly petroleum products. Over the years, this collaboration has led to the development of several technologies, chiefly an information and decision-support system. The major component of this system is the Transport Integrated Platform (TIP), a platform for monitoring the Eni fleet at the national level as it delivers products to the distribution points. The vehicles are equipped with a device that transmits data streams in real time over a GPRS modem. The transmitted data can be of different kinds and contain information about the state of the vehicle, the product, and the events detected during the trip. These data are received by centralised servers, then processed and stored, in order to support various applications within the TIP. In this context, the studies undertaken throughout the thesis are directed towards a proposal to further reduce the risk related to the transport of dangerous goods: a trade-off-based model for route selection that takes both economic and safety factors into account. The objective is prompted by the need to complement existing regulations and safety standards, which do not provide a full guarantee against accidents involving dangerous goods. The goal is pursued by taking the existing system as the basis for developing an Intelligent Transportation System (ITS) aggregating multiple software platforms.
These platforms should allow planners and decision-makers to monitor their fleet in real time, to assess risk and evaluate all possible routes, to simulate and create different scenarios, and to assist in finding solutions to particular problems. Throughout this dissertation, I highlight the motivation for this research work, the related problem statements, and the challenges of dangerous goods transport. I introduce the TIP as the core of the proposed ITS architecture. For simulation purposes, virtual vehicles are injected into the system. The management of data collection was the subject of technical improvements for greater reliability, efficiency and scalability in the real-time monitoring of dangerous goods shipments. Finally, I present a systematic explanation of the route optimisation methodology, which considers both economic and risk criteria. Risk is assessed on the basis of various factors, notably the frequency of accidents leading to a hazardous substance release and their consequences, and uncertainty in the risk assessment is modelled using fuzzy set theory.
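As a schematic illustration of combining fuzzy risk assessment with multi-criteria route selection (not the thesis's actual model), the sketch below defuzzifies a tiny rule base over accident frequency and consequence into a crisp risk score per road segment, blends it with travel cost through a trade-off weight, and runs Dijkstra's algorithm on a toy graph. All numbers are invented.

```python
import heapq

def tri(x, a, b, c):
    """Triangular membership value of x for the fuzzy set (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_risk(freq, consequence):
    """Tiny Mamdani-style rule base, defuzzified by weighted average.
    freq and consequence are normalised to [0, 1]; output risk is in [0, 1]."""
    low_f, high_f = tri(freq, -0.5, 0.0, 0.6), tri(freq, 0.4, 1.0, 1.5)
    low_c, high_c = tri(consequence, -0.5, 0.0, 0.6), tri(consequence, 0.4, 1.0, 1.5)
    rules = [  # (firing strength, crisp risk level of the rule consequent)
        (min(low_f, low_c), 0.1),                             # both low  -> low risk
        (max(min(low_f, high_c), min(high_f, low_c)), 0.5),   # mixed     -> medium
        (min(high_f, high_c), 0.9),                           # both high -> high risk
    ]
    den = sum(w for w, _ in rules)
    return sum(w * r for w, r in rules) / den if den else 0.0

# Toy road graph: edge -> (travel cost, accident frequency, consequence), all made up.
edges = {
    ("A", "B"): (10.0, 0.2, 0.3), ("B", "D"): (12.0, 0.1, 0.2),
    ("A", "C"): (8.0, 0.7, 0.8),  ("C", "D"): (7.0, 0.6, 0.9),
}
ALPHA = 0.5  # trade-off weight between economic cost and risk

def edge_cost(e):
    cost, f, c = edges[e]
    return (1 - ALPHA) * cost + ALPHA * 20.0 * fuzzy_risk(f, c)  # 20.0 scales risk to cost units

def dijkstra(src, dst):
    graph = {}
    for (u, v) in edges:
        graph.setdefault(u, []).append(v)
    pq, best = [(0.0, src, [src])], {}
    while pq:
        d, node, path = heapq.heappop(pq)
        if node == dst:
            return d, path
        if node in best and best[node] <= d:
            continue
        best[node] = d
        for nxt in graph.get(node, []):
            heapq.heappush(pq, (d + edge_cost((node, nxt)), nxt, path + [nxt]))
    return float("inf"), []

print(dijkstra("A", "D"))   # picks the longer but safer A-B-D route
```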
