111

Evaluation of Impedance Control on a Powered Hip Exoskeleton

Condoor, Punith 27 October 2017 (has links) (PDF)
This thesis presents an impedance control strategy for a novel powered hip exoskeleton designed to provide partial assistance and to leverage the dynamics of human gait. The controller provides assistance as needed, as determined by the user's interaction with the exoskeleton. A series elastic element drives the exoskeleton and measures the interaction torque between the user and the device. The device operates in two modes: free mode, a low-impedance state that attempts to provide no assistance, and assist mode, which increases the controller gains to provide assistance as needed. The device was tested on five healthy subjects, and the resulting assistive hip torque was evaluated to determine the controller's ability to provide gait assistance. The device was also evaluated at different speeds to assess the controller's adaptation to gait speed. Results show that assistive hip torque ranged from 0.3 to 0.5 Nm/kg across subjects, corresponding to 24% to 40% of the maximum hip torque requirements of healthy adults during walking. The system provided an average peak power of 35 W and a maximum peak power of 45 W.
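As a rough illustration of the control idea (not the author's implementation), the sketch below shows a minimal impedance control law in Python: a virtual stiffness and damping generate a commanded torque from the tracking error, and assist mode simply uses higher gains than free mode. All gains, signal names, and the reference trajectory are hypothetical.

```python
# Minimal impedance-control sketch for a hip exoskeleton joint.
# All gains and signals are illustrative assumptions, not the thesis values.

def impedance_torque(theta, theta_dot, theta_ref, theta_ref_dot, k, b):
    """Virtual spring-damper: torque from position/velocity tracking error."""
    return k * (theta_ref - theta) + b * (theta_ref_dot - theta_dot)

# Free mode: low impedance, device tries to be transparent to the user.
K_FREE, B_FREE = 2.0, 0.1       # Nm/rad, Nm*s/rad (illustrative)
# Assist mode: higher gains so the device contributes torque as needed.
K_ASSIST, B_ASSIST = 40.0, 1.5  # Nm/rad, Nm*s/rad (illustrative)

def controller_step(mode, theta, theta_dot, theta_ref, theta_ref_dot):
    k, b = (K_ASSIST, B_ASSIST) if mode == "assist" else (K_FREE, B_FREE)
    return impedance_torque(theta, theta_dot, theta_ref, theta_ref_dot, k, b)

# Example: the user lags 0.1 rad behind the reference during swing.
tau = controller_step("assist", theta=0.2, theta_dot=1.0,
                      theta_ref=0.3, theta_ref_dot=1.2)
print(f"commanded assistive torque: {tau:.2f} Nm")
```

Because the series elastic element measures the human-robot interaction torque directly, an "assistance as needed" scheme like this can back off whenever the user is tracking the gait pattern on their own.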
112

A Magnetic Resonance Compatible Knee Extension Ergometer

Jaber, Youssef 11 July 2017 (has links) (PDF)
The product of this thesis aims to enable the study of the biochemical and physical dynamics of the lower limbs at high levels of muscle tension and fast contraction speeds. This is accomplished in part by a magnetic resonance (MR) compatible ergometer designed to apply a load, as a torque of up to 420 Nm, acting against knee extension at speeds as high as 4.7 rad/s. The system can also be adapted to apply the load as a force of up to 1200 N acting against full leg extension. The ergometer is designed to enable the use of magnetic resonance spectroscopy and imaging in a three-tesla Siemens Skyra MRI system. Because of the electromagnetic constraints on operating a device inside the magnet, the design is split into two components. One component fits inside the 70 cm bore of the scanner; it is electromagnetically passive, made of materials exhibiting minimal magnetic interference and containing no electrically powered parts. The other component is electromagnetically active; it contains all of the powered elements and actuates the passive part from another room. A tensioned cable transmits power through a waveguide, an RF-shielded pipe passing through the wall of the MRI room. The device was tested applying a sagittal-plane moment on the knee joint during isometric, isokinetic, isotonic, and constant-power contractions.
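For a sense of scale, here is a quick upper bound on the mechanical power involved, assuming (purely for illustration; the abstract only states the two maxima separately) that peak torque and peak speed could coincide:

```python
# Upper bound on knee-extension power if peak torque and peak speed coincided.
# The simultaneity of the two maxima is an assumption made for illustration.
peak_torque = 420.0   # Nm, against knee extension
peak_speed = 4.7      # rad/s
print(f"power bound: {peak_torque * peak_speed / 1000:.2f} kW")  # ~1.97 kW
```

A bound near 2 kW explains why the powered, electromagnetically active component must sit outside the magnet room and drive the passive part through a cable.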
113

Data Analytics for Statistical Learning

Komolafe, Tomilayo A. 05 February 2019 (has links)
The prevalence of big data has rapidly changed the usage and mechanisms of data analytics within organizations. Big data is a widely used term without a clear definition. The difference between big data and traditional data can be characterized by four Vs: velocity (the speed at which data is generated), volume (the amount of data generated), variety (the data can take on different forms), and veracity (the data may be of poor or unknown quality). As many industries begin to recognize the value of big data, organizations try to capture it through means such as side-channel data in a manufacturing operation, unstructured text data reported by healthcare personnel, demographic information of households from census surveys, and the range of communication data that defines communities and social networks. Big data analytics generally follows this framework: first, a digitized process generates a stream of data; this raw data stream is pre-processed to convert the data into a usable format; and the pre-processed data is analyzed using statistical tools. In this last stage, called statistical learning of the data, analysts have two main objectives: (1) develop a statistical model that captures the behavior of the process from a sample of the data, and (2) identify anomalies in the process. However, several open challenges remain in this framework. Data types such as free-text data are now being captured; although many established processing techniques exist for other data types, free-text data comes from a wide range of individuals and is subject to syntax, grammar, language, and colloquialisms that require substantially different processing approaches. Once the data is processed, open challenges remain in the statistical learning step: developing a model that highlights general patterns in the data, and creating a signaling mechanism to identify outliers. Statistical modeling is widely utilized: researchers have created a variety of statistical models to explain everyday phenomena such as energy usage behavior, traffic patterns, and stock market behavior, among others. However, new applications of big data with increasingly varied designs present interesting challenges. Consider the free-text example posed above: there is renewed interest in modeling free-text narratives from sources such as online reviews, customer complaints, or patient safety event reports into intuitive themes or topics. As previously mentioned, documents describing the same phenomena can vary widely in their word usage and structure. Another recent interest in statistical learning is using the environmental conditions in which people live, work, and grow to infer their quality of life. It is well established that social factors play a role in overall health outcomes; however, clinical application of these social determinants of health is a recent and open problem. These are just a few of many examples in which new applications of big data pose complex challenges requiring thoughtful and inventive approaches to processing, analyzing, and modeling data. Although a large body of research exists in the area of anomaly detection, increasingly complicated data sources (such as side-channel data or network-based data) present equally convoluted challenges.
For effective anomaly detection, analysts define parameters and rules so that, when large collections of raw data are aggregated, pieces of data that do not conform are easily noticed and flagged. In this work, I investigate the different steps of the data analytics framework and propose improvements for each step, paired with practical applications, to demonstrate the efficacy of my methods. This work focuses on the healthcare, manufacturing, and social-networking industries, but the materials are broad enough to have wide applications across data analytics generally. My main contributions can be summarized as follows:

• In the big data analytics framework, raw data initially goes through a pre-processing step. Although many pre-processing techniques exist, several challenges remain in pre-processing text data, and I develop a pre-processing tool for text data (a sketch of such a tool appears after this abstract).
• In the next step of the data analytics framework, there are challenges in both statistical modeling and anomaly detection.
  o I address the research area of statistical modeling in two ways:
    - There are open challenges in defining models to characterize text data. I introduce a community extraction model that autonomously aggregates text documents into intuitive communities/groups.
    - In health care, it is well established that social factors play a role in overall health outcomes; however, developing a statistical model that characterizes these relationships is an open research area. I developed statistical models for generalizing relationships between the social determinants of health of a cohort and general medical risk factors.
  o I address the research area of anomaly detection in two ways:
    - A variety of anomaly detection techniques already exist; however, some of these methods lack a rigorous statistical investigation, making them ineffective for a practitioner. I identify critical shortcomings of a proposed network-based anomaly detection technique and introduce methodological improvements.
    - Manufacturing enterprises, which are now more connected than ever, are vulnerable to anomalies in the form of cyber-physical attacks. I developed a sensor-based side-channel technique for anomaly detection in a manufacturing process.

/ PHD / The prevalence of big data has rapidly changed the usage and mechanisms of data analytics within organizations. The fields of manufacturing and healthcare are two examples of industries currently undergoing significant transformations due to the rise of big data. The addition of large sensory systems is changing how parts are manufactured and inspected, and the prevalence of Health Information Technology (HIT) systems is changing the way healthcare services are delivered. These industries are turning to big data analytics in the hope of acquiring many of the benefits other sectors are experiencing, including reduced cost, improved safety, and boosted productivity. However, many challenges exist along the big data analytics framework, from pre-processing raw data, to statistical modeling of the data, to identifying anomalies present in the data or process. This work offers significant contributions in each of these areas and includes practical real-world applications. Big data analytics generally follows this framework: first, a digitized process generates a stream of data; this raw data stream is pre-processed to convert the data into a usable format; and the pre-processed data is analyzed using statistical tools. In this stage, called 'statistical learning of the data', analysts have two main objectives: (1) develop a statistical model that captures the behavior of the process from a sample of the data, and (2) identify anomalies or outliers in the process. In this work, I investigate the different steps of the data analytics framework and propose improvements for each step, paired with practical applications, to demonstrate the efficacy of my methods. This work focuses on the healthcare and manufacturing industries, but the materials are broad enough to have wide applications across data analytics generally. My main contributions can be summarized as follows:

• In the big data analytics framework, raw data initially goes through a pre-processing step. Although many pre-processing techniques exist, several challenges remain in pre-processing text data, and I develop a pre-processing tool for text data.
• In the next step of the data analytics framework, there are challenges in both statistical modeling and anomaly detection.
  o I address the research area of statistical modeling in two ways:
    - There are open challenges in defining models to characterize text data. I introduce a community extraction model that autonomously aggregates text documents into intuitive communities/groups.
    - In health care, it is well established that social factors play a role in overall health outcomes; however, developing a statistical model that characterizes these relationships is an open research area. I developed statistical models for generalizing relationships between the social determinants of health of a cohort and general medical risk factors.
  o I address the research area of anomaly detection in two ways:
    - A variety of anomaly detection techniques already exist; however, some of these methods lack a rigorous statistical investigation, making them ineffective for a practitioner. I identify critical shortcomings of a proposed network-based anomaly detection technique and introduce methodological improvements.
    - Manufacturing enterprises, which are now more connected than ever, are vulnerable to anomalies in the form of cyber-physical attacks. I developed a sensor-based side-channel technique for anomaly detection in a manufacturing process.
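As a loose illustration of the kind of text pre-processing step described above (not the author's actual tool), the following Python sketch normalizes free-text narratives into a clean token stream; the stopword list and normalization rules are simplified assumptions.

```python
import re

# Illustrative stopword list; a real tool would use a much larger one.
STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "was"}

def preprocess(narrative: str) -> list[str]:
    """Normalize a free-text narrative into a clean token stream."""
    text = narrative.lower()
    text = re.sub(r"[^a-z\s]", " ", text)   # drop punctuation and digits
    tokens = text.split()
    return [t for t in tokens if t not in STOPWORDS and len(t) > 2]

print(preprocess("Patient was given the wrong dose; nurse reported it at 3pm."))
# ['patient', 'given', 'wrong', 'dose', 'nurse', 'reported']
```

Real patient safety event narratives additionally require handling of misspellings, clinical abbreviations, and colloquialisms, which is exactly why generic pre-processing pipelines fall short on free text.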
114

Conception, fabrication et caractérisation d'un microphone MEMS / Conception, fabrication and characterization of a MEMS microphone

Czarny, Jaroslaw 27 January 2015 (has links)
Electret microphones dedicated to consumer electronics and medical applications (hearing aids) have reached their miniaturization limits. Since the release of the first microphone based on silicon micromachining, electret microphones have steadily been replaced by MEMS (Micro-Electro-Mechanical Systems) microphones. MEMS microphones use silicon because it provides exceptional mechanical characteristics along with good electrical properties and a mature fabrication technology. Regardless of the transduction principle (capacitive, piezoresistive, piezoelectric, optical), all of the MEMS microphones reported in the state-of-the-art literature are based on a membrane deflecting out of the plane of the base wafer, and most reported microphones, as well as all commercially available MEMS microphones, use capacitive transduction. Downscaling capacitive microphones is problematic, since the sensitivity depends on the capacitance value, which in turn depends on the membrane size. Moreover, capacitive sensors suffer from high sensitivity to parasitic capacitance and from nonlinearity. These drawbacks can be overcome by exploiting the piezoresistive properties of silicon nanowires: unlike classical piezoresistors integrated into a silicon membrane, suspended nanowires do not suffer from leakage current, and further improvement of piezoresistive detection is possible since the longitudinal piezoresistive coefficient increases in inverse proportion to the nanowire cross-section. This thesis presents a novel MEMS microphone architecture that uses microbeams deflecting in the plane of the base wafer. Signal transduction is achieved by piezoresistive nanogauges integrated in the microsystem and attached to the microbeams: acoustic pressure fluctuations deflect the microbeams, producing a stress concentration in the nanogauges. Accurate simulation of this transducer must couple the acoustic, mechanical, and electrical behavior of the system, and because of the micrometric dimensions of the MEMS acoustic system, thermal and viscous dissipative effects have to be taken into account. To reliably predict the sensor behavior, two acoustic models are used: a complete finite element model based on the full set of linearized Navier-Stokes equations, and an approximate model based on lumped elements (an equivalent circuit representation). The two models are complementary in the design process and together yield the frequency response and the noise budget of the sensor. The work is completed by a description of the technological process and the challenges related to the prototype microfabrication; two approaches to the functional characterization of the MEMS microphone are then presented, the first in an impedance tube, the second in free field.
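To illustrate what a lumped-element (equivalent-circuit) acoustic model looks like in practice, here is a minimal sketch treating one acoustic branch as a series resistance-inertance-compliance resonator; all parameter values are invented for illustration and are not those of the thesis.

```python
import numpy as np

# Lumped-element sketch: one acoustic branch modeled as a series R-L-C
# (resistance ~ viscous losses, inertance ~ moving mass, compliance ~
# stiffness). Values below are invented for illustration only.
R = 2.0e2    # acoustic resistance (Pa*s/m^3)
L = 1.0e-2   # acoustic inertance  (kg/m^4)
C = 1.0e-9   # acoustic compliance (m^3/Pa)

f = np.logspace(2, 5, 5)                 # 100 Hz .. 100 kHz
w = 2 * np.pi * f
Z = R + 1j * w * L + 1 / (1j * w * C)    # series branch impedance
U = 1.0 / Z                              # volume velocity per unit pressure

f0 = 1 / (2 * np.pi * np.sqrt(L * C))    # resonance frequency (~50 kHz here)
print(f"resonance ~= {f0:.0f} Hz")
for fi, ui in zip(f, np.abs(U)):
    print(f"{fi:9.0f} Hz  |U/P| = {ui:.3e}")
```

Such a circuit evaluates in microseconds, which is why it complements the full linearized Navier-Stokes finite element model during design iterations, with the FEM reserved for verifying the thermoviscous losses the lumped values approximate.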
115

Anwendung ultrakurzer Lichtimpulse in der digital-holographischen Interferometrie / Application of Ultrashort Light Pulses in Digital Holographic Interferometry

Hansel, Thomas 06 September 2010 (has links)
This work deals with digital holographic interferometric two-wavelength contouring of very fast moving objects and, using an ultrashort-pulse laser source, develops digital single-pulse multi-wavelength holography, a novel approach to digital holographic recording and analysis. The development of dedicated methods to shape the spectral signature of a high-power ultrashort-pulse laser source provided, for the first time, the precondition for two-wavelength contouring of highly dynamic objects. The intrinsically short exposure time of less than a picosecond makes the method robust against environmental disturbances. For simultaneous recording, the spectrally distinct holograms are spatially separated in novel interferometric setups by a specially developed principle of polarization encoding and, for the first time, stored with two synchronized cameras. The numerical reconstruction and phase-evaluation routines combined in digital holographic single-pulse multi-wavelength interferometry make two-wavelength contouring possible with more than one camera. The novel approach was successfully demonstrated on several dynamic microsystems: evaluating the spectral phase difference for objects in reflection, an accuracy of 2 µm, corresponding to one twentieth of the realized synthetic wavelength, was achieved. In digital holographic high-speed contouring in transmission, a frame rate of more than 0.4 kHz was achieved for the first time, and the interferometric range of unambiguity was extended to more than sixty times the optical wavelength. Furthermore, the developed method is not limited to the evaluation of two wavelengths; its principles allow digital four-wavelength holography. In the future, contouring with an accuracy of 10 nm over an unambiguous interferometric range of several tens of micrometers, and the investigation of processes on a picosecond time scale, will be possible.
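The key quantity in two-wavelength contouring is the synthetic wavelength Λ = λ₁λ₂/|λ₁ − λ₂|, which sets the unambiguous height range; the phase difference between the two reconstructions maps to height within Λ. A minimal sketch, with illustrative assumed wavelengths rather than the thesis values:

```python
import math

# Synthetic wavelength in two-wavelength holographic contouring.
# lam1/lam2 are illustrative assumptions, not the thesis values.
lam1, lam2 = 1030e-9, 1013e-9                 # m
lam_synth = lam1 * lam2 / abs(lam1 - lam2)    # m
print(f"synthetic wavelength: {lam_synth * 1e6:.1f} um")   # ~61.4 um

# Height from the wrapped spectral phase difference, reflection geometry
# (double pass): height = dphi * Lambda / (4*pi).
dphi = math.pi / 3
height = dphi * lam_synth / (4 * math.pi)
print(f"height for dphi = pi/3: {height * 1e6:.2f} um")
print(f"Lambda/20 accuracy scale: {lam_synth / 20 * 1e6:.2f} um")
```

Bringing the two spectral lines closer together stretches Λ (and hence the unambiguous range) at the cost of noise amplification, which is why the reported accuracy is quoted as a fraction (one twentieth) of the realized synthetic wavelength.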
116

Knowledge-Based Multidisciplinary Sizing and Optimization of Embedded Mechatronic Systems - Application to Aerospace Electro-Mechanical Actuation Systems / Aide à l'intégration des savoirs métiers pour le dimensionnement et l'optimisation multidisciplinaires de systèmes mécatroniques embarqués - Application aux systèmes d'actionnement aéronautiques à technologie électromécanique

Delbecq, Scott 29 November 2018 (has links)
The critical short-term challenge for contemporary aerospace companies is to design safe, reliable, compact, low-power-consumption, and low-environmental-impact products, driven by economic competition and the increasing expectations of customers and certification authorities. A longer-term challenge for these organizations is to preserve their knowledge and expertise heritage, which is jeopardized by the forthcoming retirement of the current generation of experts, engineers, and technicians. Undertaking these challenges is particularly intricate when the products concerned are embedded mechatronic systems such as electro-mechanical actuation systems. The design of these complex systems involves heterogeneous knowledge, owing to the interaction of multiple engineering specializations and of the physical laws that govern their behaviour. Additionally, embedded mechatronic systems are composed of several interdependent components and sub-systems, and dealing with these interdependencies remains a non-trivial and fundamental aspect of modern engineering practice; it can result in costly iterations during the design process and in non-optimal final solutions. Multidisciplinary System Design Optimization techniques provide theoretical foundations and computational tools for optimizing large multidisciplinary systems. To apply such techniques to the rapid initial sizing of mechatronic products, several tasks must be performed: modelling the design knowledge, and partitioning and coordinating the models for system performance analysis and optimization. Algebraic analysis functions are chosen to represent the design models. A new Multidisciplinary System Design Optimization formulation is proposed that converges quickly and is robust to changes of scale, and a graph-theoretic approach using symbolic manipulation is outlined to assist designers in formulating large multidisciplinary problems. A specific design methodology and its associated software framework are presented. The general objective is to enable holistic sizing of mechatronic systems, with emphasis on model reusability and rapid decision making. The methodology is illustrated on a simple aerospace actuation system example, and more complex actuation systems are then addressed: first the design of an electro-mechanical primary flight control actuation system, and subsequently an electrical thrust reverser actuation system.
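As a rough illustration of why robustness to scaling matters in such sizing problems (a sketch with invented toy models, not the thesis formulation), the snippet below normalizes design variables that span several orders of magnitude before handing them to a gradient-based optimizer:

```python
import numpy as np
from scipy.optimize import minimize

# Toy sizing problem with badly scaled variables: a motor torque (~1e2 Nm)
# and a wire diameter (~1e-3 m). Models, bounds, and constants are invented.
REF = np.array([100.0, 1e-3])         # reference values used for normalization

def mass(x_norm):
    torque, diameter = x_norm * REF   # back to physical units
    return 0.05 * torque + 4.0e5 * diameter**2   # toy mass model (kg)

def torque_requirement(x_norm):
    torque, _ = x_norm * REF
    return torque - 80.0              # >= 0 : must deliver at least 80 Nm

res = minimize(mass, x0=np.array([1.0, 1.0]),
               constraints=[{"type": "ineq", "fun": torque_requirement}],
               bounds=[(0.1, 10.0), (0.1, 10.0)])
print(res.x * REF)                    # optimized physical values
```

Working in normalized variables keeps the optimizer's finite-difference gradients comparable across disciplines, one simple reason a sizing formulation can be made robust to changes of scale.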
117

A Selective Encapsulation Solution for Packaging an Optical Micro Electro Mechanical System

Bowman, Amy Catherine 08 January 2002 (has links)
This work developed a process to provide physical, electrical, and environmental protection to the electrical lead system of an optical switch device. A literature review was conducted to find materials and processes suited to the stress-sensitive, high-voltage characteristics of many optical switch devices. An automatic dispensing dam-and-fill process and three candidate materials (two epoxies and one silicone) were selected for investigation. Experimental and analytical techniques were used to evaluate the materials, including interferometric die-warpage measurements, electrochemical migration resistance tests (ECMT), thermal cycling, and finite element analysis. The silicone dam-and-fill system was selected based on the results of the die-warpage and electrochemical migration resistance tests. A modified, selective dam-and-fill process was developed, and preliminary reliability testing was performed. The thesis provides detailed instructions for successful encapsulation of the optical switch's lead system.
118

Assessing the Spatial Accuracy and Precision of LiDAR for Remote Sensing in Agriculture

Dasika, Surya Saket 01 January 2018 (has links)
The objective of this study was to evaluate a LiDAR sensor for high-resolution remote sensing in agriculture. A linear motion system was developed to precisely control the dynamics of the LiDAR sensor, removing uncertainty in the LiDAR position and velocity while under motion. A user control interface was developed to operate the system under different velocity profiles and to log LiDAR data synchronously with the motion of the system. The LiDAR was then validated using multiple test targets under five different velocity profiles to determine the effect of sensor velocity and height above a target on measurement error. The results indicated that the velocity of the LiDAR was a significant factor affecting the error and standard deviation of the LiDAR measurements, although only by a small margin. The linear motion system was then used to model alfalfa: two plots were scanned and processed to extract height and volume, which were compared with photogrammetric and field measurements. Too few alfalfa plots were scanned to support a statistical comparison of the different methods; however, the comparison between LiDAR and photogrammetric data showed promising results that may be replicated in future work.
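As an illustration of the kind of processing implied above (not the author's code), this sketch converts a 2D LiDAR scan line into ground-frame points given the carriage position from the linear motion system; the mounting height and angle conventions are assumptions made for illustration.

```python
import numpy as np

def scan_to_points(ranges, angles, carriage_x, mount_height=2.0):
    """Convert one 2D LiDAR scan (polar) to ground-frame XYZ points.

    Assumes the scan plane is perpendicular to the direction of travel and
    that angle 0 points straight down; mount_height is an illustrative value.
    """
    y = ranges * np.sin(angles)                  # lateral offset (m)
    z = mount_height - ranges * np.cos(angles)   # height above ground (m)
    x = np.full_like(ranges, carriage_x)         # along-track position (m)
    return np.column_stack([x, y, z])

# Example: three beams of one scan taken at carriage position 1.25 m.
pts = scan_to_points(np.array([1.9, 1.8, 1.9]),
                     np.radians([-10.0, 0.0, 10.0]),
                     carriage_x=1.25)
print(pts)   # the z column gives canopy height samples
```

Because the motion system supplies carriage_x with negligible uncertainty, any residual error in the reconstructed points can be attributed to the sensor itself, which is the premise of the validation experiments.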
119

Development and Thermodynamic Analysis of an Integrated Mild/Partial Gasification Combined Cycle (IMPGC) Under Green and Brown Field Conditions With and Without Carbon Capture

Long, Henry A, III 20 December 2018 (has links)
Coal is a prominent energy source worldwide: it is affordable, abundant, and has high energy content, but it is environmentally unattractive due to its high sulfur and ash content as well as its alleged contribution to climate change. Utilizing coal in a cleaner and more efficient way has therefore become necessary. One promising clean coal technology involves fully gasifying coal into synthesis gas, cleaning it, and feeding it into a high-efficiency combined cycle, such as an Integrated Gasification Combined Cycle (IGCC). Inspired by the recent success of warm gas cleanup (WGCU), mild and partial gasification are proposed as less energy-intensive options. This Integrated Mild/Partial Gasification Combined Cycle (IMPGC) could significantly save energy and improve efficiency. The objective of this study is to investigate the capabilities of IMPGC as both a new plant and a retrofit option for traditional coal power plants, with and without carbon capture. IMPGC relies on the principles of mild and partial gasification and on the recently available WGCU technology, with the following benefits: (a) completely negating the need for syngas cooling; (b) significantly reducing the energy needed to fully thermally crack the volatiles and completely gasify the char, as in an IGCC system; (c) preserving the high-chemical-energy hydrocarbon bonds within the feedstock to allow more efficient combustion in the gas turbine; (d) reducing the size of the gasifier and piping to reduce costs; and (e) enabling retrofitting of an old coal power plant by preserving the existing equipment. The software used (Thermoflex®) was first validated against established cases from the U.S. Department of Energy. For new plants, the results show that IMPGC's efficiency is 8 percentage points (20% relative) higher than IGCC, 8 points higher than a modern subcritical Rankine cycle, and 3-4 points higher than an ultra-supercritical (USC) cycle. When retrofitting older plants, a minimum improvement of over 4 points is predicted. With carbon capture, IMPGC's efficiency becomes 10 points better than a subcritical plant and 8 points better than a USC plant. Emissions-wise, IMPGC is better than IGCC and much better than Rankine cycle plants.
120

Biocapteurs implantables pour un monitorage intracérébral minimalement invasif / Implantable biosensors for minimally invasive intracerebral monitoring

Chatard, Charles 23 November 2018 (has links)
Brain function relies on the release of molecules such as neurotransmitters and metabolites into the interstitial fluid, and the study of these molecules is essential to better understand their physiological and pathological roles. For this purpose, implantable enzymatic biosensors are a promising tool because of their ability to detect quantitatively, in real time, and in deep tissue. Depending on the dimensions of the biosensor, the impact of implantation may have considerable consequences on the chemical composition of the interstitial fluid; in addition, each implantation induces a local inflammatory response called the "foreign body reaction". Reducing these reactions is crucial for more accurate estimation of the concentrations of the molecules present in the interstitial fluid. In this sense, this manuscript presents two ways of reducing the lesional impact of biosensor implantation. First, the miniaturization of enzymatic biosensors down to external diameters below 15 µm is presented, and it is demonstrated in vivo that these ultra-miniaturized biosensors can be implanted in the brain without inducing detectable damage to tissues and blood vessels. Second, a microprobe fabricated using MEMS technologies and combining electrochemical and optical detection is developed for monitoring the peri- and intratumoral interstitial fluid of glioblastomas modified to fluoresce; by integrating two types of detection on a single micro-needle, this microprobe reduces the number of implantations required. These two miniaturization approaches open up the possibility of monitoring the chemical composition of the interstitial fluid in a less invasive way, and thus of better preserving the physiology of the studied brain tissues.
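As background on how such enzymatic biosensors are typically read out (a generic sketch, not the thesis' calibration), the current of an enzyme-coated electrode is often well described by Michaelis-Menten-type saturation, which can be inverted to estimate concentration:

```python
# Generic Michaelis-Menten-style readout for an enzymatic biosensor.
# I_MAX and K_M are illustrative calibration constants, not thesis values.
I_MAX = 5.0   # nA, saturation current
K_M = 2.0     # mM, apparent Michaelis constant

def current(concentration_mM: float) -> float:
    """Sensor current (nA) as a function of analyte concentration."""
    return I_MAX * concentration_mM / (K_M + concentration_mM)

def concentration(current_nA: float) -> float:
    """Invert the calibration to estimate concentration from current."""
    return K_M * current_nA / (I_MAX - current_nA)

i = current(1.0)                       # simulate a 1 mM sample
print(f"{i:.2f} nA -> {concentration(i):.2f} mM")
```

Shrinking the electrode to sub-15 µm diameters shrinks I_MAX along with it, so quantitative monitoring at that scale hinges on low-noise amperometry as much as on the tissue-sparing geometry.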
