  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
311

Multi-Sensor Data Synchronization using Mobile Phones

Wåhslén, Jonas January 2013 (has links)
Body sensor networking is a rapidly growing technology. Today wearable sensors are used to measure and monitor e.g. pulse, temperature, skin conductance, heart activity, and movement (through GPS or inertial measurement units). Mobile phones can act as coordinating nodes in wireless personal area networks used in home automation, healthcare, sport and wellness e.g. to measure pulse and distance. Integration of data from multiple sources sensors (data fusion) means that data from each sensor node needs to be associated with data from other sensor nodes sampled at approximately the same time. Accurate methods for time synchronization are therefore a necessary prerequisite for reliable data fusion. This thesis studies time synchronization problems in Bluetooth piconets between multiple wireless sensor nodes connected to a mobile phone that acts as coordinating node. Three different algorithms to enable correct data fusion have been developed, implemented and evaluated. The first is a single clock solution that synchronizes multiple wireless sensor nodes based solely on the mobile phone’s clock. The other two algorithms synchronize the clocks in sensor nodes to the clock in the coordinating node. / <p>QC 20130605</p>
312

Genauigkeitsuntersuchung von inertialen Messsensoren aus dem Niedrigpreissegment unter Nutzung verschiedener Auswertestrategien / Accuracy assessment of low-cost inertial measurement sensors using different evaluation strategies

Döhne, Thorben 20 August 2019 (has links)
Many applications on moving platforms require accurate information about the platform's orientation. Inertial measurement sensors, combined into an inertial measurement unit (IMU), are used to determine the attitude angles. This thesis examines four low-cost IMUs with respect to the achievable accuracy of the attitude angles. The IMUs under investigation are manufactured as microelectromechanical systems (MEMS), which, besides the advantages of low price, low weight, and low power consumption, carry the disadvantage of lower accuracy compared to classical IMUs. The accuracy assessment is carried out on a dataset from a flight campaign for which a reference solution is available. The IMU measurements are aided by an accurate GNSS (Global Navigation Satellite System) solution through an extended Kalman filter. In addition to the navigation solution, the sensor errors are estimated as well. Due to excessively large errors in the initial values, some of the estimates partially diverge. To solve this problem, an iterative evaluation is applied, which enables a stable solution. A further improvement is achieved through smoothing. Individual small errors in the time stamping, which strongly affect the accuracy of the solution, are compensated by interpolating the data onto regularly spaced time stamps. With this, attitude accuracies of 0.05°, 0.10°, and 0.20° for the roll, pitch, and yaw angles can be achieved on the flight lines for two of the four IMUs examined. The accuracies of the two remaining IMUs are in part considerably worse, which is attributed to inaccurate time stamping during data acquisition.
For the application of laser scanning on moving platforms, an accuracy assessment shows that height-component accuracies of better than 1 dm are possible with the attitude accuracies obtained for the two better IMUs.
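The timestamp regularisation step mentioned in the abstract (interpolating data onto regularly spaced time stamps) can be sketched as follows; the linear interpolation and the function below are illustrative assumptions, not the thesis's exact procedure.

```python
import numpy as np

def resample_regular(timestamps, samples, rate_hz):
    """Interpolate irregularly time-stamped IMU samples onto a regular
    grid, compensating for small jitter in the recorded time stamps."""
    t0, t1 = timestamps[0], timestamps[-1]
    grid = np.arange(t0, t1, 1.0 / rate_hz)
    return grid, np.interp(grid, timestamps, samples)

# Jittered ~100 Hz timestamps of a slowly varying signal (linear here,
# so linear interpolation reproduces it exactly):
t = np.array([0.0, 0.011, 0.019, 0.030, 0.041, 0.050])
x = 2.0 * t
grid, xr = resample_regular(t, x, 100.0)
```

For fast-changing gyro signals a higher-order interpolation may be preferable, but the principle of snapping samples to a uniform grid is the same.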
313

Data Augmentation and Enhancement for Cardiovascular 4D Flow MRI

Jiacheng Zhang (12455544) 25 April 2022 (has links)
Cerebral aneurysms are present in 3-5% of the population and account for approximately 10% of all strokes. The clinical decision to treat unruptured aneurysms should not be taken lightly, because a majority of asymptomatic cerebral aneurysms will not rupture, while both endovascular and microsurgical treatments carry risks of morbidity and mortality. Thus, there is a need for objective risk assessment that reliably identifies the high-risk aneurysms warranting intervention. Recent studies have found that hemodynamic metrics of blood flow, such as pressure and wall shear stress (WSS), are related to the growth and rupture of aneurysms. 4D flow magnetic resonance imaging (MRI) measures time-resolved three-dimensional velocity fields in aneurysms in vivo, allowing for the evaluation of hemodynamic parameters. This work presents the development of flow-physics-constrained data enhancement and augmentation methods for 4D flow MRI to assist the risk stratification of cerebral aneurysms. First, a phase unwrapping and denoising method is introduced that enhances the dynamic range and accuracy of 4D flow MRI velocity measurements by incorporating the divergence-free constraint of incompressible flow. Moreover, methods are developed to improve the estimation of hemodynamic parameters from 4D flow data, including pressure and WSS. The pressure reconstruction method is also applied to flow data acquired with particle image velocimetry (PIV) and particle tracking velocimetry (PTV), and shows superior performance compared to existing methods based on solving the pressure Poisson equation. We also propose a framework to estimate the uncertainty of the PIV/PTV-based pressure estimation by propagating the velocity uncertainty. In addition, a multi-modality approach is introduced that enhances the resolution and accuracy of 4D flow data with sparse representation, which improves the reliability of the hemodynamic evaluation.
Finally, we present a method to measure the left ventricular flow propagation velocity from cardiac imaging to help assess diastolic function.
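The divergence-free constraint invoked in the denoising method above can be illustrated with a short numerical check; the grid, field, and finite-difference scheme below are assumptions for illustration only. For incompressible flow the velocity divergence should vanish, so a measured field with large divergence is a candidate for correction.

```python
import numpy as np

def divergence_2d(u, v, dx, dy):
    """Central-difference divergence of a 2D velocity field (u, v):
    du/dx + dv/dy, with rows along y and columns along x."""
    dudx = np.gradient(u, dx, axis=1)
    dvdy = np.gradient(v, dy, axis=0)
    return dudx + dvdy

# A solenoidal test field: u = y, v = -x (rigid rotation) has zero divergence.
n = 32
y, x = np.mgrid[0:n, 0:n].astype(float)
u, v = y, -x
div = divergence_2d(u, v, 1.0, 1.0)
```

In a denoising scheme, the measured velocities would be projected toward the nearest field whose divergence is (numerically) zero; the check above is only the diagnostic half of that idea.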
314

Étude et conception d’un système de télésurveillance et de détection de situations critiques "par suivi actimétrique" des personnes à risques en milieu indoor et outdoor / Study and design a system for monitoring and detecting critical situations by monitoring actimetric parameters of people at risk indoor and outdoor

Bourennane, Walid 25 September 2013 (has links)
The aging of the population implies an evolution of healthcare systems in order to care for dependent people, whose proportion keeps growing. A possible solution is to develop and deploy home-care technology. In this context, this PhD work designs and tests a multisensor monitoring system: deployment of sensors in the patient's environment, multisensor data fusion, and automatic diagnostic algorithms, in order to allow patients to live in safer conditions and to help health professionals maintain the quality of follow-up and care. This manuscript first presents, through a literature review, the state of knowledge and practice in activity-analysis systems for the elderly. It then describes the design of the selected monitoring system and, following a functional approach, presents the hardware and software architecture implemented to meet the established specifications. Two projects came out of this work: 1) The Homecare project, which aims to experiment with and qualify, at an operational level, a complete telemonitoring system for elderly people suffering from Alzheimer's disease. 2) The BéA project (Bracelet Électronique d'Autonomie), which is oriented toward outdoor monitoring of able-bodied but frail people: our contribution was a system architecture that integrates a self-adaptive algorithm for detecting and modelling wandering by time slot. For both projects, we present the results and provide initial feedback from the experiments. Finally, we discuss the outlines of an economic model defining the options available today for deploying this type of system.
315

Data Fusion for the Problem of Protein Sidechain Assignment

Lei, Yang 01 January 2010 (has links) (PDF)
In this thesis, we study the problem of protein side-chain assignment (SCA) given multiple sources of experimental and modeling data. In particular, the mechanism of X-ray crystallography is re-examined using Fourier analysis, and a novel probabilistic model of the X-ray measurement is proposed for SCA decision making. The relationship between the X-ray measurements and the desired structure is reformulated in terms of the Discrete Fourier Transform (DFT). The decision making is performed by developing a new resolution-dependent electron density map (EDM) model and applying Maximum Likelihood (ML) estimation, which reduces to the Least Squares (LS) solution. Calculation of the confidence probability associated with this decision making is also given. One possible extension of this model is real-space refinement when the continuous conformational space is used. Furthermore, we present a data fusion scheme combining multiple sources of data to solve the SCA problem. The merit of our framework is its ability to exploit multiple sources of information to make decisions from a probabilistic perspective based on Bayesian inference. Although our approach targets the SCA problem, it can easily be transplanted to solving for the entire protein structure.
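The reduction of maximum-likelihood estimation to a least-squares solution under Gaussian noise, as used in the abstract above, can be sketched generically; the small linear model below is an illustrative assumption, not the thesis's EDM model.

```python
import numpy as np

# Under i.i.d. Gaussian noise, maximising the likelihood of y = A x + e
# is equivalent to minimising ||y - A x||^2, i.e. the least-squares fit.
rng = np.random.default_rng(0)
A = rng.normal(size=(50, 3))       # 50 observations of 3 parameters
x_true = np.array([1.0, -2.0, 0.5])
y = A @ x_true                     # noise-free here, for a deterministic check
x_ls, *_ = np.linalg.lstsq(A, y, rcond=None)
```

With noise added to `y`, `x_ls` would be the ML estimate rather than an exact recovery, and its covariance would supply the confidence probability the abstract mentions.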
316

Building Information Modeling Connection Recommendation Based on Machine Learning Using Multimodal Information / Byggnadsinformationsmodellering Kopplingsrekommendation baserad på maskininlärning med användning av multimodal information

Zhou, Zixin January 2023 (has links)
The increasing complexity of construction projects gives rise to the need for an efficient way of designing, managing, and maintaining structures. Building Information Modeling (BIM) facilitates these processes by providing a digital representation of physical structures. Tekla Structures (TS) has emerged as a popular BIM software package for structural design. In structural engineering, connections play an important role in joining various building objects. However, the efficient and accurate design of connections in TS remains a challenge due to the wide range of available connection types. Existing solutions for connection recommendation often rely on predefined rules, limiting their applicability and requiring time-consuming setup. Recent research has explored machine learning approaches for connection recommendation, but these suffer from scalability issues or high computational costs. This thesis treats the connection type recommendation problem in TS as a classification task, leveraging the diverse representations of BIM objects, including 2D images and attributes. It improves existing approaches for single-modality data, comparing XGBoost with random forest for attributes while enhancing a previous CNN model for image classification. Furthermore, it investigates the potential of combining image and attribute data for connection type classification, using two multimodal data fusion strategies: late fusion and intermediate fusion. The results show that XGBoost with metadata of the attribute dataset yields the best performance, with a maximum accuracy of 0.9283, and that the experimented multimodal fusion methods are unable to further improve the classification results. The accuracy of the attribute-based methods improves by up to 0.6%, and the enhanced CNN model raises classification accuracy by up to 5%. By comparing various data sources and approaches, this thesis aims to provide practical recommendations for connection design, thereby laying a foundation for better connection design processes in construction projects.
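The late-fusion strategy compared in the abstract can be sketched as averaging class-probability outputs from an image model and an attribute model; the toy probabilities below are illustrative assumptions, not the thesis's actual pipeline.

```python
import numpy as np

def late_fusion(probs_a, probs_b, w=0.5):
    """Combine the class-probability outputs of two independently trained
    models by a weighted average, then pick the most probable class
    per sample."""
    fused = w * probs_a + (1.0 - w) * probs_b
    return fused.argmax(axis=1)

# Toy outputs of an image model and an attribute model over 3 connection types:
p_img = np.array([[0.6, 0.3, 0.1],
                  [0.2, 0.5, 0.3]])
p_att = np.array([[0.5, 0.4, 0.1],
                  [0.1, 0.2, 0.7]])
pred = late_fusion(p_img, p_att)  # → [0, 2]
```

Intermediate fusion, by contrast, would merge feature representations inside a single model before the classification head rather than averaging finished predictions.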
317

Multisensor Microwave Remote Sensing in the Cryosphere

Remund, Quinn P. 14 May 2003 (has links) (PDF)
Because the earth's cryosphere influences global weather patterns and climate, the scientific community has had great interest in monitoring this important region. Microwave remote sensing has proven to be a useful tool in estimating sea and glacial ice surface characteristics, with both scatterometers and radiometers exhibiting high sensitivity to important ice properties. This dissertation presents an array of studies focused on extracting key surface features from multisensor microwave data sets. First, several enhanced-resolution image reconstruction issues are addressed. Among these are the optimization of the scatterometer image reconstruction (SIR) algorithm for NASA scatterometer (NSCAT) data, an analysis of Ku-band azimuthal modulation in Antarctica, and inter-sensor European Remote Sensing Satellite (ERS) calibration. Next, various methods for the removal of atmospheric distortions in image reconstruction of passive radiometer observations are considered. An automated algorithm is proposed which determines the spatial extent of sea ice in the Arctic and Antarctic regions from NSCAT data. A multisensor iterative sea ice statistical classification method which adapts to the temporally varying signatures of ice types is developed. The sea ice extent and classification algorithms are adapted for current SeaWinds scatterometer data sets. Finally, the automated inversion of large-scale forward electromagnetic scattering models is considered and used to study the temporal evolution of the scattering properties of polar sea ice.
318

Investigation of LIBS and Raman data analysis methods in the context of in-situ planetary exploration

Rammelkamp, Kristin 05 December 2019 (has links)
The studies presented in this thesis investigate different data analysis approaches, mainly for laser-induced breakdown spectroscopy (LIBS) but also for Raman spectroscopy, in the context of in-situ planetary exploration. Most studies were motivated by Mars exploration, owing to ChemCam, the first LIBS instrument employed extraterrestrially, on NASA's Mars Science Laboratory (MSL), which has been investigating the Martian surface since 2012, and to further LIBS and Raman instruments planned for upcoming Mars missions. Alongside analytical approaches, statistical methods known as multivariate data analysis (MVA) were applied and evaluated. Four studies are presented. In the first, LIBS data normalization with plasma parameters, namely the plasma temperature and the electron density, was studied. In the second, LIBS measurements under vacuum conditions were investigated, with a focus on the degree of ionization of the LIBS plasma. In the third, the capability of MVA methods such as principal component analysis (PCA) and partial least squares regression (PLS-R) for the identification and quantification of halogens by means of molecular emissions was tested. The outcomes are promising: it was possible to distinguish apatites and to quantify chlorine in a particular concentration range. In the fourth and last study, LIBS data was combined with complementary Raman data of Mars-relevant salts in a low-level data fusion approach using MVA methods; concepts of high-level data fusion were also implemented. Low-level LIBS and Raman data fusion identified more salts correctly than either technique alone.
However, the improvement is comparatively small given the larger amount of information in the low-level fused data, and dedicated strategies for the joint analysis of LIBS and Raman data have to be found for particular missions and scientific objectives.
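Low-level data fusion as described above amounts to joining the LIBS and Raman spectra of each sample into one feature vector before multivariate analysis; the per-block normalisation and the toy arrays below are illustrative assumptions, not the thesis's preprocessing.

```python
import numpy as np

def low_level_fusion(libs, raman):
    """Concatenate per-sample LIBS and Raman spectra (rows = samples)
    into one feature matrix for subsequent multivariate analysis.
    Each block is normalised to its total intensity so that neither
    technique dominates by scale alone."""
    libs_n = libs / libs.sum(axis=1, keepdims=True)
    raman_n = raman / raman.sum(axis=1, keepdims=True)
    return np.hstack([libs_n, raman_n])

libs = np.array([[1.0, 3.0], [2.0, 2.0]])   # 2 samples, 2 LIBS channels
raman = np.array([[4.0], [1.0]])            # 2 samples, 1 Raman channel
X = low_level_fusion(libs, raman)           # shape (2, 3)
```

High-level fusion would instead run PCA or PLS-R on each dataset separately and combine the resulting classifications, which is the alternative the study compares against.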
319

Integrated Data Fusion And Mining (IDFM) Technique For Monitoring Water Quality In Large And Small Lakes

Vannah, Benjamin 01 January 2013 (has links)
Monitoring water quality on a near-real-time basis to address water resources management and public health concerns in coupled natural systems and the built environment is by no means an easy task. Furthermore, this emerging societal challenge will continue to grow, due to the ever-increasing anthropogenic impacts upon surface waters. For example, urban growth and agricultural operations have led to an influx of nutrients into surface waters, stimulating harmful algal bloom formation, and stormwater runoff from urban areas contributes to the accumulation of total organic carbon (TOC) in surface waters. TOC in surface waters is a known precursor of disinfection byproducts in drinking water treatment, and microcystin is a potent hepatotoxin produced by the bacteria Microcystis, which can form expansive algal blooms in eutrophied lakes. Due to the ecological impacts and human health hazards posed by TOC and microcystin, it is imperative that municipal decision makers and water treatment plant operators are equipped with a rapid and economical means to track and measure these substances. Remote sensing is an emergent solution for monitoring and measuring changes to the earth's environment. This technology allows large regions anywhere on the globe to be observed on a frequent basis. This study demonstrates a prototype of a near-real-time early warning system using Integrated Data Fusion and Mining (IDFM) techniques with the aid of both multispectral (Landsat and MODIS) and hyperspectral (MERIS) satellite sensors to determine spatiotemporal distributions of TOC and microcystin. Landsat satellite imagery has high spatial resolution, but its application suffers from a long overpass interval of 16 days. On the other hand, free coarse-resolution sensors with daily revisit times, such as MODIS, are incapable of providing detailed water quality information because of low spatial resolution.
This issue can be resolved by using data or sensor fusion techniques, an instrumental part of IDFM, in which the high spatial resolution of Landsat and the high temporal resolution of MODIS imagery are fused and analyzed by a suite of regression models to optimally produce synthetic images with both high spatial and high temporal resolution. The same techniques are applied to the hyperspectral sensor MERIS with the aid of the MODIS ocean color bands to generate fused images with enhanced spatial, temporal, and spectral properties. The performance of the data mining models derived using fused hyperspectral and fused multispectral data is quantified using four statistical indices. The second task compares traditional two-band models against more powerful data mining models for TOC and microcystin prediction. The use of IDFM is illustrated for monitoring microcystin concentrations in Lake Erie (a large lake) and for TOC monitoring in Harsha Lake (a small lake). The analysis confirmed that data mining methods outperform two-band models at accurately estimating TOC and microcystin concentrations in lakes, and that the more detailed spectral reflectance data offered by hyperspectral sensors produces a noticeable increase in accuracy in the retrieval of water quality parameters.
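The fusion step described above, relating coarse MODIS reflectance to fine Landsat reflectance with regression models, can be sketched in its simplest per-band linear form; this is an illustrative assumption, not the actual IDFM model suite.

```python
import numpy as np

def fit_band_mapping(modis_band, landsat_band):
    """Fit a per-band linear mapping from coarse-resolution (MODIS)
    reflectance to fine-resolution (Landsat) reflectance on a date
    where both sensors observed the same scene."""
    slope, intercept = np.polyfit(modis_band, landsat_band, 1)
    return slope, intercept

def predict_synthetic(modis_band, slope, intercept):
    """Apply the mapping on a MODIS-only date to produce a synthetic
    fine-scale reflectance estimate."""
    return slope * modis_band + intercept

# Toy reflectances on a co-observed date (exactly linear for the check):
modis = np.array([0.10, 0.20, 0.30, 0.40])
landsat = 1.1 * modis + 0.02
s, b = fit_band_mapping(modis, landsat)
synthetic = predict_synthetic(np.array([0.25]), s, b)
```

The study's suite of regression models would replace this single linear fit with more flexible learners, but the role of the mapping, predicting fine-scale reflectance on dates where only coarse imagery exists, is the same.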
320

A basic probability assignment methodology for unsupervised wireless intrusion detection

Ghafir, Ibrahim, Kyriakopoulos, K.G., Aparicio-Navarro, F.J., Lambotharan, S., Assadhan, B., Binsalleeh, A.H. 24 January 2020 (has links)
The broadcast nature of wireless local area networks has made them prone to several types of wireless injection attacks, such as man-in-the-middle (MitM) at the physical layer, deauthentication, and rogue access point attacks. The implementation of novel intrusion detection systems (IDSs) is fundamental to providing stronger protection against these wireless injection attacks. Since most attacks manifest themselves through different metrics, current IDSs should leverage a cross-layer approach to help improve detection accuracy. The data fusion technique based on Dempster–Shafer (D-S) theory has proven to be an efficient way to implement the cross-layer metric approach. However, the dynamic generation of the basic probability assignment (BPA) values used by D-S is still an open research problem. In this paper, we propose a novel unsupervised methodology to dynamically generate the BPA values, based on the Gaussian and exponential probability density functions, the categorical probability mass function, and the local reachability density. D-S is then used to fuse the BPA values to classify whether a Wi-Fi frame is normal (i.e., non-malicious) or malicious. The proposed methodology provides a 100% true positive rate (TPR) and 4.23% false positive rate (FPR) for the MitM attack, and a 100% TPR and 2.44% FPR for the deauthentication attack, confirming the efficiency of the dynamic BPA generation methodology. / Supported in part by the Gulf Science, Innovation and Knowledge Economy Programme of the U.K. Government under UK-Gulf Institutional Link Grant IL 279339985 and in part by the Engineering and Physical Sciences Research Council (EPSRC), U.K., under Grant EP/R006385/1.
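The D-S fusion step described above can be sketched for a binary frame of discernment {normal, malicious}; the BPA values below are toy assumptions rather than the paper's dynamically generated ones. Dempster's rule multiplies the masses of intersecting hypotheses and renormalises by the non-conflicting mass.

```python
def dempster_combine(m1, m2):
    """Combine two basic probability assignments over the frame
    {'normal', 'malicious'}, with 'theta' denoting the full set
    (ignorance), using Dempster's rule of combination."""
    hyps = ['normal', 'malicious', 'theta']
    combined = {h: 0.0 for h in hyps}
    conflict = 0.0
    for a in hyps:
        for b in hyps:
            mass = m1[a] * m2[b]
            if a == b:
                combined[a] += mass
            elif a == 'theta':
                combined[b] += mass   # theta ∩ X = X
            elif b == 'theta':
                combined[a] += mass
            else:
                conflict += mass      # normal ∩ malicious = ∅
    k = 1.0 - conflict
    return {h: v / k for h, v in combined.items()}

# Two cross-layer metrics, each slightly favouring 'malicious':
m1 = {'normal': 0.2, 'malicious': 0.6, 'theta': 0.2}
m2 = {'normal': 0.1, 'malicious': 0.7, 'theta': 0.2}
fused = dempster_combine(m1, m2)  # malicious mass rises to 0.85
```

Note how fusion sharpens the verdict: each metric alone assigns malicious at most 0.7 mass, but their agreement pushes the combined mass higher, which is the behaviour the cross-layer approach exploits.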
