71

Sensor Networks: Studies on the Variance of Estimation, Improving Event/Anomaly Detection, and Sensor Reduction Techniques Using Probabilistic Models

Chin, Philip Allen 19 July 2012 (has links)
Sensor network performance is governed by the physical placement of sensors and their geometric relationship to the events they measure. To illustrate this, this thesis covers the following interconnected subjects: 1) graphical analysis of the variance of the estimation error caused by the physical characteristics of an acoustic target source and its geometric location relative to sensor arrays; 2) an event/anomaly detection method for time-aggregated point sensor data using a parametric Poisson distribution data model; 3) a sensor reduction or placement technique using Bellman optimal estimates of target agent dynamics and probabilistic training data (Goode, Chin, & Roan, 2011); and 4) transforming event-monitoring point sensor data into event detection and classification of the direction of travel using a contextual, joint-probability, causal-relationship, sliding-window, and geospatial intelligence (GEOINT) method. / Master of Science
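A minimal sketch of the Poisson test described in item 2 above, assuming time-aggregated counts and a simple tail-probability threshold; the training rate, significance level, and function names are illustrative assumptions, not taken from the thesis:

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
train = rng.poisson(lam=4.0, size=500)   # background-only training counts
lam_hat = train.mean()                   # maximum-likelihood estimate of the rate

def is_event(count, alpha=1e-3):
    """Flag a time-aggregated count as anomalous when P(X >= count) < alpha."""
    return poisson.sf(count - 1, lam_hat) < alpha

print(is_event(6), is_event(18))         # a typical count vs. an unusual burst
```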
72

Monitoring energy performance in local authority buildings

Stuart, Graeme January 2011 (has links)
Energy management has been an important function of organisations since the oil crisis of the mid-1970s led to hugely increased energy costs. Although the financial costs of energy are still important, the growing recognition of the environmental costs of fossil-fuel energy is becoming more important, and legislation is also a key driver: the UK has set an ambitious greenhouse gas (GHG) reduction target of 80% of 1990 levels by 2050 in response to a strong international commitment to reduce GHG emissions globally. This work is concerned with the management of energy consumption in buildings through the analysis of energy consumption data. Buildings are a key source of emissions, with a wide range of energy-consuming equipment, such as photocopiers, refrigerators, boilers, air-conditioning plant and lighting, delivering services to the building occupants. Energy wastage can be identified through an understanding of consumption patterns and, in particular, of changes in these patterns over time. Changes in consumption patterns may have any number of causes: a fault in heating controls, a boiler or lighting replacement scheme, or a change in working practice entirely unrelated to energy management. Standard data analysis techniques such as degree-day modelling and CUSUM provide a means to measure and monitor consumption patterns. These techniques were designed for use with monthly billing data, whereas modern energy metering systems automatically generate data at half-hourly or better resolution; standard techniques are not designed to capture the detailed information contained in this comparatively high-resolution data, and the introduction of automated metering also introduces the need for automated analysis. This work assumes that consumption patterns are generally consistent in the short term but will inevitably change. A novel statistical method is developed which builds automated event detection into a new consumption modelling algorithm, since understanding these changes to consumption patterns is critical to energy management. Leicester City Council has provided half-hourly data from over 300 buildings covering up to seven years of consumption (a total of nearly 50 million meter readings). Automatic event detection pinpoints and quantifies over 5,000 statistically significant events in the Leicester dataset, and it is shown that the total impact of these events is a decrease in overall consumption. Viewing consumption patterns in this way allows for a new, event-oriented approach to energy management in which large datasets are automatically and rapidly analysed to produce summary meta-data describing their salient features. These event-oriented meta-data can be used to navigate the raw data event by event and are highly complementary to strategic energy management.
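The degree-day and CUSUM techniques mentioned above can be sketched in a few lines: fit consumption against heating degree-days over a baseline period, then accumulate the residuals so that a persistent drift reveals a change in the consumption pattern. The synthetic data and the simulated saving event below are invented for illustration and are not the Leicester dataset:

```python
import numpy as np

rng = np.random.default_rng(1)
dd = rng.uniform(0, 15, size=36)                      # monthly heating degree-days
kwh = 200 + 30 * dd + rng.normal(0, 20, size=36)      # synthetic consumption
kwh[24:] -= 150                                       # simulated saving event at month 24

A = np.column_stack([np.ones(24), dd[:24]])           # fit on the baseline period only
(base, slope), *_ = np.linalg.lstsq(A, kwh[:24], rcond=None)
residuals = kwh - (base + slope * dd)
cusum = np.cumsum(residuals)                          # drifts once the pattern changes
print(f"baseline {base:.0f} kWh + {slope:.1f} kWh per degree-day")
print("CUSUM tail:", np.round(cusum[-3:]))            # strong negative drift after the event
```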
73

Sistema para monitoramento e análise de paisagens acústicas submarinas. / System for monitoring and analysing underwater acoustic landscapes.

Alvarez Rosario, Alexander 14 October 2015 (has links)
Underwater Passive Acoustic Monitoring (PAM) refers to the use of systems that listen to and record the underwater soundscape in order to detect, track and identify sound sources through the pressure waves they produce. It is called passive because such systems only listen, without disturbing the existing acoustic environment, unlike active systems such as sonars. Underwater PAM has many areas of application, such as military surveillance, port security, environmental monitoring, development of population density indices for species, species identification, etc. Despite its importance, national technology in this area is practically nonexistent. In this context, this work aims to contribute to the development of national technology in the field through the design, construction and operation of an autonomous PAM device and of signal processing methods for the automated detection of underwater acoustic events. A device named OceanPod was developed, featuring low manufacturing cost, flexibility and ease of configuration and use, aimed at scientific and industrial research and at environmental monitoring. Several prototypes of this device were built and used in missions at sea. These monitoring campaigns began the creation of an acoustic database, which provided the raw material for testing automated, real-time acoustic event detectors. In addition, a new method for detecting and identifying acoustic events is proposed, based on statistical analysis of the time-frequency representation of the acoustic signals. This new method was tested on the detection of cetaceans present in the database generated by the monitoring missions.
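The proposed detector rests on statistical analysis of the time-frequency representation; as a rough stand-in (not the actual OceanPod algorithm), one can threshold band energy in a spectrogram against a robust noise baseline. The signal, band limits, and threshold factor are illustrative assumptions:

```python
import numpy as np
from scipy.signal import spectrogram

fs = 8000
t = np.arange(0, 5, 1 / fs)
x = 0.05 * np.random.default_rng(2).normal(size=t.size)               # ambient noise
x[2 * fs:2 * fs + 4000] += 0.5 * np.sin(2 * np.pi * 900 * t[:4000])   # a 0.5 s "whistle"

f, frames, Sxx = spectrogram(x, fs=fs, nperseg=512)
band = Sxx[(f > 500) & (f < 1500)].sum(axis=0)    # energy in the band of interest
baseline = np.median(band)                        # robust estimate of the noise level
hits = frames[band > 10 * baseline]
print(f"event detected around t = {hits.min():.2f} to {hits.max():.2f} s")
```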
74

Distributed Energy-Efficient Solutions for Area Coverage Problems in Wireless Sensor Networks

Vu, Chinh Trung 11 June 2009 (has links)
Wireless sensor networks (WSNs) have recently attracted a great deal of attention due to their numerous attractive applications in many different fields. Sensors and WSNs possess a number of special characteristics that make them very promising in a wide range of applications, but these also impose constraints that make issues in sensor networks particularly challenging. These issues include topology control, routing, coverage, security, data management and many others. Among them, coverage is one of the most fundamental: a WSN must watch over an environment such as a forest (area coverage) or a set of subjects such as a collection of precious Renaissance paintings (target or point coverage) so that the network can collect environmental parameters and, possibly, monitor the environment further. In this dissertation, we focus on the area coverage problem. Making no assumption about sensors' locations (i.e., the sensor network is randomly deployed), we consider only distributed and parallel scheduling methods with the ultimate objective of maximizing network lifetime, and the proposed solutions (algorithms, a scheme, and a framework) must be energy-efficient. We investigate numerous generalizations and variants of the basic coverage problem, including k-coverage, composite event detection, partial coverage, and coverage for networks with adjustable sensing ranges. Various algorithms are proposed, and a scheme and a framework are also suggested, to solve these problems. The scheme, designed for emergency alarming applications, specifies guidelines for data and communication patterns that significantly reduce energy consumption and guarantee very low notification delay. For the partial coverage problem, we propose a universal framework (consisting of four strategies) which can take almost any complete-coverage algorithm as input and generate an algorithm for partial coverage. Among the four strategies, two pairs trade off network lifetime against coverage uniformity. Extensive simulations are conducted to validate the efficiency of each of our proposed solutions.
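As a toy illustration of the area-coverage objective (not of the dissertation's distributed scheduling algorithms), the sketch below estimates the covered fraction of a unit region for a random deployment and counts sensors whose disks are entirely overlapped by other sensors and could therefore sleep for a round; the radius and deployment density are invented:

```python
import numpy as np

rng = np.random.default_rng(3)
sensors = rng.uniform(0, 1, size=(40, 2))   # random deployment in the unit square
r = 0.15                                    # sensing radius
pts = rng.uniform(0, 1, size=(20000, 2))    # Monte Carlo sample of the area

d = np.linalg.norm(pts[:, None, :] - sensors[None, :, :], axis=2)
covered = d < r                             # (point, sensor) coverage matrix
print(f"estimated area coverage: {covered.any(axis=1).mean():.3f}")

redundant = 0
for i in range(sensors.shape[0]):
    pts_i = covered[covered[:, i]]          # sample points inside sensor i's disk
    if len(pts_i) == 0 or (pts_i.sum(axis=1) >= 2).all():
        redundant += 1                      # every such point has another coverer
print(f"{redundant} sensors are fully overlapped and could sleep this round")
```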
75

Τεχνικές επεξεργασίας ψηφιακού σεισμικού σήματος για χρήση στην τομογραφία υψηλής ανάλυσης / Digital seismic signal processing techniques for high-resolution tomography

Λόης, Αθανάσιος 16 May 2014 (has links)
The problems of seismic event detection and P- and S-phase arrival time estimation are important and vital tasks for geoscientists. Their solution provides important geophysical and seismological information that can be used in a number of problems, such as the structure of the earth's interior, geotectonic settings, the hypocentral and epicentral coordinates of an earthquake, the seismicity of an area, and seismic hazard assessment. Traditionally, human experts have carried out these tasks; nevertheless, during the last three decades, thanks to progress in computer technology, several methods have been developed for automatic seismic event detection and P- and S-phase identification. After the introduction in the first chapter, the second chapter gathers and categorizes the majority of the existing methods developed and applied to date. These methods involve energy criteria, the polarization of seismic waves, artificial neural networks, autoregressive time-series analysis, higher-order statistics, maximum-likelihood methods, fuzzy logic, wavelet methods, etc. In the third chapter, a new thresholding-type technique is proposed, tailored to real-world situations where the statistical characteristics of the background noise process are unknown and a strict hypothesis-testing framework cannot be followed. In such cases, replacing the unknown probability density function under the null hypothesis by its empirical counterpart is a possibility. A two-stage procedure is proposed: the first stage estimates the empirical density functions of the noise process itself as well as of its whitened counterpart; in the second stage, using these empirical functions, a thresholding scheme is proposed for detecting seismic events in a non-strict hypothesis-testing framework. The performance of the proposed technique is confirmed by a series of experiments on both synthetic and real seismic datasets, and it is compared with the well-known short-term-average to long-term-average ratio (STA/LTA) algorithm. In the fourth chapter, the problem of automatic P-phase identification is addressed using higher-order statistics (HOS). First- and second-order statistics (such as the mean value, variance, autocorrelation, and power spectrum) are extensively used in signal processing to describe linear and Gaussian processes; in practice, many processes deviate from linearity and Gaussianity, and higher-order statistics can be used to study such processes. The P-phase arrival time is estimated using these HOS parameters and, additionally, an estimate of negentropy defined as a linear combination of skewness and kurtosis. In the implemented algorithm, a moving window "slides" over the recorded signal, estimating skewness, kurtosis, and negentropy. Skewness can be considered a measure of the symmetry of a distribution, while kurtosis measures the heaviness of its tails, so both are suitable for detecting parts of the signal that do not follow the amplitude distribution of the ambient noise. Seismic events have higher amplitudes than the seismic noise, and these higher values occupy the tails of the distribution (a high degree of asymmetry); in the presence of a seismic event, skewness and kurtosis obtain high values, presenting maxima at the transition from ambient noise to the seismic event (the P-arrival). The proposed algorithms are applied to synthetic as well as real seismic data and compared with well-known energy-based methods. Automatic S-onset time identification is a topic of ongoing research: modern dense seismic networks used for earthquake location, seismic tomography, source studies, early warning, etc., demand accurate automatic S-wave picking, and most techniques proposed to date are based mainly on the polarization features of the seismic waves. In the fifth chapter, a new time-domain method for the automatic determination of S-phase onsets is proposed and its implementation on local earthquake data is presented. Eigenvalue analysis is performed over small time intervals, and the maximum eigenvalue obtained at each step is retained; in this way a time series of maximum eigenvalues is formed, which serves as a characteristic function. A first estimate of the S-phase arrival time is obtained by applying the kurtosis criterion to this characteristic function. Furthermore, a multi-window approach combined with an energy-based weighting scheme is applied in order to reduce the algorithm's dependence on the length of the moving window and to provide a weighted S-phase onset. Automatic picks are compared against manual reference picks, and the proposed technique is subjected to a noise-robustness test. In the sixth chapter, the results of applying the proposed techniques to microseismic data are presented. One dataset was recorded during a Passive Seismic Tomography (PST) experiment and a second during the seismic monitoring of hydraulic fracturing operations; both experiments took place in a hydrocarbon field in Delvina, SW Albania. The results are analysed on the basis of the arrival times and the uncertainty indices assigned by human analysts, as well as the corresponding signal-to-noise ratios of the seismic records. Finally, the seventh chapter concludes the work, and possible future extensions are discussed.
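The kurtosis-based picking idea of the fourth chapter can be sketched as follows; the synthetic trace, window length, and alignment rule are illustrative assumptions rather than the thesis's exact algorithm:

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(4)
onset = 600                                            # true arrival sample
x = rng.normal(0, 1, 1200)                             # Gaussian ambient noise
x[onset:] += 6 * np.exp(-np.arange(600) / 80) * rng.normal(0, 1, 600)  # event coda

win = 100                                              # moving-window length
k = np.array([kurtosis(x[i:i + win]) for i in range(len(x) - win)])
# Take the onset where the characteristic function rises fastest; adding `win`
# aligns the window-start index with the sample that just entered the window.
pick = int(np.argmax(np.diff(k))) + win
print(f"picked P-onset near sample {pick} (true onset {onset})")
```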
76

Diffusion de l’information dans les médias sociaux : modélisation et analyse / Information diffusion in social media : modeling and analysis

Guille, Adrien 25 November 2014 (has links)
Social media have greatly modified the way we produce, diffuse and consume information, and have become powerful information vectors. The goal of this thesis is to help in the understanding of the information diffusion phenomenon in social media by providing means of modeling and analysis. First, we propose MABED (Mention-Anomaly-Based Event Detection), a statistical method for automatically detecting the events that most interest social media users from the stream of messages they publish. In contrast with existing methods, it does not focus solely on the textual content of messages but also leverages the frequency of the social interactions that occur between users. MABED also differs from the literature in that it dynamically estimates the period of time during which each event is discussed, rather than assuming a fixed duration predefined for all events. Second, we propose T-BASIC (Time-Based ASynchronous Independent Cascades), a probabilistic model based on the network structure underlying social media for predicting information diffusion, more specifically the evolution over time of the number of users who relay a given piece of information. In contrast with similar models that are also based on the network structure, the probability that a piece of information propagates from one user to another is not fixed but depends on time. We also describe a procedure for inferring the latent parameters of the model, whose originality is to formulate the parameters as functions of observable characteristics of social media users. Third, we propose SONDY (SOcial Network DYnamics), a free and extensible software package that implements state-of-the-art methods for mining the data generated by social media, i.e. the messages published by users and the structure of the social network that interconnects them. As opposed to existing academic tools that focus either on analyzing messages or on analyzing the network, SONDY permits the joint analysis of these two types of data through the analysis of influence with respect to each detected event. The experiments, conducted on data collected on Twitter, demonstrate the relevance of our proposals and shed light on properties that give us a better understanding of the mechanisms underlying information diffusion. First, we compare the performance of MABED against methods from the literature and find that taking into account the frequency of social interactions between users leads to more accurate event detection and improved robustness in the presence of noisy content. We also show that MABED helps with the interpretation of detected events by providing clearer textual descriptions and more precise temporal descriptions. Second, we demonstrate the relevance of the procedure we propose for estimating the pairwise diffusion probabilities on which T-BASIC relies, by illustrating the predictive power of the selected users' characteristics and by comparing the performance of the proposed estimation method against state-of-the-art methods. We show the importance of having non-constant diffusion probabilities, which allows T-BASIC to incorporate the variation of users' level of receptivity over time, and we study how, and to what extent, the social, topical and temporal characteristics of users impact information diffusion. Third, we illustrate with various scenarios the usefulness of SONDY, both for non-experts, thanks to its advanced user interface and adapted visualizations, and for researchers, thanks to its application programming interface.
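T-BASIC's central assumption, that the pairwise diffusion probability varies with time, can be illustrated with a toy independent-cascade simulation; the random graph, the exponential decay, and all parameters below are invented for illustration and are not the values inferred in the thesis:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200
adj = rng.random((n, n)) < 0.03          # random directed follower graph

def p(t, p0=0.4, tau=3.0):
    """Diffusion probability decays as users' receptivity fades over time."""
    return p0 * np.exp(-t / tau)

active, newly = {0}, {0}                 # user 0 publishes the information
volume = [1]
for t in range(1, 15):
    nxt = set()
    for u in newly:                      # only newly activated users try to spread
        for v in np.flatnonzero(adj[u]):
            if v not in active and rng.random() < p(t):
                nxt.add(int(v))
    active |= nxt
    newly = nxt
    volume.append(len(active))
print("cumulative number of relaying users per step:", volume)
```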
77

Heterogeneous Multiscale Change-Point Inference and its Application to Ion Channel Recordings

Pein, Florian 20 October 2017 (has links)
No description available.
79

Método para detecção e compensação dos efeitos causados pela saturação dos TCs de proteção com meios adaptativos para mitigação da influência do ruído e dos desvios de frequência / Method for detecting and compensating the effects of protection CT saturation, with adaptive means to mitigate the influence of noise and frequency deviations

Schettino, Bruno Montesano 08 December 2015 (has links)
This work proposes a method for detecting the saturation of the cores of current transformers (CTs) used in power system protection, and for compensating its effects by correcting the secondary current signal distorted by saturation. Signal processing techniques based on the second-order Savitzky-Golay differentiator filter are used to locate the transition points between the parts of the current signal distorted and not distorted by saturation. An estimation process based on the least-squares criterion, using only signal samples from the undistorted regions, then extracts the parameters needed to correct the signal. The influences of noise and of deviations in the power system's operating frequency were analysed, and adaptive means to mitigate their effects were developed and incorporated. The algorithms were implemented in MATLAB, and their performance was evaluated using signals taken from fault simulations of a system modelled on a real-time digital simulator (RTDS). The results indicate that the proposed method achieves satisfactory performance regardless of the CT parameters and across a wide range of fault scenarios. Moreover, the method proved robust to noise and effective in mitigating the errors caused by frequency deviations. The technical and computational resources required for its execution indicate that the proposed method can be implemented in the protection devices currently offered by industry.
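A minimal sketch of the detection-and-correction idea, assuming a clipped sinusoid as a crude stand-in for CT saturation; the thresholds and the flat-top shortcut are illustrative simplifications, not the thesis's adaptive method:

```python
import numpy as np
from scipy.signal import savgol_filter

fs, f0 = 10000, 60                           # sampling rate (Hz), fundamental (Hz)
t = np.arange(0, 0.05, 1 / fs)
i_true = 10 * np.sin(2 * np.pi * f0 * t)     # ideal secondary current
i_sat = np.clip(i_true, -6, 6)               # crude stand-in for saturation distortion

# Second-order Savitzky-Golay differentiator: transitions into and out of the
# saturated stretches show up as spikes in the second derivative.
d2 = savgol_filter(i_sat, window_length=11, polyorder=2, deriv=2)
spikes = np.abs(d2) > 0.5 * np.abs(d2).max()
distorted = np.convolve(spikes.astype(float), np.ones(9), mode="same") > 0
distorted |= np.abs(i_sat) > 0.95 * np.abs(i_sat).max()  # shortcut: drop flat tops too
ok = ~distorted

# Least-squares fit of the fundamental using only the undistorted samples.
A = np.column_stack([np.sin(2 * np.pi * f0 * t[ok]), np.cos(2 * np.pi * f0 * t[ok])])
coef, *_ = np.linalg.lstsq(A, i_sat[ok], rcond=None)
print(f"recovered amplitude: {np.hypot(*coef):.2f} A (true 10 A)")
```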
80

Towards better privacy preservation by detecting personal events in photos shared within online social networks / Vers une meilleure protection de la vie privée par la détection d'événements dans les photos partagées sur les réseaux sociaux

Raad, Eliana 04 December 2015 (has links)
Today, social networking has considerably changed the way people take pictures, at any time and wherever they go. More than 500 million photos are uploaded and shared every day, along with more than 200 hours of video every minute. In particular, with the ubiquity of smartphones, social network users now take photos of events in their lives, travels, experiences, etc., and instantly upload them online. Such public data sharing puts users' privacy at risk and exposes them to surveillance that is growing at a very rapid rate. Furthermore, new techniques make it possible to extract publicly shared data and combine it with other data in ways never before thought possible. However, social network users do not realize the wealth of information that can be gathered from image data and used to track all their activities at every moment (e.g., the case of cyberstalking). In many situations where discretion is essential (politics, fraud fighting, cultural criticism, etc.), it therefore becomes extremely hard to maintain individuals' anonymity when the authors of published data need to remain anonymous. The aim of this work is thus to provide a privacy-preserving constraint (de-linkability) to bound the amount of information that can be used to re-identify individuals from online profile information. First, we provide a framework able to quantify the re-identification threat and to sanitize multimedia documents before they are published and shared. Second, we propose a new approach to enrich the profile information of the individuals to protect. To this end, we exploit the personal events in the individuals' own posts as well as those shared by their friends and contacts. Specifically, our approach is able to detect and link a user's elementary events using the photos (and related metadata) shared within their online social networks. A prototype has been implemented, and several experiments were conducted to validate our different contributions.
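A minimal sketch of grouping photos into elementary events from their metadata; the photo records, distance measure, and thresholds are invented for illustration and are not the thesis's actual event-linking model:

```python
from datetime import datetime, timedelta

# Invented photo records: timestamp plus GPS coordinates from EXIF metadata.
photos = [
    {"time": datetime(2015, 7, 14, 10, 2), "lat": 48.8584, "lon": 2.2945},
    {"time": datetime(2015, 7, 14, 10, 9), "lat": 48.8586, "lon": 2.2950},
    {"time": datetime(2015, 7, 14, 18, 30), "lat": 48.8606, "lon": 2.3376},
]

def same_event(a, b, max_gap=timedelta(hours=1), max_deg=0.005):
    """Two photos belong to one elementary event if close in time and space."""
    close_in_time = abs(a["time"] - b["time"]) <= max_gap
    close_in_space = abs(a["lat"] - b["lat"]) + abs(a["lon"] - b["lon"]) <= max_deg
    return close_in_time and close_in_space

events, current = [], [photos[0]]
for prev, cur in zip(photos, photos[1:]):
    if same_event(prev, cur):
        current.append(cur)      # extend the current event
    else:
        events.append(current)   # close it and start a new one
        current = [cur]
events.append(current)
print(f"{len(events)} elementary events detected")   # two distinct locations/times
```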
