341

CircularTrip and ArcTrip: effective grid access methods for continuous spatial queries.

Cheema, Muhammad Aamir, Computer Science & Engineering, Faculty of Engineering, UNSW January 2007 (has links)
A k nearest neighbor (kNN) query q retrieves the k objects that lie closest to the query point q among a given set of objects P. With the availability of inexpensive location-aware mobile devices, the continuous monitoring of such queries has gained a lot of attention, and many methods have been proposed for continuously monitoring kNNs in highly dynamic environments. Multiple continuous queries require real-time results, and both the objects and the queries issue frequent location updates. The most popular spatial index, the R-tree, is not suitable for continuous monitoring of these queries because it handles frequent updates inefficiently. Recently, the interest of the database community has been shifting towards grid-based indexes for continuous queries because of their simplicity and efficient update handling. For kNN queries, the order in which the cells of the grid are accessed is very important. In this research, we present two efficient and effective grid access methods, CircularTrip and ArcTrip, which ensure that the number of cells visited for any continuous kNN query is minimal. Our extensive experimental study demonstrates that the CircularTrip-based continuous kNN algorithm outperforms existing approaches in terms of both efficiency and space requirements. Moreover, we show that CircularTrip and ArcTrip can be used for many other variants of nearest neighbor queries, such as constrained nearest neighbor queries, farthest neighbor queries and (k + m)-NN queries. All the algorithms presented for these queries preserve the properties that they visit the minimum number of cells for each query and that their space requirements are low. Our proposed techniques are flexible and efficient and can be used to answer any query that is a hybrid of the above-mentioned queries. For example, our algorithms can easily be used to efficiently monitor a (k + m) farthest neighbor query in a constrained region, with the flexibility that the spatial conditions constraining the region can be changed by the user at any time.
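
The abstract does not spell out how CircularTrip or ArcTrip order the cells, but the general idea behind grid-based kNN search can be illustrated with a plain ring-by-ring sweep: visit cells in order of their minimum possible distance to the query point and stop once no unvisited cell can beat the current k-th candidate. The Python sketch below is only that illustration, with a hypothetical Grid class and a square-ring traversal rather than the thesis's circular access methods.

    import heapq
    import math
    from collections import defaultdict

    class Grid:
        """Uniform grid over 2-D points; cell (i, j) covers a delta x delta square."""
        def __init__(self, delta):
            self.delta = delta
            self.cells = defaultdict(list)

        def insert(self, x, y):
            self.cells[(int(x // self.delta), int(y // self.delta))].append((x, y))

        def knn(self, qx, qy, k):
            """Visit cells ring by ring around the query cell and stop as soon as the
            next ring cannot contain anything closer than the current k-th candidate.
            Assumes the grid holds at least k points."""
            qi, qj = int(qx // self.delta), int(qy // self.delta)
            best = []                    # max-heap of (-dist, point), size <= k
            ring = 0
            while True:
                # Lower bound on the distance from q to any cell in this ring.
                ring_min = max(0.0, (ring - 1) * self.delta)
                if len(best) == k and ring_min > -best[0][0]:
                    return [p for _, p in sorted(best, key=lambda t: -t[0])]
                for i in range(qi - ring, qi + ring + 1):
                    for j in range(qj - ring, qj + ring + 1):
                        if max(abs(i - qi), abs(j - qj)) != ring:
                            continue     # only the ring boundary; inner cells were done
                        for (x, y) in self.cells.get((i, j), []):
                            d = math.hypot(x - qx, y - qy)
                            if len(best) < k:
                                heapq.heappush(best, (-d, (x, y)))
                            elif d < -best[0][0]:
                                heapq.heapreplace(best, (-d, (x, y)))
                ring += 1

    g = Grid(delta=1.0)
    for p in [(0.2, 0.3), (1.7, 0.4), (3.1, 2.2), (0.9, 2.8)]:
        g.insert(*p)
    print(g.knn(0.0, 0.0, k=2))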
342

Stochastically optimized monocular vision-based navigation and guidance

Watanabe, Yoko 07 December 2007 (has links)
The objective of this thesis is to design a relative navigation and guidance system for unmanned aerial vehicles (UAVs) for vision-based control applications. Vision-based navigation, guidance and control has been one of the most actively studied research topics in UAV automation, because in nature birds and insects use vision as their exclusive sensor for object detection and navigation. In particular, this thesis studies monocular vision-based navigation and guidance. Since 2-D vision-based measurements are nonlinear with respect to the 3-D relative states, an extended Kalman filter (EKF) is applied in the navigation system design. The EKF-based navigation system is integrated with a real-time image processing algorithm and is tested in simulations and flight tests; the first closed-loop vision-based formation flight has been achieved. In addition, vision-based 3-D terrain recovery was performed in simulations. A vision-based obstacle avoidance problem is specifically addressed in this thesis. A navigation and guidance system is designed for a UAV to track waypoints while avoiding unforeseen stationary obstacles using vision information. A 3-D collision criterion is established using a collision-cone approach. A minimum-effort guidance (MEG) law is applied in the guidance design, and it is shown that the control effort can be reduced by using the MEG-based guidance instead of a conventional guidance law. The system is evaluated in a 6-DoF flight simulation and also in a flight test. For monocular vision-based control problems, vision-based estimation performance depends strongly on the relative motion of the vehicle with respect to the target. Therefore, this thesis aims to derive an optimal guidance law that achieves a given mission under the condition of using the EKF-based relative navigation. A stochastic optimization is formulated to minimize the expected cost, including the guidance error and the control effort. A suboptimal guidance law is derived based on the idea of one-step-ahead (OSA) optimization. Simulation results show that the suggested guidance law significantly improves the guidance performance. Furthermore, the OSA optimization is generalized to n-step-ahead optimization for an arbitrary n, and its optimality and computational cost are investigated.
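
The navigation design hinges on the fact that a 2-D image measurement is a nonlinear function of the 3-D relative state, which is why an EKF is used. As a hedged illustration (not the thesis's filter), the following Python/NumPy sketch performs one EKF measurement update with a normalized pinhole-camera model h(x) = (X/Z, Y/Z); the 6-element state layout and the noise values in the example are arbitrary assumptions.

    import numpy as np

    def ekf_vision_update(x, P, z, R):
        """One EKF measurement update with a monocular (pinhole) camera model.
        The state x = [X, Y, Z, ...] holds the relative target position in the
        camera frame (plus any extra states); the measurement z = [u, v] is the
        normalized image-plane projection h(x) = (X/Z, Y/Z)."""
        X, Y, Z = x[0], x[1], x[2]
        h = np.array([X / Z, Y / Z])                  # predicted measurement
        # Jacobian of h with respect to the full state (zeros beyond position).
        H = np.zeros((2, x.size))
        H[0, 0] = 1.0 / Z
        H[0, 2] = -X / Z**2
        H[1, 1] = 1.0 / Z
        H[1, 2] = -Y / Z**2
        S = H @ P @ H.T + R                           # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
        x_new = x + K @ (z - h)                       # state update
        P_new = (np.eye(x.size) - K @ H) @ P          # covariance update
        return x_new, P_new

    # Example: 6-state [position, velocity], crude prior, one image measurement.
    x0 = np.array([10.0, -5.0, 50.0, 0.0, 0.0, 0.0])
    P0 = np.diag([25.0, 25.0, 400.0, 1.0, 1.0, 1.0])
    z = np.array([0.21, -0.09])
    R = np.diag([1e-4, 1e-4])
    x1, P1 = ekf_vision_update(x0, P0, z, R)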
344

Real-time vehicle performance monitoring with data integrity

Jenkins, William George, January 2006 (has links)
Thesis (M.S.) -- Mississippi State University. Department of Electrical and Computer Engineering. / Title from title screen. Includes bibliographical references.
345

Arquitetura para o desenvolvimento de unidades de medição fasorial sincronizada no monitoramento a nível de distribuição / Architecture for the development of synchronized phasor measurement units for distribution-level monitoring

Grando, Flavio Lori 31 March 2016 (has links)
CAPES / This work presents a low-cost architecture for the development of synchronized phasor measurement units (PMUs). The device is intended to be connected to the low-voltage grid, which allows the monitoring of transmission and distribution networks. The project comprises a complete PMU, with an instrumentation module for use on the low-voltage network, a GPS module that provides the synchronization signal and the time stamp of the measurements, a processing unit with the acquisition system, phasor estimation and data formatting according to the standard, and, finally, a communication module for data transmission. For the development and performance evaluation of this PMU, a set of applications was developed in the LabVIEW environment with specific features that allow the behaviour of the measurements to be analysed and the sources of error of the PMU to be identified, as well as applying all the tests prescribed by the IEEE C37.118.1 standard. The first application, useful for the development of the instrumentation, consists of a function generator integrated with an oscilloscope, which allows signals to be generated and acquired synchronously, in addition to the manipulation of samples. The second and main one is the test platform, capable of generating all the tests prescribed by the IEEE C37.118.1 synchrophasor standard and of storing the data or analysing the measurements in real time. Finally, a third application was developed to evaluate the test results and generate calibration curves to adjust the PMU. The results cover all the tests prescribed by the standard plus an additional test that evaluates the impact of noise. Moreover, with two prototypes connected to the electrical installations of consumers on the same distribution circuit, monitoring records were obtained that allowed the identification of consumer loads and power quality analysis, as well as the detection of events at the distribution and transmission levels.
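
The core computation of any PMU is estimating the phasor (magnitude and angle) of the fundamental from time-stamped samples. The Python sketch below shows a textbook single-cycle DFT estimator, not the architecture or estimation method actually implemented in this thesis; the 16-samples-per-cycle test signal is an arbitrary example.

    import numpy as np

    def estimate_phasor(samples, n_per_cycle):
        """Single-cycle DFT phasor estimate (magnitude in RMS, angle in radians).
        'samples' is one fundamental cycle, uniformly sampled with n_per_cycle points."""
        n = np.arange(n_per_cycle)
        # Correlate with one cycle of the fundamental (DFT bin 1).
        phasor = (np.sqrt(2) / n_per_cycle) * np.sum(
            samples * np.exp(-1j * 2 * np.pi * n / n_per_cycle))
        return np.abs(phasor), np.angle(phasor)

    # Example: 2 V peak cosine, 30 degrees phase, 16 samples per cycle.
    n_per_cycle = 16
    t = np.arange(n_per_cycle) / n_per_cycle
    wave = 2.0 * np.cos(2 * np.pi * t + np.deg2rad(30))
    mag, ang = estimate_phasor(wave, n_per_cycle)
    # mag ~= 2 / sqrt(2) ~= 1.414 (RMS), ang ~= +30 degrees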
346

Desenvolvimento de um sistema distribuído de identificação em tempo real de parâmetros de qualidade de energia elétrica / Development of a distributed system for real-time identification of electric power quality parameters

Menezes, Ramon Maciel 29 February 2012 (has links)
CNPq, CAPES / This work includes a review of power quality standards, so that the project's development follows national and international norms; the simulation of algorithms such as the CFA and the FFT, in order to verify the feasibility of their use as well as the limitations associated with processing strongly distorted waveforms; and the proposal and verification of an algorithm able to compute the indices (selected during the standards review) that assess power quality from voltage and current signals. For the prototype, reliable voltage and current sensors were selected for the acquisition system, together with a DSP that runs the previously simulated algorithms, processing the signals acquired by the sensors in real time in order to report the state of the mains grid and/or events occurring on the network through a ZigBee module, which is responsible for transmitting the data safely. The class of short-duration voltage variation events was included in the real-time processing performed by the DSP. Because of the unpredictability and short duration of these events, a tool capable of generating this class of events, the STVC generator, was developed. Real-time power quality analysis proved feasible even with low-cost devices, allowing, although with some limitations, the collection of power quality information about the conditions to which known loads were subjected.
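
Two of the indices such a power quality monitor computes, RMS value and total harmonic distortion, can be illustrated with a short Python sketch; the exact index set and limits applied in the thesis come from the reviewed standards, and the sampling parameters below are arbitrary assumptions.

    import numpy as np

    def rms(x):
        """Root-mean-square value of a sampled waveform."""
        return np.sqrt(np.mean(np.asarray(x, dtype=float) ** 2))

    def thd(x, n_per_cycle, max_harmonic=40):
        """Total harmonic distortion, assuming an integer number of cycles in x."""
        x = np.asarray(x, dtype=float)
        spectrum = np.abs(np.fft.rfft(x)) / len(x)
        cycles = len(x) // n_per_cycle          # fundamental sits at bin 'cycles'
        fund = spectrum[cycles]
        harm = spectrum[2 * cycles:(max_harmonic + 1) * cycles:cycles]
        return np.sqrt(np.sum(harm ** 2)) / fund

    # Example: fundamental plus 10% fifth harmonic, 128 samples/cycle, 10 cycles.
    n, cycles = 128, 10
    t = np.arange(n * cycles) / n
    v = np.sin(2 * np.pi * t) + 0.1 * np.sin(2 * np.pi * 5 * t)
    print(rms(v), thd(v, n))   # THD ~= 0.10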
347

Ambiente para aquisição e processamento de sinal eletromiográfico de superfície / Environment for the acquisition and processing of surface electromyography signals

Magagnin Junior, Ari 22 December 2015 (has links)
In this work, a platform for the conditioning, digitizing, visualization and recording of electromyography (EMG) signals was developed. After acquisition, the analysis can be carried out with signal processing techniques. The platform consists of two modules which acquire EMG signals through surface electrodes, limit the frequency band of interest, filter out power-grid interference and digitize the signals with the analog-to-digital converter of the modules' microcontroller. The data are then sent to the computer over a USB interface using the HID specification, displayed in real time in graphical form and stored in files. As processing resources, the platform implements the absolute value of the signal, the root-mean-square (RMS) value, Fourier analysis, an IIR digital filter and an adaptive filter. Initial evaluation tests were performed with signals from lower and upper limbs in order to compare the laterality of EMG signals. The open platform is intended for teaching activities and academic research, allowing the addition of other processing methods that the researcher wants to evaluate, or any other analyses that become necessary.
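
The processing chain named in the abstract (rectification, RMS, IIR filtering) can be sketched in a few lines of Python with SciPy; the notch quality factor and the 100 ms RMS window below are illustrative choices, not the platform's actual settings.

    import numpy as np
    from scipy.signal import iirnotch, filtfilt

    def emg_envelope(emg, fs, mains_hz=60.0, win_ms=100.0):
        """Rectify an EMG record and compute a moving-RMS envelope after
        removing power-line interference with an IIR notch filter."""
        b, a = iirnotch(w0=mains_hz, Q=30.0, fs=fs)   # mains-frequency notch
        clean = filtfilt(b, a, emg)                   # zero-phase IIR filtering
        rectified = np.abs(clean)                     # full-wave rectification
        win = max(1, int(fs * win_ms / 1000.0))
        kernel = np.ones(win) / win                   # uniform averaging window
        envelope = np.sqrt(np.convolve(clean ** 2, kernel, mode="same"))
        return rectified, envelope

    # Example: 2 s of synthetic 'EMG' (noise) plus 60 Hz hum, sampled at 1 kHz.
    fs = 1000
    t = np.arange(2 * fs) / fs
    raw = 0.3 * np.random.randn(t.size) + 0.2 * np.sin(2 * np.pi * 60 * t)
    rect, env = emg_envelope(raw, fs)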
350

Active XML Data Warehouses for Intelligent, On-line Decision Support / Entrepôts de données XML actifs pour la décision intelligente en ligne

Salem, Rashed 23 March 2012 (has links)
A decision support system (DSS) is an information system that supports decision makers involved in complex decision-making processes. Modern DSSs need to exploit data that are not only numerical or symbolic, but also heterogeneously structured (e.g., text and multimedia data) and coming from various sources (e.g., the Web). We term such data complex data. Data warehouses are usually the basis of such DSSs; they help integrate data from a variety of sources to support decision-making. However, the advent of complex data imposes another vision of data warehousing, including data integration, data storage and data analysis. Moreover, today's requirements impose integrating complex data in near real-time rather than with traditional snapshot and batch ETL (Extraction, Transformation and Loading). Real-time and near real-time processing requires a more active ETL process: data integration tasks must react in an intelligent, i.e., active and autonomous, way to changes encountered in the data integration environment, especially in the data sources. In this dissertation, we propose novel solutions for integrating complex data in near real-time, actively and autonomously. We provide a generic metadata-based, service-oriented and event-driven approach for integrating complex data. To address data complexity issues, our approach stores heterogeneous data in a unified format using a metadata-based approach and XML. We also tackle data distribution and interoperability using a service-oriented approach. Moreover, to address near real-time requirements, our approach not only stores integrated data in a unified repository, but also provides functions to integrate data on the fly. We also apply a service-oriented approach to track relevant data changes in near real-time. Furthermore, the idea of integrating complex data actively and autonomously revolves around mining the logged events of the data integration environment. To this end, we propose an incremental XML-based algorithm for mining association rules from logged events; we then define active rules on the mined data to reactivate integration tasks. To validate our approach for managing complex data integration, we developed a high-level software framework, AX-InCoDa (Active XML-based framework for Integrating Complex Data). AX-InCoDa is implemented as a Web application using open-source tools. It exploits Web standards (e.g., XML and Web services) and Active XML to handle complexity issues and near real-time requirements. Besides warehousing logged events into an event repository to be mined for self-management purposes, AX-InCoDa is enriched with active rules. AX-InCoDa's feasibility is illustrated by a healthcare case study. Finally, the performance of our incremental event-mining algorithm is demonstrated experimentally.
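
The abstract's idea of mining association rules incrementally from logged events can be illustrated with a toy, flat-transaction miner that keeps running itemset counts across event batches; the thesis's incremental XML-based algorithm is not reproduced here, and the event attributes in the example are hypothetical.

    from collections import Counter
    from itertools import combinations

    class IncrementalRuleMiner:
        """Keep running itemset counts over batches of logged events and derive
        association rules on demand (limited here to pairs, for brevity)."""
        def __init__(self, max_size=2):
            self.max_size = max_size
            self.counts = Counter()
            self.n_events = 0

        def add_batch(self, events):
            """events: iterable of item sets, e.g. {'source=db1', 'change=schema'}."""
            for items in events:
                self.n_events += 1
                for size in range(1, self.max_size + 1):
                    for itemset in combinations(sorted(items), size):
                        self.counts[itemset] += 1

        def rules(self, min_support=0.1, min_confidence=0.8):
            """Yield (antecedent, consequent, support, confidence) for 2-itemsets."""
            for itemset, count in self.counts.items():
                if len(itemset) != 2 or count / self.n_events < min_support:
                    continue
                for a, b in ((itemset[0], itemset[1]), (itemset[1], itemset[0])):
                    conf = count / self.counts[(a,)]
                    if conf >= min_confidence:
                        yield a, b, count / self.n_events, conf

    # Example: events logged by a (hypothetical) data integration environment.
    miner = IncrementalRuleMiner()
    miner.add_batch([{"source=web", "change=content"},
                     {"source=web", "change=content"},
                     {"source=db", "change=schema"}])
    print(list(miner.rules(min_support=0.3, min_confidence=0.9)))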
