141

Framework to Evaluate Entropy Based Data Fusion Methods in Supply Chain Management

Tran, Huong Thi 12 1900 (has links)
This dissertation explores data fusion methodology to deduce an overall inference from data gathered from multiple heterogeneous sources. Typically, if there existed a data source in which the data were reliable and unbiased, then data fusion would not be necessary. Data fusion methodology combines data from multiple diverse sources so that the desired information - such as the population mean - is improved despite redundancies, inaccuracies, biases, and inflated variability in the data. Examples of data fusion include estimating average demand from similar sources and integrating fatality counts from different media sources after a catastrophe. The approach in this study combines "inputs" from distinct sources so that the information is "fused." Another way of describing this process is "data integration." Important assumptions are: 1. Several sources provide "inputs" for information used to estimate parameters of a probability distribution. 2. Since the distributions of the data from the sources are heterogeneous, some sources are less reliable. 3. Distortions, bias, censorship, and systematic errors may be more prominent in data from certain sources. 4. The sample size of a source's data, i.e., the number of "inputs," may be very small. Examples of information from multiple sources are abundant: traffic information from sensors at intersections, multiple economic indicators from various sources, demand data for a product using similar retail stores as sources, polling data from various sources, and counts of fatalities from different media sources after a catastrophic event. This dissertation seeks to address a gap in the operations literature by addressing three research questions regarding entropy-based data fusion (EBDF) approaches to estimation. Three separate, but unifying, essays address the research questions for this dissertation. Essay 1 provides an overview of supporting literature for the research questions. A numerical analysis of airline maximum wait time data illustrates the underlying issues involved in EBDF methods. This essay addresses the research question: Why consider alternative entropy-based weighting methods? Essay 2 introduces 13 data fusion methods. A Monte Carlo simulation study examines the performance of these methods in estimating the mean parameter of a population with either a normal or lognormal distribution. This essay addresses the following research questions: 1. Can an alternative formulation of Shannon's entropy enhance the performance of Sheu's (2010) data fusion approach? 2. Do symmetric and skewed distributions affect the 13 data fusion methods differently? 3. Do negative and positive biases affect the performance of the 13 methods differently? 4. Do entropy-based data fusion methods outperform non-entropy-based data fusion methods? 5. Which data fusion methods are recommended for symmetric and skewed data sets when no bias is present, and what is the recommendation when only a few data sources are available? Essay 3 explores the use of the data fusion methods' estimates of the population mean in a newsvendor problem. A Monte Carlo simulation study investigates the accuracy of using the estimates from Essay 2 as the parameter estimate for a demand distribution that follows an exponential distribution. This essay addresses the following research questions: 1. Do data fusion methods with relatively strong performance in estimating the population mean also perform relatively well in estimating the optimal order quantity under a given ratio of overage and underage costs? 2. Do any of the data fusion methods deteriorate or improve with the introduction of positive and negative bias? 3. Do the alternative formulations of Shannon's entropy enhance the performance of the methods on a relative basis? 4. Does the relative rank ordering of the methods' performance differ between Essay 2 and Essay 3? The contribution of this research is to introduce alternative EBDF methods and to establish a framework for using EBDF methods in supply chain decision making. A comparative Monte Carlo simulation study provides a basis for investigating the robustness of the proposed data fusion methods for estimating population parameters in a newsvendor problem with a known distribution but an unknown parameter. A sensitivity analysis is conducted to determine the effect of multiple sources, sample size, and distribution.
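The two building blocks mentioned above, entropy-weighted estimation of a population mean and the newsvendor decision under exponential demand, can be sketched in a few lines of Python. The sketch is purely illustrative: the inverse-entropy weighting heuristic, the histogram binning, and the cost values are assumptions, not the specific EBDF formulations evaluated in the dissertation.

```python
# Hypothetical sketch: entropy-weighted fusion of source means, then a newsvendor
# order quantity under exponentially distributed demand. The weighting scheme is a
# generic Shannon-entropy heuristic, not the dissertation's actual methods.
import numpy as np

def entropy_weights(samples_by_source, bins=10):
    """Weight each source inversely to the Shannon entropy of its sample histogram."""
    entropies = []
    for s in samples_by_source:
        counts, _ = np.histogram(s, bins=bins)
        p = counts / counts.sum()
        p = p[p > 0]
        entropies.append(-(p * np.log(p)).sum())
    inv = 1.0 / (np.array(entropies) + 1e-9)   # lower entropy -> higher weight
    return inv / inv.sum()

def fused_mean(samples_by_source):
    w = entropy_weights(samples_by_source)
    return float(np.dot(w, [np.mean(s) for s in samples_by_source]))

def newsvendor_quantity(mean_demand, overage, underage):
    """Optimal order quantity for Exponential(mean_demand) demand at the critical fractile."""
    cf = underage / (underage + overage)       # critical fractile
    return -mean_demand * np.log(1.0 - cf)     # exponential quantile function

rng = np.random.default_rng(0)
sources = [rng.exponential(50, size=8), rng.exponential(55, size=5), rng.exponential(70, size=6)]
mu_hat = fused_mean(sources)
print(mu_hat, newsvendor_quantity(mu_hat, overage=1.0, underage=3.0))
```

The critical fractile cu/(cu + co) fixes which quantile of the fitted demand distribution to order, so any error in the fused mean estimate propagates directly into the resulting order quantity.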
142

Nástroj pro 3D rekonstrukci z dat z více typů senzorů / Scalable Multisensor 3D Reconstruction Framework

Šolony, Marek January 2017 (has links)
Realistic 3D models of environments are useful in many fields, from the inspection of natural structures or buildings, robot navigation, and map building to the film industry, where they support scene surveying or the integration of special effects. When capturing such a scene, it is common to use several types of sensors, such as monocular, stereoscopic, or spherical cameras, or 360° laser scanners, to achieve large scene coverage. The advantage of laser scanners and spherical cameras lies precisely in capturing the entire surroundings in a single, complete image. Conventional monocular cameras, by contrast, make it easy to cover occluded parts of the scene or to capture details. The 3D reconstruction process consists of three steps: data acquisition, data processing and registration, and refinement of the reconstruction. The contribution of this dissertation is a detailed analysis of image registration methods for spherical and planar cameras, and the implementation of a unified system of sensors and measurements for 3D reconstruction that allows reconstruction from all available data. The main advantage of the proposed unified representation is that it allows all sensor poses and scene points to be optimized jointly by applying non-linear optimization methods, which yields better reconstruction accuracy without significantly increasing the computational demands.
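As a rough illustration of the joint optimization the abstract refers to, the sketch below refines camera poses and scene points together by minimizing reprojection error with a non-linear least-squares solver. It assumes a simple pinhole model with a fixed focal length and a rotation-vector pose parameterization; the thesis's unified multisensor representation (spherical cameras, laser scans) is not reproduced here.

```python
# Illustrative bundle-adjustment-style refinement: poses and 3D points are stacked
# into one parameter vector and optimized jointly against reprojection error.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def residuals(params, n_cams, n_pts, observations, f=800.0):
    poses = params[:n_cams * 6].reshape(n_cams, 6)     # per camera: rotation vector + translation
    points = params[n_cams * 6:].reshape(n_pts, 3)
    res = []
    for cam_i, pt_i, uv in observations:               # uv: observed pixel coordinates
        R = Rotation.from_rotvec(poses[cam_i, :3]).as_matrix()
        p_cam = R @ points[pt_i] + poses[cam_i, 3:]
        res.extend(f * p_cam[:2] / p_cam[2] - uv)      # pinhole projection error
    return np.array(res)

# Tiny synthetic setup: two cameras observing four points; slightly perturbed
# initial poses and points are refined jointly.
rng = np.random.default_rng(1)
n_cams, n_pts = 2, 4
true_points = rng.uniform(-1, 1, (n_pts, 3)) + np.array([0.0, 0.0, 5.0])
true_poses = np.zeros((n_cams, 6))
true_poses[1, 3] = 0.5                                 # second camera shifted along x
obs = []
for c in range(n_cams):
    R = Rotation.from_rotvec(true_poses[c, :3]).as_matrix()
    for p in range(n_pts):
        p_cam = R @ true_points[p] + true_poses[c, 3:]
        obs.append((c, p, 800.0 * p_cam[:2] / p_cam[2]))
x0 = np.concatenate([true_poses.ravel() + 0.01, true_points.ravel() + 0.05])
sol = least_squares(residuals, x0, args=(n_cams, n_pts, obs))
print("final reprojection RMS:", np.sqrt(np.mean(sol.fun ** 2)))
```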
143

Development of distributed control system for SSL soccer robots

Holtzhausen, David Schalk 03 1900 (has links)
Thesis (MScEng)--Stellenbosch University, 2013. / ENGLISH ABSTRACT: This thesis describes the development of a distributed control system for SSL soccer robots. The project builds on earlier work to develop a robotics research platform at Stellenbosch University. The wireless communication system is implemented using Player middleware. This enables high-level programming of the robot drivers and communication clients, resulting in an easily modifiable system. The system is developed to be used as either a centralised or a decentralised control system. The software of the robot's motor controller unit is updated to ensure optimal movement. Slippage of the robot's wheels restricts the robot's movement capabilities. Trajectory tracking software is developed to ensure that the robot follows the desired trajectory while operating within its physical limits. The distributed control architecture reduces the robots' dependency on the wireless network and the off-field computer. The robots are given some autonomy by integrating navigation and control on the robot itself. Kalman filters are designed to estimate the robots' translational and rotational velocities. The Kalman filters fuse vision data from an overhead vision system with inertial measurements from an on-board IMU. This ensures reliable and accurate position, orientation, and velocity information on the robot. Test results show an improvement in controller performance as a result of the proposed system.
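A minimal sketch of the vision/IMU fusion idea for a single axis of motion: a Kalman filter that uses the IMU acceleration in the prediction step and the overhead-vision position in the correction step. The matrices, noise levels, and the one-dimensional simplification are assumptions for illustration, not the filter actually designed in the thesis.

```python
# Illustrative one-axis Kalman filter: IMU acceleration drives the prediction,
# the overhead vision system supplies the position measurement.
import numpy as np

dt = 1.0 / 60.0                                # vision frame period (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])          # state transition for [pos, vel]
B = np.array([0.5 * dt**2, dt])                # control (IMU acceleration) input
H = np.array([[1.0, 0.0]])                     # vision measures position only
Q = np.diag([1e-4, 1e-3])                      # process noise (assumed)
R = np.array([[1e-3]])                         # vision measurement noise (assumed)

def kf_step(x, P, accel_imu, z_vision):
    # Predict using the IMU acceleration as a control input.
    x = F @ x + B * accel_imu
    P = F @ P @ F.T + Q
    # Correct with the overhead-vision position measurement.
    y = z_vision - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = np.zeros(2), np.eye(2)
x, P = kf_step(x, P, accel_imu=0.2, z_vision=np.array([0.01]))
print(x)
```

Between vision updates such a filter can keep predicting from inertial data alone, which is one way a robot becomes less dependent on the wireless link to the off-field vision computer.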
144

Fusion distribuée de données échangées dans un réseau de véhicules / Distributed data fusion in VANETS

El Zoghby, Nicole 19 February 2014 (has links)
This thesis focuses on fusion techniques for distributed and uncertain data in a vehicular network, with the goal of managing the confidence in other vehicles or in received data. The proposed distributed fusion algorithm is based on belief functions and is applied by each node when it receives messages. It relies on the management of direct knowledge, local to each node, and of distributed knowledge broadcast over the network. The distributed knowledge results from fusing messages with a suitable operator that takes possible cycles into account and limits the effect of "data incest". Each node can estimate confidence autonomously, but cooperation between vehicles improves this estimation and makes it more robust. The algorithm can be adapted to the case under study by considering one or more elements of observation and by taking data obsolescence into account. When there are several elements of observation, data association is necessary before the combination step. A new association algorithm was formalized in the framework of belief functions. It has been shown that this problem is equivalent to a linear assignment problem, which can be solved in polynomial time. This solution is both optimal and much more efficient than other approaches developed in this formalism. Confidence management for the nodes and for the received data is illustrated through two applications: the detection of false nodes in a Sybil attack, and distributed dynamic maps for enhanced perception.
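The association step described above can be illustrated as a linear assignment problem. In the sketch below the cost matrix is made up; in the thesis it would come from a belief-function dissimilarity between local and received object descriptions. The Hungarian-style solver runs in polynomial time.

```python
# Sketch: cast data association as a linear assignment problem. cost[i, j] would be
# a dissimilarity (e.g. pairwise conflict) between local object i and received
# object j; the values below are assumptions for illustration only.
import numpy as np
from scipy.optimize import linear_sum_assignment

cost = np.array([[0.10, 0.80, 0.65],
                 [0.75, 0.15, 0.70],
                 [0.60, 0.85, 0.20]])

rows, cols = linear_sum_assignment(cost)      # optimal one-to-one association
for i, j in zip(rows, cols):
    print(f"local object {i} <-> received object {j} (cost {cost[i, j]:.2f})")
```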
145

Modelo para apoio à decisão no processo de classificação de unidades móveis no cenário marítimo / Decision support model for the process of classifying mobile units in the maritime scenario.

Cleber Almeida de Oliveira 20 November 2009 (has links)
The ability to conduct operations at sea is fundamentally determined by the organization, comprehension, and projection of all the important factors that influence situational awareness, such as the disposition of allied and neutral forces and non-combatant entities, the geographic, oceanographic, and meteorological characteristics of the area of interest and, of course, the disposition of the enemy. In the naval command and control (C2) system, the process of organizing, comprehending, and projecting these factors is known as compiling the tactical picture. Compiling a reliable tactical picture in a dense maritime traffic environment requires data fusion technology to integrate information provided by multiple sources, so as to improve the decision maker's perception of the situation. This work proposes and implements a decision support model for the data fusion process with the aim of increasing the capability to compile the maritime tactical picture. The model incorporates the cognitive aspects of the experts, the doctrinal aspects, and the objectives considered when a mission is defined. The implementation of the decision support model is structured in two parts. The first part proposes an approach that combines gating and fuzzy logic techniques for the data association problem, considering multiple sensors and multiple targets. The second part proposes the use of Fuzzy Decision Tables (FDT) as a structure for building the rules of an expert system. This approach aggregates the heuristics employed by the experts and the rules of engagement to support the classification and identification of surface targets and to suggest actions to be taken by the decision maker. The operational benefits obtained were significant. Human interference in the data integration process was reduced, minimizing the dependence on the sensor operator's knowledge, training, and experience to compile the tactical picture. The observation and orientation phases of the model were improved by minimizing data redundancy and optimizing data processing, so that the decision maker views the scenario in a single layer of information, improving situational awareness. The work also contributes to the decision and action phases of the model by assigning a degree of confidence to the contact's identity declaration and by suggesting actions to be taken by the decision maker.
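A small sketch of the gating step in the first part of the model: a sensor report is considered for association with a track only if its Mahalanobis distance from the predicted track state falls inside a chi-square gate. The threshold, covariance, and measurement values are assumptions; the fuzzy-logic scoring of candidates inside the gate and the Fuzzy Decision Tables are not shown.

```python
# Illustrative chi-square gating for multisensor, multitarget data association.
import numpy as np
from scipy.stats import chi2

def in_gate(z, z_pred, S, prob=0.99):
    d = z - z_pred
    d2 = d @ np.linalg.inv(S) @ d              # squared Mahalanobis distance
    return d2 <= chi2.ppf(prob, df=len(z))

S = np.diag([4.0, 4.0])                        # innovation covariance (assumed)
print(in_gate(np.array([101.0, 49.0]), np.array([100.0, 50.0]), S))   # inside gate
print(in_gate(np.array([120.0, 30.0]), np.array([100.0, 50.0]), S))   # outside gate
```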
146

Exploitation du conflit entre capteurs pour la gestion d'un système complexe multi-capteurs / Exploitation of conflict between sensors for the management of a complex multisensor system

Roquel, Arnaud 12 December 2012 (has links)
Complex systems now integrate many physical or logical sensors in order to take the best decision given the exosystem and the endosystem. These sensors are data sources that deliver partial information, imprecise and/or uncertain, partially complementary and partially redundant. The theory of belief functions is now widely used in data fusion because it provides a formal framework for representing both imprecise and uncertain information. However, even when the ignorance and imprecision of the sources are modelled, combining the sources usually reveals some disagreement (conflict) between them. A disagreement between sources makes the system unstable and can affect the decision. To manage this disagreement, several authors have developed combination rules in which the Dempster conflict is transferred to a set of elements. A few works have proposed to consider the conflict as a piece of information exploitable beyond the scope of the combination. In this work, we aim at decomposing the Dempster conflict in order to interpret it better. We propose a decomposition with respect to the different simple or compound hypotheses of the discernment space. We show the uniqueness of this decomposition and specify the algorithm, based on the canonical decomposition of the belief function. We then interpret each term of the decomposition as the contribution to the global conflict brought by each simple or compound hypothesis. This decomposition applies to the analysis of intra-source conflict (the conflict inherent in a source) or inter-source conflict (the conflict appearing during the fusion of sources). We illustrate on toy examples how observing the distribution of conflict over the different hypotheses can allow the origin of some conflicts to be identified. Three applications of our measure are then developed to illustrate its usefulness. The first application deals with the preventive detection of a fall for a motorbike. Typical data sources are the speed and the accelerations measured on each of the two wheels. A conflict between these measures, supposed to be highly redundant or even correlated, should be interpreted as an incipient fall (sliding, shock). We show that the decomposition of the conflict provides a finer and earlier fall indicator than the Dempster conflict. The second application is vehicle localization, a key issue for autonomous exploration vehicles such as service robots. The sources are the outputs of algorithms estimating the movement of the vehicle (such as odometry, visual odometry, FastSLAM). We first show that dynamically estimating the reliability of the sources improves the fusion. We then show that the decomposition of the conflict allows a more refined measure of the fusion reliability than the Dempster conflict. When a conflict is detected, the reliability of each source is estimated by verifying (or not) an assumption of temporal regularity, a verification itself based on a distance measure local to the simple or compound hypotheses. The third application is the generalization of the hybrid combination of Dubois and Prade (1988) to the case of N sources. Our measure computes the partial conflict associated with each subset of hypotheses; following the principle of the hybrid combination rule, we redistribute the mass associated with a partial conflict to the disjunction of the hypotheses involved in it. In this redistribution, our decomposition of the conflict is essential, since it uniquely identifies the various subsets of hypotheses involving partial conflicts. In conclusion, this work has shown that the information derived from the conflict measure, and from its decomposition, can (and should) be considered information in its own right, particularly for managing the sources and the beliefs to be combined.
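For reference, the quantity being decomposed is the Dempster (conjunctive) conflict: the total mass that the conjunctive combination of two mass functions assigns to the empty set. The frame and mass values in this sketch are made up; the decomposition itself is not shown.

```python
# Minimal sketch: Dempster's conflict between two belief-function mass assignments,
# i.e. the mass sent to the empty set by the conjunctive combination.
def dempster_conflict(m1, m2):
    """m1, m2: dicts mapping frozenset focal elements to masses."""
    return sum(v1 * v2
               for A, v1 in m1.items()
               for B, v2 in m2.items()
               if not (A & B))                 # disjoint focal elements conflict

frame = {"a", "b", "c"}
m1 = {frozenset({"a"}): 0.6, frozenset({"a", "b"}): 0.3, frozenset(frame): 0.1}
m2 = {frozenset({"b"}): 0.5, frozenset({"c"}): 0.3, frozenset(frame): 0.2}
print(dempster_conflict(m1, m2))               # 0.57 for these example masses
```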
147

Image registration and super-resolution mosaicing

Ye, Getian, Information Technology & Electrical Engineering, Australian Defence Force Academy, UNSW January 2005 (has links)
This thesis presents new approaches to image registration and super-resolution mosaicing, as well as their applications. Firstly, a feature-based image registration method is proposed for a multisensor surveillance system that consists of an optical camera and an infrared camera. By integrating a non-rigid object tracking technique into this method, a novel approach to simultaneous object tracking and multisensor image registration is proposed. Based on the registration and fusion of multisensor information, automatic face detection is greatly improved. Secondly, extensions of a gradient-based image registration method, the inverse compositional algorithm, are proposed. These extensions include cumulative multi-image registration and the incorporation of illumination change and lens distortion correction. They are incorporated into the framework of the original algorithm in a consistent manner, and efficiency is still achieved for multi-image registration with illumination and lens distortion correction. Thirdly, new super-resolution mosaicing algorithms are proposed for multiple uncompressed and compressed images. Considering the process of image formation, observation models are introduced to describe the relationship between the super-resolution mosaic image and the uncompressed and compressed low-resolution images. To improve the performance of super-resolution mosaicing, a wavelet-based image interpolation technique and an approach to adaptive determination of the regularization parameter are presented. For compressed images, a spatial-domain algorithm and a transform-domain algorithm are proposed. All the proposed super-resolution mosaicing algorithms are robust against outliers. They can produce super-resolution mosaics and reconstructed super-resolution images with improved subjective quality. Finally, new techniques for super-resolution sprite generation and super-resolution sprite coding are proposed. Considering both short-term and long-term motion influences, an object-based image registration method is proposed for handling long image sequences. In order to remove the influence of outliers, a robust technique for super-resolution sprite generation is presented. This technique produces sprite images and reconstructed super-resolution images with high visual quality. Moreover, it provides better reconstructed low-resolution images compared with low-resolution sprite generation techniques. Due to the advantages of the super-resolution sprite, a super-resolution sprite coding technique is also proposed. It achieves high coding efficiency, especially at low bit-rates, and produces both decoded low-resolution and super-resolution images with improved subjective quality. Throughout this work, the performance of all the proposed algorithms is evaluated using both synthetic and real image sequences.
148

Resource management for data streaming applications

Agarwalla, Bikash Kumar 07 July 2010 (has links)
This dissertation investigates novel middleware mechanisms for building streaming applications. Developing streaming applications is a challenging task because (i) they are continuous in nature; (ii) they require fusion of data coming from multiple sources to derive higher level information; (iii) they require efficient transport of data from/to distributed sources and sinks; (iv) they need access to heterogeneous resources spanning sensor networks and high performance computing; and (v) they are time critical in nature. My thesis is that an intuitive programming abstraction will make it easier to build dynamic, distributed, and ubiquitous data streaming applications. Moreover, such an abstraction will enable an efficient allocation of shared and heterogeneous computational resources thereby making it easier for domain experts to build these applications. In support of the thesis, I present a novel programming abstraction, called DFuse, that makes it easier to develop these applications. A domain expert only needs to specify the input and output connections to fusion channels, and the fusion functions. The subsystems developed in this dissertation take care of instantiating the application, allocating resources for the application (via the scheduling heuristic developed in this dissertation) and dynamically managing the resources (via the dynamic scheduling algorithm presented in this dissertation). Through extensive performance evaluation, I demonstrate that the resources are allocated efficiently to optimize the throughput and latency constraints of an application.
149

Bayesian 3D multiple people tracking using multiple indoor cameras and microphones

Lee, Yeongseon. January 2009 (has links)
Thesis (Ph.D.)--Electrical and Computer Engineering, Georgia Institute of Technology, 2009. / Committee Chair: Russell M. Mersereau; Committee Member: Biing Hwang (Fred) Juang; Committee Member: Christopher E. Heil; Committee Member: George Vachtsevanos; Committee Member: James H. McClellan. Part of the SMARTech Electronic Thesis and Dissertation Collection.
150

Infrastructure mediated sensing

Patel, Shwetak Naran 08 July 2008 (has links)
Ubiquitous computing application developers have limited options for a practical activity and location sensing technology that is easy to deploy and cost-effective. In this dissertation, I have developed a class of activity monitoring systems called infrastructure mediated sensing (IMS), which provides a whole-house solution for sensing activity and the location of people and objects. Infrastructure mediated sensing leverages existing home infrastructure (e.g., electrical systems, air conditioning systems, etc.) to mediate the transduction of events. In these systems, infrastructure activity is used as a proxy for a human activity involving the infrastructure. A primary goal of this type of system is to reduce economic, aesthetic, installation, and maintenance barriers to adoption by reducing the cost and complexity of deploying and maintaining the activity sensing hardware. I discuss the design, development, and applications of various IMS-based activity and location sensing technologies that leverage the following existing infrastructures: wireless Bluetooth signals, power lines, and central heating, ventilation, and air conditioning (HVAC) systems. In addition, I show how these technologies facilitate automatic and unobtrusive sensing and data collection for researchers or application developers interested in conducting large-scale in-situ location-based studies in the home.
