11

Performance Optimization Of Monopulse Tracking Radar

Sahin, Mehmet Alper 01 August 2004
An analysis and simulation tool is developed for optimizing the system parameters of a monopulse target tracking radar and for observing how those parameters affect system performance over different scenarios. A monopulse tracking radar is modeled to measure the performance of the radar for a given parameter set. The radar model simulates the operation of a Class IA monopulse automatic tracking radar that uses a planar phased array. The interacting multiple model (IMM) estimator with the Probabilistic Data Association (PDA) technique is used as the tracking filter. In addition to the tracking radar model, an optimization tool is developed to optimize the system parameters of this model. The optimization tool implements a Genetic Algorithm (GA) from the GA Toolbox distributed by the Department of Automatic Control and Systems Engineering at the University of Sheffield. The thesis presents optimization results for several scenarios and draws conclusions on the effects of the tracking filter parameters, beamwidth, and dwell interval on the confirmed track.
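To make the optimization loop concrete, the sketch below shows a minimal genetic algorithm searching over two radar parameters (beamwidth and dwell interval) against a stand-in cost function. The parameter bounds, cost model, and GA settings are illustrative assumptions only; the thesis itself uses the Sheffield GA Toolbox and a full radar simulation as the fitness evaluation.

```python
import random

# Hypothetical parameter bounds (degrees, seconds); the thesis's actual
# ranges and cost model are not reproduced here.
BOUNDS = {"beamwidth_deg": (1.0, 5.0), "dwell_s": (0.01, 0.2)}

def track_cost(beamwidth_deg, dwell_s):
    """Placeholder for a tracking-simulation run returning a cost
    (e.g. RMS tracking error plus a penalty on the radar time budget)."""
    return (beamwidth_deg - 2.0) ** 2 + 50.0 * (dwell_s - 0.05) ** 2

def random_individual():
    return [random.uniform(*BOUNDS["beamwidth_deg"]),
            random.uniform(*BOUNDS["dwell_s"])]

def mutate(ind, rate=0.2):
    bounds = list(BOUNDS.values())
    return [random.uniform(*bounds[i]) if random.random() < rate else gene
            for i, gene in enumerate(ind)]

def crossover(a, b):
    # Uniform crossover: each gene is taken from either parent.
    return [random.choice(pair) for pair in zip(a, b)]

def optimize(pop_size=30, generations=50):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: track_cost(*ind))
        elite = pop[: pop_size // 2]          # truncation selection
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=lambda ind: track_cost(*ind))

print(optimize())   # best [beamwidth_deg, dwell_s] found for the toy cost
```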
12

A framework for processing correlated probabilistic data

van Schaik, Sebastiaan Johannes January 2014
The amount of digitally-born data has surged in recent years. In many scenarios, this data is inherently uncertain (or: probabilistic), such as data originating from sensor networks, image and voice recognition, location detection, and automated web data extraction. Probabilistic data requires novel and different approaches to data mining and analysis, which explicitly account for the uncertainty and the correlations therein. This thesis introduces ENFrame: a framework for processing and mining correlated probabilistic data. Using this framework, it is possible to express both traditional and novel algorithms for data analysis in a special user language, without having to explicitly address the uncertainty of the data on which the algorithms operate. The framework will subsequently execute the algorithm on the probabilistic input, and perform exact or approximate parallel probability computation. During the probability computation, correlations and provenance are succinctly encoded using probabilistic events. This thesis contains novel contributions in several directions. An expressive user language – a subset of Python – is introduced, which allows a programmer to implement algorithms for probabilistic data without requiring knowledge of the underlying probabilistic model. Furthermore, an event language is presented, which is used for the probabilistic interpretation of the user program. The event language can succinctly encode arbitrary correlations using events, which are the probabilistic counterparts of deterministic user program variables. These highly interconnected events are stored in an event network, a probabilistic interpretation of the original user program. Multiple techniques for exact and approximate probability computation (with error guarantees) of such event networks are presented, as well as techniques for parallel computation. Adaptations of multiple existing data mining algorithms are shown to work in the framework, and are subsequently subjected to an extensive experimental evaluation. Additionally, a use-case is presented in which a probabilistic adaptation of a clustering algorithm is used to predict faults in energy distribution networks. Lastly, this thesis presents techniques for integrating a number of different probabilistic data formalisms for use in this framework and in other applications.
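As a rough illustration of the exact probability computation described above, the sketch below treats user-program variables as Boolean events over independent input tuples and computes the probability of a derived event by summing over possible worlds; correlations arise because derived events share inputs. The event names, the tiny network, and the brute-force enumeration are illustrative assumptions and do not reproduce ENFrame's event language or its approximate algorithms.

```python
from itertools import product

# Independent input events with their marginal probabilities (illustrative).
inputs = {"x1": 0.7, "x2": 0.4, "x3": 0.9}

# Derived events are Boolean expressions over the inputs; correlations arise
# because derived events share inputs (both depend on x1 here).
def in_cluster_a(world): return world["x1"] and world["x2"]
def in_cluster_b(world): return world["x1"] and not world["x3"]

def prob(event):
    """Exact probability by summing the weights of all possible worlds."""
    total = 0.0
    names = list(inputs)
    for bits in product([True, False], repeat=len(names)):
        world = dict(zip(names, bits))
        weight = 1.0
        for name, value in world.items():
            weight *= inputs[name] if value else 1.0 - inputs[name]
        if event(world):
            total += weight
    return total

print(prob(in_cluster_a))                                    # 0.28
print(prob(lambda w: in_cluster_a(w) or in_cluster_b(w)))    # correlation handled via shared x1
```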
13

Sledování pohybu objektů v obrazovém signálu / Tracking the movement of objects in the video signal

Šidó, Balázs January 2017
This master's thesis focuses on tracking the motion of multiple objects. It describes two filter implementations, both essentially based on the principle of the Kalman filter. Both implementations address multi-object tracking based on knowledge of the positions of all objects in each frame. The first implementation is a combined version of the Global and Standard Nearest Neighbour filters. The second implementation is built on a probabilistic approach to the association process. The final chapter provides a comparison between these filters and the basic filter. The algorithms were implemented in Java.
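A minimal sketch of the nearest-neighbour association step that both filter variants build on is shown below: each track keeps a constant-velocity prediction, and detections in the current frame are greedily assigned to the closest predicted position within a gate. The 2-D state, gate size, and greedy assignment are assumptions for illustration; the thesis's Java implementation and its probabilistic association variant are not reproduced here.

```python
import numpy as np

DT = 1.0                                     # frame interval (assumed)
F = np.array([[1, 0, DT, 0],                 # constant-velocity transition
              [0, 1, 0, DT],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],                  # only position is measured
              [0, 1, 0, 0]], dtype=float)

def predict(states):
    """Propagate every track's state [x, y, vx, vy] one frame ahead."""
    return [F @ s for s in states]

def associate(predictions, detections, gate=30.0):
    """Greedy nearest-neighbour assignment of detections to predicted tracks."""
    pairs, used = [], set()
    for i, s in enumerate(predictions):
        pos = H @ s
        dists = [np.linalg.norm(pos - d) if j not in used else np.inf
                 for j, d in enumerate(detections)]
        j = int(np.argmin(dists))
        if dists[j] < gate:                  # gating rejects implausible matches
            pairs.append((i, j))
            used.add(j)
    return pairs

tracks = [np.array([0.0, 0.0, 1.0, 0.0]), np.array([50.0, 50.0, 0.0, 1.0])]
detections = [np.array([1.2, 0.1]), np.array([49.8, 51.3])]
print(associate(predict(tracks), detections))   # [(0, 0), (1, 1)]
```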
14

In Situ Summarization and Visual Exploration of Large-scale Simulation Data Sets

Dutta, Soumya 17 September 2018
No description available.
15

Agrégation de ressources avec contrainte de distance : applications aux plateformes de grande échelle / Resource clustering with distance constraint : applications to large scale platforms

Larchevêque, Hubert 27 September 2010
During this Ph.D. we introduced Bin Covering under Distance Constraint (BCCD) and Bin Packing under Distance Constraint (BPCD). These two problems find their applications in the context of large-scale networks such as the Internet. We studied these problems in general metric spaces and proved that resource augmentation is mandatory in that setting: it allows algorithms to build solutions that are less constrained than the optimal solution to which they are compared. Using this relaxation we obtained interesting approximation algorithms, and we proved the hardness of these problems when it is not used. Many tools, however, aim to embed the large networks we are interested in into well-described metric spaces. We therefore also studied these problems in several specific metric spaces, in particular those generated by some of these tools, such as Vivaldi and Sequoia.
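As a rough illustration of the packing variant, the sketch below implements a first-fit heuristic in which an item joins an existing bin only if the capacity is respected and it stays within a pairwise distance bound of every item already in that bin. The Euclidean metric, capacity, and distance bound are illustrative assumptions; the approximation guarantees and resource-augmentation analysis of the thesis are not captured by this heuristic.

```python
import math

def dist(a, b):
    """Euclidean distance; the thesis works in general metric spaces."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def pack_with_distance(items, capacity, max_dist):
    """First-fit heuristic for Bin Packing under a pairwise distance constraint.
    items: list of (point, weight). Returns a list of bins (lists of items)."""
    bins = []
    for point, weight in items:
        placed = False
        for b in bins:
            load = sum(w for _, w in b)
            if load + weight <= capacity and all(dist(point, p) <= max_dist for p, _ in b):
                b.append((point, weight))
                placed = True
                break
        if not placed:
            bins.append([(point, weight)])
    return bins

items = [((0, 0), 3), ((1, 1), 4), ((10, 10), 2), ((0.5, 0.2), 5)]
print(len(pack_with_distance(items, capacity=10, max_dist=3.0)))   # 3 bins for this toy input
```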
16

Tracker-aware Detection: A Theoretical And An Experimental Study

Aslan, Murat Samil 01 February 2009
A promising line of research attempts to bridge the gap between the detector and the tracker by considering jointly optimal parameter settings for both subsystems. Along this path, this thesis focuses on detection threshold optimization in a tracker-aware manner, establishing feedback from the tracker to the detector to maximize overall system performance. Special emphasis is given to optimization schemes based on two non-simulation performance prediction (NSPP) methodologies for the probabilistic data association filter (PDAF), namely the modified Riccati equation (MRE) and the hybrid conditional averaging (HYCA) algorithm. The possible improvements are presented in two domains: non-maneuvering and maneuvering target tracking. In the first domain, a number of algorithmic and experimental evaluation gaps are identified, and newly proposed methods are compared with existing ones in a unified theoretical and experimental framework. Furthermore, a closed-form solution is proposed for the MRE-based dynamic threshold optimization problem. This solution yields a theoretical lower bound on the operating signal-to-noise ratio (SNR) below which the tracking system should be switched to track-before-detect (TBD) mode. In the second domain, some of the ideas from the first domain are extended to the maneuvering target tracking case. The primary contribution is the extension of the dynamic optimization schemes applicable to the PDAF to the interacting multiple model probabilistic data association filter (IMM-PDAF). By establishing online feedback from the filter to the detector, this extension makes the tracking system robust against track losses at low SNR values.
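A simplified sketch of the modified Riccati recursion is given below for a one-dimensional constant-velocity model: the covariance reduction from a measurement update is weighted by the detection probability, so the steady-state covariance grows as Pd drops. This is the clutter-free limit only; the clutter-dependent information-reduction factor, the threshold-to-Pd mapping, and the HYCA algorithm used in the thesis are omitted.

```python
import numpy as np

# 1-D position/velocity model (assumed for illustration).
DT, q, r = 1.0, 0.1, 1.0
F = np.array([[1.0, DT], [0.0, 1.0]])
Q = q * np.array([[DT**3 / 3, DT**2 / 2], [DT**2 / 2, DT]])
H = np.array([[1.0, 0.0]])
R = np.array([[r]])

def mre_steady_state(pd, iterations=500):
    """Iterate the Pd-weighted Riccati recursion: the information gained from a
    measurement is scaled by the detection probability (clutter-free limit)."""
    P = np.eye(2)
    for _ in range(iterations):
        P = F @ P @ F.T + Q                        # time update
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        P = P - pd * (K @ S @ K.T)                 # pd-weighted measurement update
    return P

for pd in (1.0, 0.7, 0.4):
    print(pd, float(np.trace(mre_steady_state(pd))))   # covariance grows as pd drops
```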
17

On relations between classical and quantum theories of information and probability

Nyman, Peter January 2011
In this thesis we study quantum-like representation and the simulation of quantum algorithms on classical computers. The quantum-like representation algorithm (QLRA) was introduced by A. Khrennikov (1997) to solve the "inverse Born's rule problem", i.e. to construct a representation of probabilistic data, measured in any context of science, by a complex or more general probability amplitude that matches a generalization of Born's rule. The outcome of QLRA matches the formula of total probability with an additional trigonometric, hyperbolic or hyper-trigonometric interference term, and this is in fact a generalization of the familiar formula of interference of probabilities. We study the representation of statistical data (of any origin) by a probability amplitude in a complex algebra and in a Clifford algebra (the algebra of hyperbolic numbers). The statistical data are collected from measurements of two dichotomous and trichotomous observables, respectively. We see that only special statistical data (satisfying a number of nonlinear constraints) have a quantum-like representation. We also study simulations of quantum computers on classical computers. Although great progress has been made in quantum technologies, there is still a huge gap between experimental quantum computers and the realization of a quantum computer that can be used in applications. The simulation of quantum computations on classical computers has therefore become an important part of the attempt to bridge this gap between the theoretical mathematical formulation of quantum mechanics and the realization of quantum computers. Of course, it cannot be expected that quantum algorithms would help to solve NP problems in polynomial time on classical computers; however, this is not at all the aim of classical simulation. The second part of this thesis is devoted to the adaptation of the Mathematica symbolic language to known quantum algorithms and the corresponding simulations on classical computers. Concretely, we represent Simon's algorithm, the Deutsch-Jozsa algorithm, Shor's algorithm, Grover's algorithm and quantum error-correcting codes in the Mathematica symbolic language. We see that the same framework can be used for all these algorithms. This framework captures the characteristic properties of the symbolic-language representation of quantum computing, and it is a straightforward matter to include future algorithms in it.
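The thesis implements its simulations in the Mathematica symbolic language; as a rough Python analogue, the sketch below simulates the single-qubit Deutsch problem (the n = 1 case of the Deutsch-Jozsa algorithm) with explicit state vectors and a phase oracle, which is the kind of classical simulation the abstract refers to. The NumPy representation is an assumption for illustration and is not taken from the thesis.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)       # Hadamard gate

def phase_oracle(f):
    """Diagonal oracle U_f|x> = (-1)^f(x) |x> for f: {0,1} -> {0,1}."""
    return np.diag([(-1) ** f(0), (-1) ** f(1)]).astype(float)

def deutsch(f):
    """Deutsch's algorithm: one oracle call decides whether f is constant or balanced."""
    state = np.array([1.0, 0.0])                    # |0>
    state = H @ state                               # superposition of both inputs
    state = phase_oracle(f) @ state                 # single quantum query to f
    state = H @ state                               # interference
    prob_zero = abs(state[0]) ** 2                  # measure in the computational basis
    return "constant" if prob_zero > 0.5 else "balanced"

print(deutsch(lambda x: 0))      # constant function  -> "constant"
print(deutsch(lambda x: x))      # balanced function  -> "balanced"
```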
18

Cartographie dense basée sur une représentation compacte RGB-D dédiée à la navigation autonome / A compact RGB-D map representation dedicated to autonomous navigation

Gokhool, Tawsif Ahmad Hussein 05 June 2015
Our aim is to build ego-centric topometric maps, represented as a graph of keyframe nodes, that can be used efficiently by autonomous agents. Each keyframe node combines a spherical image with a depth map (an augmented visual sphere) and synthesises the information collected in a local area of space by an embedded acquisition system. The representation of the global environment consists of a collection of augmented visual spheres that provide the necessary coverage of an operational area. A pose graph linking these spheres together in six degrees of freedom also defines the domain that can be exploited for navigation tasks in real time. As part of this research, the following issues were considered: how to robustly perform visual odometry by making the most of both the photometric and geometric information available from the augmented spherical database; how to determine the number and optimal placement of these augmented spheres to cover an environment completely; how to model sensor uncertainties and update the dense information of the augmented spheres; and how to compactly represent the information contained in the augmented sphere to ensure robustness, accuracy and stability along an explored trajectory by making use of saliency maps.
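A compact sketch of the core operation behind the visual odometry described above: back-project a pixel using its depth, transform it by a candidate rigid pose, re-project it into the other frame, and compare intensities (photometric residual) and depths (geometric residual). The pinhole intrinsics and flat images used here are assumptions for illustration; the thesis works with augmented visual spheres, which are not modelled in this sketch.

```python
import numpy as np

K = np.array([[525.0, 0.0, 319.5],       # assumed pinhole intrinsics
              [0.0, 525.0, 239.5],
              [0.0, 0.0, 1.0]])

def backproject(u, v, depth):
    """Pixel coordinates plus depth -> 3-D point in the camera frame."""
    return depth * np.linalg.inv(K) @ np.array([u, v, 1.0])

def project(p):
    """3-D point -> pixel coordinates."""
    uvw = K @ p
    return uvw[:2] / uvw[2]

def residuals(ref_img, ref_depth, cur_img, cur_depth, R, t, u, v):
    """Photometric and geometric residuals for one pixel under pose (R, t)."""
    p_ref = backproject(u, v, ref_depth[v, u])
    p_cur = R @ p_ref + t                          # rigid transform into the current frame
    u2, v2 = np.round(project(p_cur)).astype(int)
    if not (0 <= v2 < cur_img.shape[0] and 0 <= u2 < cur_img.shape[1]):
        return None                                # point falls outside the image
    photometric = float(cur_img[v2, u2]) - float(ref_img[v, u])
    geometric = float(cur_depth[v2, u2]) - p_cur[2]
    return photometric, geometric

# Toy usage with synthetic images and the identity pose.
img = np.random.rand(480, 640).astype(np.float32)
depth = np.full((480, 640), 2.0, dtype=np.float32)
print(residuals(img, depth, img, depth, np.eye(3), np.zeros(3), 320, 240))   # ~ (0.0, 0.0)
```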
