About: The Global ETD Search service is a free service for researchers to find electronic theses and dissertations, provided by the Networked Digital Library of Theses and Dissertations (NDLTD). Our metadata is collected from universities around the world; if you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
11

Méthodes séquentielles de Monte Carlo pour le suivi d'objets multiples hétérogènes en données brutes de télémétrie laser / Sequential Monte Carlo methods for tracking heterogeneous multiple objects in raw data of laser telemetry

Vanpoperinghe, Élodie 27 January 2014 (has links)
This thesis addresses the problem of detecting and tracking multiple moving objects on the road using a scanning laser rangefinder (lidar). Work on obstacle detection and tracking from lidar data generally involves three main stages: detection, measurement association, and filtering. However, this processing chain can lose information and thereby cause missed detections or false alarms. Furthermore, the non-linear polar-to-Cartesian transformation applied to lidar measurements during the detection step does not preserve the statistical properties of the measurement noise. Another difficulty, arising from the spatially distributed nature of the lidar measurements of an object, is to associate each impact with a single vehicle while accounting for the temporal variability of the number of impacts. Only an approach that exploits the raw data can guarantee the optimality of the processing chain. This thesis explores a new joint detection and tracking approach that uses raw lidar data and eliminates any pre-detection step. The proposed approach relies, first, on sequential Monte Carlo methods, because of their ability to handle highly non-linear models, and, second, on an object model compatible with lidar perception. The method is validated with data from the SIVIC simulator under different experimental conditions, for the detection and tracking of heterogeneous objects with single-plane and then multi-plane lidar.
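The non-linearity of the polar-to-Cartesian conversion noted in this abstract can be made concrete with a first-order (Jacobian) noise propagation; the expressions below are the generic textbook relation, not the measurement model used in the thesis:

```latex
x = r\cos\theta, \qquad y = r\sin\theta, \qquad
J = \begin{pmatrix}\cos\theta & -r\sin\theta\\ \sin\theta & r\cos\theta\end{pmatrix}, \qquad
\Sigma_{xy} \approx J\,\Sigma_{r\theta}\,J^{\top}.
```

Because the Jacobian depends on r and θ, Gaussian noise on the raw range-bearing returns maps to a Cartesian covariance that varies from impact to impact, which is why a detection step performed after the conversion no longer preserves the noise statistics.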
12

Resampling in particle filters

Hol, Jeroen D. January 2004 (has links)
In this report a comparison is made between four frequently encountered resampling algorithms for particle filters. A theoretical framework is introduced to be able to understand and explain the differences between the resampling algorithms. This facilitates a comparison of the algorithms based on resampling quality and on computational complexity. Using extensive Monte Carlo simulations the theoretical results are verified. It is found that systematic resampling is favourable, both in resampling quality and computational complexity.
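For readers unfamiliar with the algorithm the report finds favourable, a minimal sketch of systematic resampling is shown below; this is the standard textbook form in Python/NumPy, not code taken from the report:

```python
import numpy as np

def systematic_resample(weights, rng=None):
    """Systematic resampling: one uniform draw, N evenly spaced pointers.

    weights : 1-D array of normalized particle weights (sums to 1).
    Returns the indices of the particles to keep.
    """
    rng = rng or np.random.default_rng()
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n   # one offset, n evenly spaced points
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0                            # guard against floating-point round-off
    return np.searchsorted(cumulative, positions)

# Example: four particles with unequal weights
w = np.array([0.1, 0.2, 0.3, 0.4])
print(systematic_resample(w))                       # e.g. [1 2 3 3]
```

The single random offset, shared by all pointers, is what keeps the variance of the resampled set low compared with drawing N independent uniforms as in multinomial resampling.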
13

Computer vision based sensors for chemical processes

Jampana, Phanindra varma 06 1900 (has links)
The research presented in this thesis describes instances in which computer vision based technology has delivered substantial productivity gains in the oil sands industry in Fort McMurray, Alberta, Canada. Specifically, the interface between bitumen froth (crude oil) and the middlings (sand) in separation cells during the extraction process is estimated in real time from camera video and used for automatic control of the interface level. Two original algorithms have been developed to solve this interface estimation problem, drawing on image analysis, estimation theory (particle filters), and probabilistic reasoning; these are discussed in chapters three and four. The first chapter reviews computer vision (automatic image analysis) as the knowledge basis for the current work, starting from the basics and progressing to frequently used advanced algorithms; the methods described there form the foundation of the subsequent chapters. The second chapter introduces particle filters, a family of Monte Carlo simulation based methods that are used to derive one of the main results of this thesis; a large part of the chapter is devoted to measure-theoretic probability, which is used in proving the convergence of particle filters. A further application of computer vision is developed in chapter five to treat the problem of automatic interface and boundary detection in X-ray view cell images. Such images are typically used to observe liquid-liquid and liquid-vapour phase behaviour of heavy oils such as bitumen in chemical equilibrium investigations, and the equilibrium data can then be used to enhance bitumen separation technologies. Manual tracking of the interfaces between these phases for different mixtures and conditions is time consuming when a large set of images has to be analysed; a novel algorithm based on state-of-the-art computer vision techniques automates the entire task. / Process Control
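As a purely illustrative baseline for interface-level estimation from camera images (not one of the algorithms developed in the thesis), a naive approach is to locate the image row where the mean brightness changes most sharply; NumPy is assumed and the synthetic image below is hypothetical:

```python
import numpy as np

def interface_row(gray_image):
    """Naive interface estimate: row with the steepest change in mean brightness.

    gray_image : 2-D array (rows x cols) of grayscale pixel values.
    Returns the row index separating the brighter phase from the darker phase.
    """
    profile = gray_image.mean(axis=1)      # average brightness per row
    gradient = np.abs(np.diff(profile))    # change between adjacent rows
    return int(np.argmax(gradient))        # steepest transition

# Synthetic image: bright froth on top, darker middlings below, plus camera noise
img = np.vstack([np.full((40, 64), 200.0), np.full((60, 64), 60.0)])
img += np.random.default_rng(0).normal(0, 5, img.shape)
print(interface_row(img))                  # -> 39 (the true boundary row)
```

Real separation-cell footage is far noisier and less contrasted than this, which is why the thesis turns to estimation theory and probabilistic reasoning rather than a single gradient test.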
14

A Bayesian Framework for Target Tracking using Acoustic and Image Measurements

Cevher, Volkan 18 January 2005 (has links)
Target tracking is a broad subject area extensively studied in many engineering disciplines. In this thesis, target tracking means the temporal estimation of target features such as the target's direction-of-arrival (DOA), the target's boundary pixels in a sequence of images, and/or the target's position in space. For multiple target tracking, we have introduced a new motion model that incorporates an acceleration component along the heading direction of the target. We have also shown that the target motion parameters can be considered part of a more general feature set for target tracking; e.g., target frequencies, which may be unrelated to the target motion, can be used to improve tracking performance. We have introduced an acoustic multiple-target tracker using a flexible observation model based on an image tracking approach, assuming that the DOA observations may be spurious and that some of the DOAs may be missing from the observation set. We have also addressed the acoustic calibration problem from sources of opportunity such as beacons or a moving source, deriving and comparing several calibration methods for the case where the node can hear a moving source whose position can be reported back to the node. The particle filter, as a recursive algorithm, requires an initialization phase prior to tracking a state vector. The Metropolis-Hastings (MH) algorithm has been used for sampling from intractable multivariate target distributions and is well suited to the initialization problem. Since the particle filter only needs samples around the mode, we have modified the MH algorithm to generate samples distributed around the modes of the target posterior. Simulations show that this mode-hungry algorithm converges an order of magnitude faster than the original MH scheme. Finally, we have developed a general framework for the joint state-space tracking problem. A proposal strategy for joint state-space tracking with particle filters is defined by carefully placing the random support of the joint filter in the region where the final posterior is likely to lie. Computer simulations demonstrate the improved performance and robustness of the joint state-space approach when using the new particle proposal strategy.
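For context, a minimal random-walk Metropolis-Hastings sampler of the standard kind that the mode-hungry variant builds on is sketched below; the bimodal target density and step size are placeholders, and the mode-hungry modification itself is not reproduced here:

```python
import numpy as np

def metropolis_hastings(log_target, x0, n_steps=5000, step=0.5, rng=None):
    """Random-walk Metropolis-Hastings.

    log_target : function returning the log of the (unnormalized) target density.
    x0         : initial state (1-D array-like).
    Returns an array of sampled states, one row per iteration.
    """
    rng = rng or np.random.default_rng()
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_steps, x.size))
    log_p = log_target(x)
    for i in range(n_steps):
        proposal = x + step * rng.standard_normal(x.size)   # symmetric proposal
        log_p_new = log_target(proposal)
        if np.log(rng.random()) < log_p_new - log_p:        # accept/reject step
            x, log_p = proposal, log_p_new
        samples[i] = x
    return samples

# Example: bimodal 1-D target; a mode-seeking variant would concentrate
# the retained samples around the two peaks for particle-filter initialization.
log_target = lambda x: np.logaddexp(-0.5 * (x - 2) ** 2, -0.5 * (x + 2) ** 2).sum()
chain = metropolis_hastings(log_target, x0=[0.0])
```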
15

Probability Hypothesis Densities for Multitarget, Multisensor Tracking with Application to Passive Radar

Tobias, Martin 07 April 2006 (has links)
The probability hypothesis density (PHD), popularized by Ronald Mahler, presents a novel and theoretically rigorous approach to multitarget, multisensor tracking. Based on random set theory, the PHD is the first moment of a point process of a random track set, and it can be propagated by Bayesian prediction and observation equations to form a multitarget, multisensor tracking filter. The advantage of the PHD filter lies in its ability to estimate automatically the expected number of targets present, to fuse easily different kinds of data observations, and to locate targets without performing any explicit report-to-track association. We apply a particle-filter implementation of the PHD filter to realistic multitarget, multisensor tracking using passive coherent location (PCL) systems that exploit illuminators of opportunity such as FM radio stations. The objective of this dissertation is to enhance the usefulness of the PHD particle filter for multitarget, multisensor tracking in general, and within the context of PCL in particular. This involves a number of thrusts, including: 1) devising intelligent proposal densities for particle placement, 2) devising a peak-extraction algorithm for extracting information from the PHD, 3) incorporating realistic probabilities of detection and signal-to-noise ratios (including multipath effects) to model realistic PCL scenarios, 4) using range, Doppler, and direction-of-arrival (DOA) observations to test the target detection and data fusion capabilities of the PHD filter, and 5) clarifying the concepts behind finite set statistics (FISST) and the PHD to make them more accessible to the practicing engineer. A goal of this dissertation is to serve as a tutorial for anyone interested in becoming familiar with the probability hypothesis density and the associated PHD particle filter. It is hoped that, after reading this thesis, the reader will have gained a clearer understanding of the PHD and of the functionality and effectiveness of the PHD particle filter.
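For reference, the standard PHD recursion popularized by Mahler, written in generic notation with spawning terms omitted (and not specific to the PCL setting of the dissertation), consists of a prediction step followed by a measurement update:

```latex
D_{k|k-1}(x) = \gamma_k(x)
  + \int p_S(x')\, f_{k|k-1}(x \mid x')\, D_{k-1}(x')\, dx',
\qquad
D_k(x) = \bigl[1 - p_D(x)\bigr] D_{k|k-1}(x)
  + \sum_{z \in Z_k}
    \frac{p_D(x)\, g_k(z \mid x)\, D_{k|k-1}(x)}
         {\kappa_k(z) + \int p_D(\xi)\, g_k(z \mid \xi)\, D_{k|k-1}(\xi)\, d\xi}.
```

Here \gamma_k is the birth intensity, p_S the survival probability, f_{k|k-1} the single-target transition density, p_D the detection probability, g_k the measurement likelihood, \kappa_k the clutter intensity, and the integral of D_k over the state space gives the expected number of targets.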
16

Particle Filter Tracking Architecture for use Onboard Unmanned Aerial Vehicles

Ludington, Ben T. 14 November 2006 (has links)
Unmanned Aerial Vehicles (UAVs) are capable of placing sensors at unique vantage points without endangering a pilot, so they are well suited to target tracking missions. Performing such a mission, however, can be burdensome for the operator: to track a target, the operator must estimate the position of the target from the incoming video stream, update the orientation of the camera, and move the vehicle to an appropriate vantage point. The purpose of the research in this thesis is to provide a target tracking system that performs these tasks automatically in real time. The first task, which receives the majority of the attention, is estimating the position of the target within the incoming video stream. Because of the inherent clutter in the imagery, the resulting probability distributions are typically non-Gaussian and multi-modal, so classical state estimation techniques such as the Kalman filter and its variants are unacceptable solutions. The particle filter has become a popular alternative, since it approximates multi-modal distributions using a set of samples, and it is used as part of this research. To improve the performance of the filter and manage the inherently large computational burden, a neural network is used to estimate the performance of the particle filter, and the filter parameters are changed in response. Once the position of the target is estimated in the frame, it is projected onto the ground using the camera orientation and vehicle attitude and fed into a linear predictor. The output of the predictor is used to update the orientation of the camera and the vehicle waypoints. Through offline testing, simulation, and flight testing, the approach is shown to provide a powerful visual tracking system for use onboard the GTMax unmanned research helicopter.
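The ground-projection step mentioned above can be illustrated with an idealized pinhole-camera sketch under a flat-ground assumption; the intrinsics, orientation, and altitude used here are made-up values, and this is not the GTMax implementation:

```python
import numpy as np

def pixel_to_ground(pixel, K, R_world_cam, cam_pos):
    """Project an image pixel onto the flat ground plane z = 0.

    pixel       : (u, v) pixel coordinates of the target estimate.
    K           : 3x3 camera intrinsic matrix.
    R_world_cam : 3x3 rotation taking camera-frame vectors to the world frame
                  (combines camera gimbal orientation and vehicle attitude).
    cam_pos     : camera position in world coordinates, z up.
    Returns the (x, y) ground intersection of the viewing ray.
    """
    uv1 = np.array([pixel[0], pixel[1], 1.0])
    ray_cam = np.linalg.solve(K, uv1)       # back-project the pixel to a camera-frame ray
    ray_world = R_world_cam @ ray_cam
    t = -cam_pos[2] / ray_world[2]          # scale so the ray reaches z = 0
    ground = cam_pos + t * ray_world
    return ground[:2]

# Example: camera 50 m up, looking straight down at the world origin
K = np.array([[400.0, 0, 320], [0, 400.0, 240], [0, 0, 1]])
R = np.diag([1.0, -1.0, -1.0])              # nadir-pointing camera orientation
print(pixel_to_ground((320, 240), K, R, np.array([0.0, 0.0, 50.0])))   # -> [0. 0.]
```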
17

A Particle Filtering-based Framework for On-line Fault Diagnosis and Failure Prognosis

Orchard, Marcos Eduardo 08 November 2007 (has links)
This thesis presents an on-line particle-filtering-based framework for fault diagnosis and failure prognosis in nonlinear, non-Gaussian systems. The methodology assumes the definition of a set of fault indicators, which are appropriate for monitoring purposes, the availability of real-time process measurements, and the existence of empirical knowledge (or historical data) to characterize both nominal and abnormal operating conditions. The incorporation of particle-filtering (PF) techniques in the proposed scheme not only allows for the implementation of real time algorithms, but also provides a solid theoretical framework to handle the problem of fault detection and isolation (FDI), fault identification, and failure prognosis. Founded on the concept of sequential importance sampling (SIS) and Bayesian theory, PF approximates the conditional state probability distribution by a swarm of points called particles and a set of weights representing discrete probability masses. Particles can be easily generated and recursively updated in real time, given a nonlinear process dynamic model and a measurement model that relates the states of the system with the observed fault indicators. Two autonomous modules have been considered in this research. On one hand, the fault diagnosis module uses a hybrid state-space model of the plant and a particle-filtering algorithm to (1) calculate the probability of any given fault condition in real time, (2) estimate the probability density function (pdf) of the continuous-valued states in the monitored system, and (3) provide information about type I and type II detection errors, as well as other critical statistics. Among the advantages offered by this diagnosis approach is the fact that the pdf state estimate may be used as the initial condition in prognostic modules after a particular fault mode is isolated, hence allowing swift transitions between FDI and prognostic routines. The failure prognosis module, on the other hand, computes (in real time) the pdf of the remaining useful life (RUL) of the faulty subsystem using a particle-filtering-based algorithm. This algorithm consecutively updates the current state estimate for a nonlinear state-space model (with unknown time-varying parameters) and predicts the evolution in time of the fault indicator pdf. The outcome of the prognosis module provides information about the precision and accuracy of long-term predictions, RUL expectations, 95% confidence intervals, and other hypothesis tests for the failure condition under study. Finally, inner and outer correction loops (learning schemes) are used to periodically improve the parameters that characterize the performance of FDI and/or prognosis algorithms. Illustrative theoretical examples and data from a seeded fault test for a UH-60 planetary carrier plate are used to validate all proposed approaches. Contributions of this research include: (1) the establishment of a general methodology for real time FDI and failure prognosis in nonlinear processes with unknown model parameters, (2) the definition of appropriate procedures to generate dependable statistics about fault conditions, and (3) a description of specific ways to utilize information from real time measurements to improve the precision and accuracy of the predictions for the state probability density function (pdf).
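The particle mechanism described in this abstract (a swarm of particles with weights, propagated through a process model and reweighted by a measurement likelihood) reduces, in its simplest bootstrap form, to the following sketch; the scalar dynamic and measurement models are placeholders rather than the hybrid fault model of the thesis:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
particles = rng.normal(0.0, 1.0, n)          # initial state hypotheses
weights = np.full(n, 1.0 / n)

def step(particles, weights, measurement, process_noise=0.1, meas_noise=0.5):
    """One bootstrap particle-filter cycle: predict, weight, resample."""
    # Predict: propagate each particle through the (placeholder) dynamic model.
    particles = 0.95 * particles + rng.normal(0.0, process_noise, particles.size)
    # Update: reweight by the likelihood of the observed fault indicator.
    weights = weights * np.exp(-0.5 * ((measurement - particles) / meas_noise) ** 2)
    weights /= weights.sum()
    # Resample when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < particles.size / 2:
        idx = rng.choice(particles.size, particles.size, p=weights)
        particles = particles[idx]
        weights = np.full(particles.size, 1.0 / particles.size)
    return particles, weights

for z in [0.2, 0.4, 0.1, 0.3]:               # incoming fault-indicator readings
    particles, weights = step(particles, weights, z)
print(np.sum(weights * particles))           # posterior mean state estimate
```

The weighted particle set itself is the approximation of the state pdf that, as the abstract notes, can be handed over from the diagnosis module to the prognosis module as an initial condition.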
18

Distributed Particle Filters for Data Assimilation in Simulation of Large Scale Spatial Temporal Systems

Bai, Fan 18 December 2014 (has links)
Assimilating real-time sensor data into a running simulation model can improve simulation results for large-scale spatial-temporal systems such as wildfire, road traffic, and flood. Particle filters are important methods for supporting such data assimilation. While particle filters can work effectively with sophisticated simulation models, they have a high computation cost due to the large number of particles needed to converge to the true system state; this is especially true for large-scale spatial-temporal simulation systems, which have high-dimensional state spaces and high computation costs of their own. To address the performance issue of particle filter-based data assimilation, this dissertation developed distributed particle filters and applied them to large-scale spatial-temporal systems. We first implemented a particle filter-based data assimilation framework and carried out data assimilation to estimate system state and model parameters based on an application of wildfire spread simulation. We then developed advanced particle routing methods in distributed particle filters to route particles among the processing units (PUs) after resampling in an effective and efficient manner. In particular, for distributed particle filters with centralized resampling, we developed two routing policies, named the minimal transfer particle routing policy and the maximal balance particle routing policy. For distributed particle filters with decentralized resampling, we developed a hybrid particle routing approach that combines global routing with local routing to take advantage of both. The developed routing policies are evaluated in terms of communication cost and data assimilation accuracy based on the application of data assimilation for large-scale wildfire spread simulations. Moreover, as cloud computing gains popularity, we also developed a parallel and distributed particle filter based on Hadoop & MapReduce to support large-scale data assimilation.
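The particle-routing problem described above can be illustrated with a simple greedy surplus-to-deficit rebalancing; this sketch only conveys the load-balancing idea and is not the minimal-transfer or maximal-balance policy developed in the dissertation:

```python
from collections import deque

def rebalance(counts, target):
    """Move particles from overloaded PUs to underloaded ones.

    counts : list of particle counts per processing unit after local resampling.
    target : desired count per PU.
    Returns a list of (source_pu, dest_pu, n_particles) transfers.
    """
    surplus = deque((i, c - target) for i, c in enumerate(counts) if c > target)
    deficit = deque((i, target - c) for i, c in enumerate(counts) if c < target)
    transfers = []
    while surplus and deficit:
        s, s_amt = surplus[0]
        d, d_amt = deficit[0]
        moved = min(s_amt, d_amt)            # ship as many particles as both ends allow
        transfers.append((s, d, moved))
        if s_amt == moved:
            surplus.popleft()
        else:
            surplus[0] = (s, s_amt - moved)
        if d_amt == moved:
            deficit.popleft()
        else:
            deficit[0] = (d, d_amt - moved)
    return transfers

# Four PUs after resampling, 250 particles each desired
print(rebalance([400, 250, 150, 200], target=250))   # -> [(0, 2, 100), (0, 3, 50)]
```

The trade-off the dissertation studies is visible even here: fewer, larger transfers reduce communication cost, while finer-grained routing keeps the particle load (and hence the runtime per assimilation cycle) more evenly balanced.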
