581

A feature based face tracker using extended Kalman filtering

Ingemars, Nils January 2007 (has links)
A face tracker is exactly what it sounds like: it tracks a face in a video sequence. Depending on its complexity, the tracker may treat the face as a rigid object or as a complete deformable face model with facial expressions. This report describes work on a real-time, feature-based face tracker. Feature-based means that certain features in the face are tracked, i.e. points with special characteristics; these might be mouth or eye corners, but in principle any point. For this tracker, the latter is of interest. Its task is to extract global parameters, i.e. rotation and translation, as well as dynamic facial parameters (expressions), for each frame. It tracks feature points using motion between frames and a textured face model (Candide), and then uses an extended Kalman filter to estimate the parameters from the tracked feature points.
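A minimal sketch of the measurement-update step such a tracker might use is shown below; the state layout, the projection function h, and the noise covariances are illustrative assumptions, not the thesis's actual implementation.

```python
import numpy as np

def numerical_jacobian(h, x, eps=1e-6):
    """Finite-difference Jacobian of a measurement function h at state x."""
    y0 = h(x)
    J = np.zeros((y0.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (h(x + dx) - y0) / eps
    return J

def ekf_update(x, P, z, h, R):
    """One extended Kalman filter measurement update.

    x : state estimate (e.g. head rotation, translation and expression parameters)
    P : state covariance
    z : stacked 2D image positions of the tracked feature points
    h : function projecting the face model's feature points for a given state
    R : measurement noise covariance
    """
    H = numerical_jacobian(h, x)          # linearize the projection around x
    innovation = z - h(x)                 # measured minus predicted feature positions
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x_new = x + K @ innovation
    P_new = (np.eye(x.size) - K @ H) @ P
    return x_new, P_new
```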
582

Multiobject tracking by adaptive hypothesis testing

January 1979 (has links)
by Kenneth M. Keverian, Nils R. Sandell, Jr. / Office of Naval Research Contract ONR/N00014-77-C-0532 (85552). / Originally presented as the first author's thesis, (B.S.) in the M.I.T. Dept. of Electrical Engineering and Computer Science, 1979. / Bibliography: p. 114-115.
583

Flash Photography Enhancement via Intrinsic Relighting

Eisemann, Elmar, Durand, Frédo 01 1900 (has links)
We enhance photographs shot in dark environments by combining a picture taken with the available light and one taken with the flash. We preserve the ambiance of the original lighting and insert the sharpness from the flash image. We use the bilateral filter to decompose the images into detail and large scale. We reconstruct the image using the large scale of the available lighting and the detail of the flash. We detect and correct flash shadows. This combines the advantages of available illumination and flash photography. / Singapore-MIT Alliance (SMA)
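The decomposition step can be sketched with OpenCV's bilateral filter as below; the log-domain split, the filter parameters, and the grayscale simplification are assumptions for illustration, not the authors' implementation (which, among other things, also detects and corrects flash shadows).

```python
import numpy as np
import cv2

def split_large_scale_detail(gray, d=9, sigma_color=0.4, sigma_space=9):
    """Split a grayscale intensity image into a large-scale (base) layer and a
    detail layer, working in the log domain so the detail is multiplicative."""
    log_i = np.log1p(gray.astype(np.float32))
    large_scale = cv2.bilateralFilter(log_i, d, sigma_color, sigma_space)
    detail = log_i - large_scale
    return large_scale, detail

# Recombination: large scale of the ambient (no-flash) shot plus detail of the flash shot.
# ambient_ls, _ = split_large_scale_detail(ambient_gray)
# _, flash_detail = split_large_scale_detail(flash_gray)
# enhanced = np.expm1(ambient_ls + flash_detail)
```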
584

Sigma-Point Kalman Filters for Probabilistic Inference in Dynamic State-Space Models

Van der Merwe, Rudolph 04 1900 (has links) (PDF)
Ph.D. / Electrical and Computer Engineering / Probabilistic inference is the problem of estimating the hidden variables (states or parameters) of a system in an optimal and consistent fashion as a set of noisy or incomplete observations of the system becomes available online. The optimal solution to this problem is given by the recursive Bayesian estimation algorithm which recursively updates the posterior density of the system state as new observations arrive. This posterior density constitutes the complete solution to the probabilistic inference problem, and allows us to calculate any "optimal" estimate of the state. Unfortunately, for most real-world problems, the optimal Bayesian recursion is intractable and approximate solutions must be used. Within the space of approximate solutions, the extended Kalman filter (EKF) has become one of the most widely used algorithms with applications in state, parameter and dual estimation. Unfortunately, the EKF is based on a sub-optimal implementation of the recursive Bayesian estimation framework applied to Gaussian random variables. This can seriously affect the accuracy or even lead to divergence of any inference system that is based on the EKF or that uses the EKF as a component part. Recently a number of related novel, more accurate and theoretically better motivated algorithmic alternatives to the EKF have surfaced in the literature, with specific application to state estimation for automatic control. We have extended these algorithms, all based on derivativeless deterministic sampling based approximations of the relevant Gaussian statistics, to a family of algorithms called Sigma-Point Kalman Filters (SPKF). Furthermore, we successfully expanded the use of this group of algorithms (SPKFs) within the general field of probabilistic inference and machine learning, both as stand-alone filters and as subcomponents of more powerful sequential Monte Carlo methods (particle filters). We have consistently shown that there are large performance benefits to be gained by applying Sigma-Point Kalman filters to areas where EKFs have been used as the de facto standard in the past, as well as in new areas where the use of the EKF is impossible.
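To make the derivativeless, deterministic-sampling idea concrete, here is a minimal sketch of the unscented transform underlying one SPKF variant (the unscented Kalman filter); the scaling parameters are common textbook defaults rather than values from the dissertation.

```python
import numpy as np

def sigma_points(mean, cov, alpha=1e-3, beta=2.0, kappa=0.0):
    """Generate 2n+1 sigma points and weights for a Gaussian (mean, cov)."""
    n = mean.size
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)          # matrix square root
    pts = np.vstack([mean, mean + S.T, mean - S.T])  # shape (2n+1, n)
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1 - alpha**2 + beta)
    return pts, wm, wc

def unscented_transform(f, mean, cov, noise_cov):
    """Propagate a Gaussian through a nonlinear function f without derivatives."""
    pts, wm, wc = sigma_points(mean, cov)
    ypts = np.array([f(p) for p in pts])
    y_mean = wm @ ypts
    diff = ypts - y_mean
    y_cov = (wc[:, None] * diff).T @ diff + noise_cov
    return y_mean, y_cov
```

The same transform is applied to both the process and measurement models, which is what removes the need for the Jacobians that the EKF linearization requires.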
585

Multi-scale texture analysis of remote sensing images using Gabor filter banks and wavelet transforms

Ravikumar, Rahul 15 May 2009 (has links)
Traditional remote sensing image classification has primarily relied on image spectral information, while texture information was ignored or not fully utilized. Existing remote sensing software packages have very limited functionality for texture information extraction and utilization. This research focuses on multi-scale image texture analysis techniques using Gabor filter banks and wavelet transformations. Gabor filter banks model texture as irradiance patterns in an image over a limited range of spatial frequencies and orientations. Using Gabor filters, each image texture can be differentiated with respect to its dominant spatial frequency and orientation. Wavelet transformations are useful for decomposing an image into a set of images based on an orthonormal basis. Dyadic transformations are applied to generate a multi-scale image pyramid which can be used for texture analysis. The analysis of texture is carried out using both artificial textures and remotely sensed images of natural scenes. This research has shown that texture can be extracted and incorporated into conventional classification algorithms to improve the accuracy of classified results. The applicability of Gabor filter banks and wavelets is explored for classifying and segmenting remote sensing imagery for geographical applications. A qualitative and quantitative comparison between statistical texture indicators and multi-scale texture indicators has been performed. Multi-scale texture indicators derived from Gabor filter banks have been found to be very effective because they can be configured to target specific textural frequencies and orientations in an image. Wavelet transformations have been found to be effective tools in image texture analysis, as they help identify the ideal scale at which texture indicators need to be measured and reduce the computation time taken to derive statistical texture indicators. A robust set of software tools for texture analysis has been developed using .NET and ArcObjects; ArcObjects was chosen as the API because these tools can be seamlessly integrated into ArcGIS. This will aid further exploration of image texture analysis by the remote sensing community.
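A rough sketch of how a small Gabor filter bank can produce per-pixel texture features is given below; the kernel size, frequencies, orientations, and the local-energy measure are illustrative choices, not the configuration used in this research.

```python
import numpy as np
from scipy import ndimage

def gabor_kernel(frequency, theta, sigma=3.0, size=21):
    """Real part of a 2D Gabor kernel tuned to a spatial frequency and orientation."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)       # rotate coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * frequency * xr)
    return envelope * carrier

def texture_energy(image, frequencies=(0.1, 0.2, 0.4), n_orientations=4):
    """Per-pixel texture energy for each (frequency, orientation) channel."""
    image = image.astype(np.float32)
    features = []
    for f in frequencies:
        for k in range(n_orientations):
            theta = k * np.pi / n_orientations
            response = ndimage.convolve(image, gabor_kernel(f, theta), mode="reflect")
            features.append(ndimage.uniform_filter(np.abs(response), size=15))
    return np.stack(features, axis=-1)   # one texture band per filter in the bank
```

The stacked bands can then be appended to the spectral bands before running a conventional classifier.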
586

Nonparametric Message Passing Methods for Cooperative Localization and Tracking

Savic, Vladimir January 2012 (has links)
The objective of this thesis is the development of cooperative localization and tracking algorithms using nonparametric message passing techniques. In contrast to the most well-known techniques, the goal is to estimate the posterior probability density function (PDF) of the position of each sensor. This problem can be solved using a Bayesian approach, but it is intractable in the general case. Nevertheless, a particle-based approximation (via nonparametric representation) and an appropriate factorization of the joint PDFs (using message passing methods) make the Bayesian approach feasible for inference in sensor networks. The well-known method for this problem, nonparametric belief propagation (NBP), can lead to inaccurate beliefs and possible non-convergence in loopy networks. Therefore, we propose four novel algorithms which alleviate these problems: nonparametric generalized belief propagation (NGBP) based on junction tree (NGBP-JT), NGBP based on pseudo-junction tree (NGBP-PJT), NBP based on spanning trees (NBP-ST), and uniformly-reweighted NBP (URW-NBP). We also extend NBP for cooperative localization in mobile networks. In contrast to the previous methods, we use optional smoothing, provide a novel communication protocol, and increase the efficiency of the sampling techniques. Moreover, we propose novel algorithms for distributed tracking, in which the goal is to track a passive object that cannot localize itself. In particular, we develop distributed particle filtering (DPF) based on three asynchronous belief consensus (BC) algorithms: standard belief consensus (SBC), broadcast gossip (BG), and belief propagation (BP). Finally, the last part of this thesis includes an experimental analysis of some of the proposed algorithms, in which we found that results based on real measurements are very similar to results based on theoretical models.
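As a toy illustration of the nonparametric (particle-based) message idea, the sketch below reweights one node's position particles using a range measurement to a neighbor whose belief is itself a particle set; this is a simplification of the general NBP message computation, not one of the proposed algorithms, and the Gaussian ranging model is an assumption.

```python
import numpy as np

def range_message_update(particles, weights, neighbor_particles,
                         neighbor_weights, measured_range, sigma=0.5):
    """Reweight a node's 2D position particles using a ranging measurement to a neighbor.

    Both beliefs are nonparametric: arrays of particles with associated weights.
    """
    # Pairwise distances between this node's particles and the neighbor's particles.
    d = np.linalg.norm(particles[:, None, :] - neighbor_particles[None, :, :], axis=-1)
    # Likelihood of the measured range, averaged over the neighbor's belief.
    lik = np.exp(-0.5 * ((d - measured_range) / sigma) ** 2)
    message = lik @ neighbor_weights          # marginalize out the neighbor's position
    new_w = weights * message
    return new_w / new_w.sum()
```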
587

Filtering service recovery feedback: A case study research at Handelsbanken, Uppsala City

Nolan, Neil, Rudström, David January 2008 (has links)
Research has shown that companies encourage customers to complain and gather huge amounts of service recovery information, although most of this information isn't used by the companies. The purpose of this thesis is to explore what determines the filtering of service recovery feedback and, if possible, to identify its underlying reasons. This was accomplished through a qualitative case study at Handelsbanken Uppsala City. Empirical material was mainly collected through interviews with the office manager, frontline employees, and the regional complaints manager. When analyzing the empirical material, Tax and Brown's model of service recovery was used as the analytical framework. The analysis shows that the employees at Handelsbanken Uppsala City are not controlled by many guidelines and policies; instead, emphasis is put on the independence, trust, and responsibility of each individual employee. This is probably due to the decentralized organization of Handelsbanken and the belief in the employees' capability to better understand what is important to filter, owing to their close interaction with customers.
588

The comparison of item-based and trust-based CF in sparsity problems

Wu, Chun-yi 02 August 2007 (has links)
With the dramatic growth of the Internet, it is much easier for us to acquire information than before. It is, however, relatively difficult to extract the desired information from the huge information pool. One method is to rely on search engines, which analyze the queried keywords to locate relevant information. The other is to recommend to users what they may be interested in via recommender systems, which analyze users' past preferences or other users with similar interests to lessen our information processing load. Typical recommendation techniques are classified into content-based filtering and collaborative filtering (CF). Several research works in the literature have indicated that the performance of collaborative filtering is superior to that of content-based filtering in that it is subject to neither the content format nor users' past experiences. The collaborative filtering technique, however, has its own limitation: the sparsity problem. To relieve this problem, researchers have proposed several CF-type variants, including item-based CF and trust-based CF. Few works in the literature, however, compare their performance. The objective of this research is thus to evaluate both approaches under different settings such as sparsity degree, data scale, and the number of neighbors used to make recommendations. We conducted two experiments to examine their performance. The results show that trust-based CF is generally better than item-based CF under sparsity. Their difference, however, becomes insignificant as sparsity decreases. In addition, the computational time for trust-based CF increases more quickly than that for item-based CF, even though both exhibit exponential growth. Finally, the optimal number of nearest neighbors in both approaches does not heavily depend on the data scale but displays steady robustness.
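For reference, a minimal sketch of item-based CF is given below: item-item cosine similarities computed from the rating matrix, followed by a weighted average over the most similar items the target user has rated; the zero-for-missing encoding and the neighborhood size are illustrative assumptions, not the experimental setup of this research.

```python
import numpy as np

def item_based_predict(R, user, item, k=20):
    """Predict a rating with item-based CF.

    R : user-by-item rating matrix with 0 for missing ratings.
    """
    norms = np.linalg.norm(R, axis=0)
    norms[norms == 0] = 1e-12
    sims = (R.T @ R[:, item]) / (norms * norms[item])   # cosine similarity to `item`
    sims[item] = 0.0                                     # exclude the item itself
    rated = np.nonzero(R[user])[0]                       # items this user has rated
    if rated.size == 0:
        return 0.0
    top = rated[np.argsort(sims[rated])[::-1][:k]]       # k most similar rated items
    w = sims[top]
    if np.allclose(w.sum(), 0):
        return R[user, rated].mean()
    return float(w @ R[user, top] / w.sum())             # similarity-weighted average
```

Trust-based CF replaces the item-item similarities with trust weights propagated between users, which is what changes its behavior when the rating matrix is sparse.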
589

Data Filtering and Control Design for Mobile Robots

Karasalo, Maja January 2009 (has links)
In this thesis, we consider problems connected to navigation and tracking for autonomous robots under the assumption of constraints on sensors and kinematics. We study formation control as well as techniques for filtering and smoothing of noise-contaminated input. The scientific contributions of the thesis comprise five papers.

In Paper A, we propose three cascaded, stabilizing formation controls for multi-agent systems. We consider platforms with non-holonomic kinematic constraints and directional range sensors. The resulting formation is a leader-follower system, where each follower agent tracks its leader agent at a specified angle and distance. No inter-agent communication is required to execute the controls. A switching Kalman filter is introduced for active sensing, and robustness is demonstrated in experiments and simulations with Khepera II robots.

In Paper B, an optimization-based adaptive Kalman filtering method is proposed. The method produces an estimate of the process noise covariance matrix Q by solving an optimization problem over a short window of data. The algorithm recovers the observations h(x) from a system ẋ = f(x), y = h(x) + v without a priori knowledge of the system dynamics. The algorithm is evaluated in simulations, and a tracking example is included for a target with coupled and nonlinear kinematics.

In Paper C, we consider the problem of estimating a closed curve in R² based on noise-contaminated samples. A recursive control theoretic smoothing spline approach is proposed that yields an initial estimate of the curve and subsequently computes refinements of the estimate iteratively. Periodic splines are generated by minimizing a cost function subject to constraints imposed by a linear control system. The optimal control problem is shown to be proper, and sufficient optimality conditions are derived for a special case of the problem using Hamilton-Jacobi-Bellman theory.

Paper D continues the study of recursive control theoretic smoothing splines. A discretization of the problem is derived, yielding an unconstrained quadratic programming problem. A proof of convexity for the discretized problem is provided, and the recursive algorithm is evaluated in simulations and experiments using a SICK laser scanner mounted on a PowerBot from ActivMedia Robotics.

Finally, in Paper E we explore the issue of optimal smoothing for control theoretic smoothing splines. The output of the control theoretic smoothing spline problem is essentially a tradeoff between faithfulness to measurement data and smoothness. This tradeoff is regulated by the so-called smoothing parameter. In Paper E, a method is developed for estimating the optimal value of this smoothing parameter. The procedure is based on general cross validation and requires no a priori information about the underlying curve or level of noise in the measurements. / QC 20100722
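The faithfulness-versus-smoothness tradeoff discussed for Papers C-E can be illustrated with a standard periodic smoothing spline from SciPy (not the thesis's control theoretic formulation); the noisy ellipse standing in for laser-scan data and the values of the smoothing parameter s are assumptions.

```python
import numpy as np
from scipy.interpolate import splprep, splev

# Noisy samples of a closed curve (an ellipse stands in for real sensor data).
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
x = 2.0 * np.cos(t) + np.random.normal(scale=0.05, size=t.size)
y = 1.0 * np.sin(t) + np.random.normal(scale=0.05, size=t.size)

# `s` is the smoothing parameter: s=0 interpolates the noise exactly,
# while larger s trades fidelity to the samples for a smoother closed curve.
for s in (0.0, 0.5, 5.0):
    tck, u = splprep([x, y], s=s, per=True)          # fit a periodic smoothing spline
    xs, ys = splev(np.linspace(0, 1, 400), tck)      # evaluate the fitted curve
    print(f"s={s}: reconstructed {len(xs)} points on the closed curve")
```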
590

Convergence in distribution for filtering processes associated to Hidden Markov Models with densities

Kaijser, Thomas January 2013 (has links)
A Hidden Markov Model generates two basic stochastic processes: a Markov chain, which is hidden, and an observation sequence. The filtering process of a Hidden Markov Model is, roughly speaking, the sequence of conditional distributions of the hidden Markov chain that is obtained as new observations are received. It is well known that the filtering process itself is also a Markov chain. A classical theoretical problem is to find conditions which imply that the distributions of the filtering process converge towards a unique limit measure. This problem goes back to a paper by D. Blackwell for the case when the Markov chain takes its values in a finite set, and to a paper by H. Kunita for the case when the state space of the Markov chain is a compact Hausdorff space. Recently, due to work by F. Kochman, J. Reeds, P. Chigansky and R. van Handel, a necessary and sufficient condition for the convergence of the distributions of the filtering process has been found for the case when the state space is finite. This condition has since been generalised to the case when the state space is denumerable. In this paper we generalise some of the previous results on convergence in distribution to the case when the Markov chain and the observation sequence of a Hidden Markov Model take their values in complete, separable, metric spaces; it has though been necessary to assume that both the transition probability function of the Markov chain and the transition probability function that generates the observation sequence have densities.
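For the finite-state setting of the Blackwell line of work, the filtering process can be computed by the standard recursion sketched below; the matrices and the convention that pi0 is the distribution of the hidden state before the first observation are illustrative assumptions.

```python
import numpy as np

def hmm_filter(A, B, pi0, observations):
    """Filtering recursion for a finite-state HMM with finite observation alphabet.

    A   : transition matrix, A[i, j] = P(x_{t+1} = j | x_t = i)
    B   : emission matrix,   B[i, k] = P(y_t = k | x_t = i)
    pi0 : distribution of the hidden state before the first observation
    Yields the conditional distribution P(x_t | y_1, ..., y_t) after each observation.
    """
    filt = pi0.copy()
    for y in observations:
        predicted = A.T @ filt          # one-step prediction of the hidden state
        filt = predicted * B[:, y]      # multiply in the likelihood of the observation
        filt /= filt.sum()              # normalize to a probability distribution
        yield filt
```

The sequence of `filt` vectors produced by this recursion is exactly the filtering process whose convergence in distribution the paper studies.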
