  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Probabilistic localization and mapping in appearance space

Cummins, Mark January 2009 (has links)
This thesis is concerned with the problem of place recognition for mobile robots. How can a robot determine its location from an image or sequence of images, without any prior knowledge of its position, even in a world where many places look identical? We outline a new probabilistic approach to the problem, which we call Fast Appearance Based Mapping or FAB-MAP. Our map of the environment consists of a set of discrete locations, each with an associated appearance model. For every observation collected by the robot, we compute a probability distribution over the map, and either create a new location or update our belief about the appearance of an existing location. The technique can be seen as a new type of SLAM algorithm, where the appearance of locations (rather than their position) is subject to estimation. Unlike existing SLAM systems, our appearance based technique does not rely on keeping track of the robot in any metric coordinate system. Thus it is applicable even when informative observations are available only intermittently. Solutions to the loop closure detection problem, the kidnapped robot problem and the multi-session mapping problem arise as special cases of our general approach.

Our probabilistic model introduces several technical advances. The model incorporates correlations between visual features in a novel way, which is shown to improve system performance. Additionally, we explicitly compute an approximation to the partition function in our Bayesian formulation, which provides a natural probabilistic measure of when a new observation should be assigned to a location not already present in the map. The technique is applicable even in visually repetitive environments where many places look the same.

Finally, we define two distinct approximate inference procedures for the model. The first of these is based on concentration inequalities and has general applicability beyond the problem considered in this thesis. The second approach, built on inverted index techniques, is tailored to our specific problem of place recognition, but achieves extreme efficiency, allowing us to apply FAB-MAP to navigation problems on the largest scale. The thesis concludes with a visual SLAM experiment on a trajectory 1,000 km long. The system successfully detects loop closures with close to 100% precision and requires average inference time of only 25 ms by the end of the trajectory.
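The core idea of the abstract — a posterior over discrete mapped locations plus a "new place" hypothesis scored via a normalizing constant — can be illustrated with a minimal sketch. This is not the thesis implementation: the per-word Bernoulli appearance model, the `eps` default, and the "average place" stand-in for unmapped locations are simplifying assumptions made here for illustration (FAB-MAP itself models feature correlations with a Chow-Liu tree, which this sketch omits).

```python
# Minimal sketch of appearance-based place recognition in the spirit of
# FAB-MAP (illustrative only, not the thesis implementation). Each mapped
# location stores a bag-of-words appearance model; every observation yields
# a posterior over known locations plus a "new place" hypothesis.
import math

def observation_likelihood(model, obs, n_words, eps=0.01):
    """P(obs | location) under a naive per-word Bernoulli model."""
    logp = 0.0
    for w in range(n_words):
        p = model.get(w, eps)  # probability this visual word appears here
        logp += math.log(p if w in obs else 1.0 - p)
    return math.exp(logp)

def localize(locations, obs, n_words, p_new=0.1):
    """Posterior over known locations, with a new-place hypothesis last."""
    likelihoods = [observation_likelihood(m, obs, n_words) for m in locations]
    # An uninformative "average place" (every word equally likely) stands in
    # for all unmapped locations, as a crude proxy for the partition function.
    new_l = observation_likelihood({}, obs, n_words, eps=0.5)
    prior = (1.0 - p_new) / max(len(locations), 1)
    scores = [l * prior for l in likelihoods] + [new_l * p_new]
    z = sum(scores)  # normalizer
    return [s / z for s in scores]

# Two mapped locations with different dominant visual words.
loc_a = {0: 0.9, 1: 0.9}
loc_b = {2: 0.9, 3: 0.9}
posterior = localize([loc_a, loc_b], obs={0, 1}, n_words=4)
# The observation shares its words with location A, so posterior[0] dominates.
```

When the live observation matches no stored appearance model well, the new-place entry wins instead, and the map would be extended with a new location — the decision the abstract attributes to the partition-function approximation.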
2

Vision-only localisation under extreme appearance change

Linegar, Chris January 2016 (has links)
Robust localisation is a key requirement for autonomous vehicles. However, in order to achieve widespread adoption of this technology, we also require this function to be performed using low-cost hardware. Cameras are appealing due to their information-rich image content and low cost; however, camera-based localisation is difficult because of the problem of appearance change. For example, in outdoor environments the appearance of the world can change dramatically and unpredictably with variations in lighting, weather, season and scene structure. We require autonomous vehicles to be robust under these challenging environmental conditions.

This thesis presents Dub4, a vision-only localisation system for autonomous vehicles. The system is founded on the concept of experiences, where an "experience" is a visual memory which models the world under particular conditions. By allowing the system to build up and curate a map of these experiences, we are able to handle cyclic appearance change (lighting, weather and season) as well as adapt to slow structural change. We present a probabilistic framework for predicting which experiences are most likely to match successfully with the live image at run-time, conditioned on the robot's prior use of the map. In addition, we describe an unsupervised algorithm for detecting and modelling higher-level visual features in the environment for localisation. These features are trained on a per-experience basis and are robust to extreme changes in appearance, for example between rain and sun, or day and night. The system is tested on over 1,500 km of data, from urban and off-road environments, through sun, rain, snow, harsh lighting, at different times of the day and night, and through all seasons.

In addition to this extensive offline testing, Dub4 has served as the primary localisation source on a number of autonomous vehicles, including Oxford University's RobotCar, the 2016 Shell Eco-Marathon, the LUTZ Pathfinder Project in Milton Keynes, and the GATEway Project in Greenwich, London.
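The abstract's idea of predicting which experiences are most likely to match the live image, conditioned on the robot's prior use of the map, can be sketched with a simple success-rate model. This is a hedged illustration, not the thesis code: the `Experience` class, the Beta-Bernoulli estimate, and the success/failure counters are assumptions introduced here to make the ranking idea concrete.

```python
# Illustrative sketch of experience ranking in the spirit of Dub4 (not the
# thesis implementation). Each "experience" is a visual memory of the world
# under particular conditions; we predict which experiences will localise
# the live image using the robot's past success with each one.

class Experience:
    def __init__(self, name):
        self.name = name
        self.successes = 1  # Beta(1, 1) prior: no match history yet
        self.failures = 1

    def success_prob(self):
        """Posterior-mean probability that this experience matches."""
        return self.successes / (self.successes + self.failures)

def rank_experiences(experiences):
    """Try the experiences most likely to match the live image first."""
    return sorted(experiences, key=lambda e: e.success_prob(), reverse=True)

def update(experience, matched):
    """Record whether localisation against this experience succeeded."""
    if matched:
        experience.successes += 1
    else:
        experience.failures += 1

sunny, night = Experience("sunny"), Experience("night")
for _ in range(5):  # the sunny memory has been matching on recent frames
    update(sunny, True)
    update(night, False)
ranked = rank_experiences([night, sunny])
# "sunny" is now attempted first when localising the next live image.
```

Ranking in this way lets a system with limited compute attempt only the few experiences most likely to succeed under the current conditions, rather than matching against every stored memory.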
