For robotic deployments in planetary worksite environments, map construction and navigation are essential for tasks such as base construction, scientific investigation, and in-situ resource utilization. However, operation in a planetary environment imposes sensing restrictions, as well as challenges arising from the terrain. In this thesis, we develop enabling technologies for autonomous mapping and navigation by employing a panning laser rangefinder as our primary sensor on a rover platform.
The mapping task is addressed as a three-dimensional Simultaneous Localization and Mapping (3D SLAM) problem. During operation, long-range, 360-degree scans are obtained at infrequent stops. These scans are aligned using a combination of sparse features and odometry measurements in a batch alignment framework, resulting in accurate maps of planetary worksite terrain.
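At the core of aligning scans from matched sparse features is rigid point-set registration. The sketch below is illustrative only, not the thesis's batch framework: it shows the standard closed-form SVD (Horn/Umeyama-style) solution for the rotation and translation between two matched 3D feature sets, which a batch alignment pipeline would typically use as a building block. All names and values are hypothetical.

```python
import numpy as np

def align_features(src, dst):
    """Closed-form rigid alignment (rotation R, translation t) of matched
    3D feature sets via SVD, so that dst ~= src @ R.T + t."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)          # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    # reflection guard: force a proper rotation with det(R) = +1
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = mu_d - R @ mu_s
    return R, t

# toy example: recover a known transform from noiseless matched features
rng = np.random.default_rng(0)
src = rng.standard_normal((10, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([1.0, -2.0, 0.5])
dst = src @ R_true.T + t_true
R, t = align_features(src, dst)
```

With noiseless correspondences the recovered `R` and `t` match the true transform exactly; in practice such a solve would sit inside an outlier-robust, batch optimization over many scans and odometry constraints.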
For navigation, the panning laser rangefinder is configured to perform short, continuous sweeps while the rover is in motion. An appearance-based approach is taken, where laser intensity images are used to compute Visual Odometry (VO) estimates. We overcome motion-distortion issues by formulating the estimation problem in continuous time. This is facilitated by the introduction of Gaussian Process Gauss-Newton (GPGN), a novel algorithm for nonparametric, continuous-time, nonlinear, batch state estimation.
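The key idea behind a continuous-time formulation can be illustrated with plain Gaussian process regression: the trajectory is represented as a function of time that can be queried at the timestamp of every individual laser return, rather than at a fixed set of discrete poses. This is only a minimal linear sketch of that idea, not GPGN itself (which additionally handles nonlinear measurement models via Gauss-Newton); the kernel, hyperparameters, and trajectory are illustrative.

```python
import numpy as np

def gp_regress(t_train, y_train, t_query, ell=0.2, sigma_f=1.0, sigma_n=0.01):
    """GP regression with a squared-exponential kernel: a continuous-time
    state estimate queryable at arbitrary timestamps."""
    def k(a, b):
        d = a[:, None] - b[None, :]
        return sigma_f**2 * np.exp(-0.5 * (d / ell)**2)
    K = k(t_train, t_train) + sigma_n**2 * np.eye(len(t_train))
    Ks = k(t_query, t_train)
    return Ks @ np.linalg.solve(K, y_train)

# sparse measurements of one state component along a smooth trajectory
t_meas = np.linspace(0.0, 1.0, 8)
x_meas = np.sin(2.0 * np.pi * t_meas)

# query the continuous-time estimate at individual laser-return timestamps,
# which need not coincide with any measurement time
t_laser = np.array([0.13, 0.47, 0.81])
x_est = gp_regress(t_meas, x_meas, t_laser)
```

Because every range/intensity return gets a pose interpolated at its own timestamp, motion distortion within a sweep can be modelled directly instead of being approximated by a single pose per scan.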
Extensive experimental validation is provided for both mapping and navigation components using data gathered at multiple planetary analogue test sites.
Identifier | oai:union.ndltd.org:TORONTO/oai:tspace.library.utoronto.ca:1807/43740 |
Date | 14 January 2014 |
Creators | Tong, Chi Hay |
Contributors | Barfoot, Timothy D. |
Source Sets | University of Toronto |
Language | en_ca |
Detected Language | English |
Type | Thesis |