
SPATIAL AND TEMPORAL SYSTEM CALIBRATION OF GNSS/INS-ASSISTED FRAME AND LINE CAMERAS ONBOARD UNMANNED AERIAL VEHICLES

<p>Unmanned aerial vehicles (UAVs)
equipped with imaging systems and an integrated global navigation satellite
system/inertial navigation system (GNSS/INS) serve a wide range of
applications, including disaster relief, infrastructure monitoring, precision
agriculture, and forest growth monitoring. Most of these applications require
accurate 3D spatial information from the UAV imaging system, and deriving
reliable 3D coordinates depends on accurate geometric calibration. Geometric calibration entails
both spatial and temporal calibration. Spatial calibration consists of
obtaining accurate internal characteristics of the imaging sensor as well as
estimating the mounting parameters between the imaging and the GNSS/INS units.
Temporal calibration ensures that there is little to no time delay between the
image timestamps and corresponding GNSS/INS position and orientation
timestamps. Manual and automated spatial calibration have been successfully
accomplished on a variety of platforms and sensors including UAVs equipped with
frame and push-broom line cameras. However, manual and automated temporal
calibration has not been demonstrated on both
frame and line camera systems without the use of ground control points (GCPs).
This research focuses on manual and automated spatial and temporal system
calibration for UAVs equipped with GNSS/INS-assisted frame and line camera systems. For
frame cameras, the research introduces two approaches (direct and indirect) to
correct for time delay between GNSS/INS recorded event markers and actual time
of image exposures. To ensure the best estimates of system parameters without
the use of GCPs, an optimal flight configuration for system
calibration while estimating time delay is rigorously derived. For line camera
systems, this research presents the direct approach to estimate system
calibration parameters including time delay during the bundle block adjustment.
The optimal flight configuration is also rigorously derived for line camera
systems, and a bias impact analysis is conducted. This analysis shows that the
indirect approach is not a feasible solution for push-broom line cameras
onboard UAVs, owing to the limited ability of line cameras to decouple system
parameters, a conclusion confirmed by experimental results. Lastly, this
research demonstrates that, for both frame and line camera systems, the direct
approach can be fully automated by incorporating structure-from-motion (SfM)
based tie-point features. Methods
for feature detection and matching for frame and line camera systems are
presented. This research also presents the necessary changes in the bundle
adjustment with self-calibration to successfully incorporate a large number of
automatically derived tie points. For frame cameras, the results show that both
the direct and indirect approaches are capable of estimating and correcting
this time delay. When a time delay exists and the direct or indirect approach
is applied, horizontal accuracy of 1–3 times the ground sampling distance (GSD)
can be achieved without the use of any GCPs. For line camera systems, results
of the direct approach show that when a time delay exists and spatial and
temporal calibration is performed, vertical and horizontal accuracy are
approximately equal to the GSD of the sensor. Furthermore, when a large
artificial time delay is introduced for line camera systems, the direct approach
still achieves accuracy better than the GSD of the system and performs 2.5–8
times better in the horizontal components and up to 18 times better in the
vertical component than when temporal calibration is not performed. Lastly, the
results show that automated tie points can be successfully extracted for frame
and line camera systems and that those tie point features can be incorporated
into a fully automated bundle adjustment with self-calibration, including time
delay estimation. The results show that this fully automated calibration
accurately estimates system parameters and achieves absolute accuracy
similar to that of manually measured tie/checkpoints without the use of GCPs.</p>
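
The core idea of the temporal calibration described above can be illustrated with a minimal sketch, not taken from the thesis: the GNSS/INS trajectory is interpolated at delay-corrected image event times, so an uncorrected delay between event markers and true exposures maps directly into an along-track position error. The function name and the numbers below are hypothetical, chosen only to make the effect visible.

```python
import numpy as np

def interpolate_position(traj_t, traj_xyz, event_t, dt):
    """Linearly interpolate GNSS/INS positions at time-delay-corrected
    image event times (t_event + dt). traj_t must be sorted ascending."""
    t_corr = np.asarray(event_t) + dt
    return np.stack(
        [np.interp(t_corr, traj_t, traj_xyz[:, k]) for k in range(3)], axis=1
    )

# Illustrative trajectory: platform flying at 10 m/s along x at 50 m
# altitude, with the GNSS/INS solution sampled at 200 Hz.
traj_t = np.arange(0.0, 2.0, 0.005)
traj_xyz = np.stack(
    [10.0 * traj_t, np.zeros_like(traj_t), np.full_like(traj_t, 50.0)], axis=1
)

# Event markers recorded 0.02 s before the true exposures: at 10 m/s,
# that delay corresponds to a 0.2 m along-track position error.
events = np.array([0.50, 1.00, 1.50])
p_raw = interpolate_position(traj_t, traj_xyz, events, dt=0.0)   # uncorrected
p_fix = interpolate_position(traj_t, traj_xyz, events, dt=0.02)  # delay-corrected
print(p_fix[:, 0] - p_raw[:, 0])  # along-track shift of ~0.2 m per exposure
```

In the thesis's direct approach the delay is estimated as an additional unknown in the bundle adjustment rather than supplied by hand; the sketch only shows why the unknown matters, since even a 20 ms delay here exceeds a typical UAV GSD.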

DOI: 10.25394/pgs.12742568.v1
Identifier: oai:union.ndltd.org:purdue.edu/oai:figshare.com:article/12742568
Date: 31 July 2020
Creator: Lisa Marie Laforest (9188615)
Source Set: Purdue University
Detected Language: English
Type: Text, Thesis
Rights: CC BY 4.0
Relation: https://figshare.com/articles/thesis/SPATIAL_AND_TEMPORAL_SYSTEM_CALIBRATION_OF_GNSS_INS-ASSISTED_FRAME_AND_LINE_CAMERAS_ONBOARD_UNMANNED_AERIAL_VEHICLES/12742568
