
Fast and Robust UAV to UAV Detection and Tracking Algorithm

Unmanned Aerial Vehicle (UAV) technology is being used in an increasingly wide variety of applications, ranging from remote sensing to delivery to security. As the number of UAVs grows, there is a growing need for UAV to UAV detection and tracking systems for both collision avoidance and coordination. Among possible solutions, autonomous “see-and-avoid” systems based on low-cost, high-resolution video cameras offer the important advantages of lightweight, low-power sensors. However, to be effective, camera-based “see-and-avoid” systems require sensitive, robust, and computationally efficient algorithms for UAV to UAV detection and tracking (U2U-D&T) from a moving camera.

In this thesis, we propose a general architecture for highly accurate and computationally efficient U2U-D&T algorithms that detect UAVs from a camera mounted on a moving UAV platform. The thesis contains three studies of U2U-D&T algorithms.

In the first study, we present a new approach to detecting and tracking other UAVs from a single camera on our own UAV. Given a sequence of video frames, we estimate the background motion via a perspective transformation model and then identify distinctive points in the background-subtracted image to detect moving objects. We find the spatio-temporal characteristics of each moving object through optical flow matching and then classify as targets those objects whose motion differs markedly from the background. We also perform Kalman-filter-based tracking to enforce temporal consistency on the detections. The algorithm is tested on real videos from UAVs, and the results show that it is effective at detecting and tracking small UAVs with limited computing resources.
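As an illustrative sketch only (the thesis abstract does not specify an implementation language or API), the background-motion compensation and moving-object detection step described above could be prototyped in Python with OpenCV roughly as follows; the function name, feature counts, and threshold are assumed values, not the thesis's actual parameters.

```python
import cv2
import numpy as np

def moving_object_mask(prev_gray, curr_gray, diff_thresh=25):
    """Estimate camera-induced background motion with a perspective
    transform (homography), warp the previous frame onto the current one,
    and threshold the residual difference to expose moving objects."""
    # Track sparse feature points from the previous frame into the current frame.
    pts_prev = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                       qualityLevel=0.01, minDistance=8)
    pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                   pts_prev, None)
    ok = status.flatten() == 1
    good_prev, good_curr = pts_prev[ok], pts_curr[ok]

    # Fit the perspective transformation describing background motion;
    # RANSAC discards correspondences sitting on independently moving objects.
    H, _ = cv2.findHomography(good_prev, good_curr, cv2.RANSAC, 3.0)

    # Warp the previous frame into the current frame's coordinates and subtract.
    h, w = curr_gray.shape
    stabilized_prev = cv2.warpPerspective(prev_gray, H, (w, h))
    diff = cv2.absdiff(curr_gray, stabilized_prev)

    # Residual motion the homography cannot explain becomes the candidate mask.
    _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
    return mask
```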
In the second study, we present a new approach to detecting and tracking UAVs from a single camera mounted on a different UAV. We first estimate background motion via a perspective transformation model and then identify moving-object candidates in the background-subtracted image using a deep learning classifier trained on manually labeled datasets. For each moving-object candidate, we extract spatio-temporal traits through optical flow matching and then prune the candidates based on how their motion patterns compare with the background. A Kalman filter is applied to the pruned moving objects to improve temporal consistency among the candidate detections. The algorithm was validated on video datasets taken from a UAV. The results demonstrate that our algorithm can effectively detect and track small UAVs with limited computing resources.
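The abstract does not describe the classifier's architecture; as one hypothetical example of the kind of deep learning candidate classifier mentioned in the second study, a small CNN that labels fixed-size patches around each moving-object candidate could look like the sketch below (patch size, layer sizes, and class count are illustrative assumptions).

```python
import torch
import torch.nn as nn

class CandidateClassifier(nn.Module):
    """Tiny CNN that scores a 32x32 grayscale patch around a
    moving-object candidate as UAV (1) or background clutter (0)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 32x32 -> 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 16x16 -> 8x8
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, 64), nn.ReLU(),
            nn.Linear(64, 2),                      # logits: background vs. UAV
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Example: score a batch of candidate patches shaped (N, 1, 32, 32).
model = CandidateClassifier().eval()
patches = torch.rand(8, 1, 32, 32)
with torch.no_grad():
    is_uav = model(patches).argmax(dim=1)  # 1 where the patch looks like a UAV
```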
The system in the third study is based on a computationally efficient pipeline consisting of moving object detection from a motion-stabilized image, classification with a hybrid neural network, followed by Kalman tracking. The algorithm is validated using video data collected from multiple fixed-wing UAVs that has been manually ground-truthed and made publicly available. The results indicate that the proposed algorithm can be implemented on practical hardware and robustly achieves highly accurate detection and tracking of even distant and faint UAVs.
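The Kalman tracking stage that appears in all three studies could be prototyped along the following lines, here using OpenCV's Kalman filter with a constant-velocity model over detected image-plane centroids; the state layout and noise covariances are illustrative assumptions rather than the thesis's tuned values.

```python
import cv2
import numpy as np

def make_track_filter(x, y, dt=1.0):
    """Constant-velocity Kalman filter over state [x, y, vx, vy],
    measuring only the detected centroid (x, y) in each frame."""
    kf = cv2.KalmanFilter(4, 2)
    kf.transitionMatrix = np.array([[1, 0, dt, 0],
                                    [0, 1, 0, dt],
                                    [0, 0, 1,  0],
                                    [0, 0, 0,  1]], dtype=np.float32)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], dtype=np.float32)
    kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
    kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
    kf.errorCovPost = np.eye(4, dtype=np.float32)
    kf.statePost = np.array([[x], [y], [0], [0]], dtype=np.float32)
    return kf

# Example: smooth a short sequence of noisy detections into a track;
# a missed detection (None) falls back on the predicted state.
kf = make_track_filter(120.0, 80.0)
for det in [(122, 81), (125, 83), None, (131, 86)]:
    kf.predict()
    if det is not None:
        kf.correct(np.array([[det[0]], [det[1]]], dtype=np.float32))
    x_est, y_est = kf.statePost[0, 0], kf.statePost[1, 0]
```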

  1. 10.25394/pgs.7651997.v1
Identifier: oai:union.ndltd.org:purdue.edu/oai:figshare.com:article/7651997
Date: 10 June 2019
Creators: Jing Li (6278105)
Source Sets: Purdue University
Detected Language: English
Type: Text, Thesis
Rights: CC BY 4.0
Relation: https://figshare.com/articles/Fast_and_Robust_UAV_to_UAV_Detection_and_Tracking_Algorithm/7651997
