Part-Based Background-Aware Tracking for UAV With Convolutional Features

Bibliographic Details
Main Authors: Changhong Fu, Yinqiang Zhang, Ziyuan Huang, Ran Duan, Zongwu Xie
Format: Article
Language: English
Published: IEEE 2019-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/8736485/
Description
Summary: In recent years, visual tracking has been a challenging task in UAV applications. The standard correlation filter (CF) has been extensively applied to UAV object tracking. However, CF-based trackers suffer severely from boundary effects and cannot effectively cope with object occlusion, which results in suboptimal performance. Besides, it remains difficult to build a precise appearance model with hand-crafted features. In this paper, a novel part-based tracker is proposed for UAVs. With successive cropping operations, the tracked object is separated into several parts. More specifically, background-aware correlation filters with different cropping matrices are applied. To estimate the translation and scale variation of the tracked object, a structured comparison and a Bayesian inference approach are proposed, which jointly achieve a coarse-to-fine strategy. Moreover, an adaptive mechanism based on Gaussian process regression is used to update the local appearance model of each part. To construct a better appearance model, features extracted from a convolutional neural network are utilized instead of hand-crafted features. In extensive experiments, the proposed tracker achieves competitive performance on 123 challenging UAV image sequences and outperforms 20 other popular state-of-the-art visual trackers in overall performance and across different challenging attributes. (An illustrative correlation-filter sketch follows this record.)
ISSN:2169-3536
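
To make the baseline concrete, the following is a minimal Python sketch of the standard single-channel correlation filter (MOSSE-style ridge regression in the Fourier domain) that the abstract cites as the starting point. It is not the paper's part-based background-aware tracker: the part decomposition, cropping matrices, Bayesian scale inference, Gaussian process regression update, and convolutional features are all omitted, and the function names here (gaussian_label, train_filter, detect) are illustrative only.

```python
# Minimal, illustrative single-channel correlation filter (MOSSE-style).
# This is a sketch of the standard CF baseline, not the paper's method.
import numpy as np

def gaussian_label(shape, sigma=2.0):
    """Desired Gaussian-shaped response centred on the target."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = h // 2, w // 2
    return np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2.0 * sigma ** 2))

def train_filter(patch, label, lam=1e-2):
    """Closed-form ridge regression in the Fourier domain:
    H = (Y * conj(X)) / (X * conj(X) + lambda)."""
    X = np.fft.fft2(patch)
    Y = np.fft.fft2(label)
    return (Y * np.conj(X)) / (X * np.conj(X) + lam)

def detect(filt, patch):
    """Apply the learned filter to a new patch; the peak of the
    response map gives the estimated translation."""
    X = np.fft.fft2(patch)
    response = np.real(np.fft.ifft2(filt * X))
    peak = np.unravel_index(np.argmax(response), response.shape)
    return response, peak

# Usage on synthetic data: train on a patch, detect on a shifted copy.
rng = np.random.default_rng(0)
patch = rng.standard_normal((64, 64))            # grayscale template patch
label = gaussian_label(patch.shape)              # ideal centred response
H = train_filter(patch, label)
shifted = np.roll(patch, (3, 5), axis=(0, 1))    # circularly shifted patch
_, peak = detect(H, shifted)
print("response peak at", peak)                  # offset from centre = shift
```

When run, the response peak on the shifted patch is offset from the centre by exactly the applied shift; that offset is the translation estimate a CF tracker would use, while the circular shifts themselves are the source of the boundary effects the abstract mentions.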