Fast Visual Tracking With Robustifying Kernelized Correlation Filters


Bibliographic Details
Main Authors: Qianbo Liu, Guoqing Hu, Md Mojahidul Islam
Format: Article
Language: English
Published: IEEE 2018-01-01
Series: IEEE Access
Subjects:
Online Access:https://ieeexplore.ieee.org/document/8423606/
Description
Summary: Robust visual tracking is a challenging task because the target object undergoes appearance variations over time. Tracking algorithms based on correlation filters have recently attracted much attention because of their high efficiency and computational speed. However, these algorithms can easily drift due to noisy updates. Moreover, they cannot recover and re-track the target after a tracker failure caused by heavy occlusion or by the target moving out of view. In this paper, we propose a robust correlation filter that is constructed by considering all the target appearances extracted from the initial image up to the current image. The numerator and denominator of the filter model are updated separately, instead of linearly interpolating a single stored model. Strategies such as reducing feature dimensionality and interpolating correlation scores are investigated to reduce computational cost for fast tracking. Occlusion and fast-motion problems are effectively handled by expanding the search area. In addition, the model is updated only when a confidence metric (the peak-to-sidelobe ratio) exceeds a threshold. Comprehensive experiments were conducted on object-tracking data sets, and the results showed that our method performs well compared with other competitive methods. Moreover, it runs on a single central processing unit at 69.5 frames per second, which is suitable for real-time applications.
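The two update mechanisms described in the summary can be illustrated with a short sketch. The function names, the exclusion window size, and the learning rate below are illustrative assumptions, not values taken from the paper; the sketch only shows the general form of a peak-to-sidelobe-ratio (PSR) confidence check and of updating a correlation filter's numerator and denominator separately.

```python
import numpy as np

def peak_to_sidelobe_ratio(response, exclude=5):
    """PSR of a correlation response map.

    The sidelobe is the response with a small window around the peak
    excluded; a low PSR typically indicates occlusion or drift, so the
    tracker can skip the model update in that frame.
    (exclude=5 is an assumed half-width, not the paper's setting.)
    """
    peak = response.max()
    py, px = np.unravel_index(response.argmax(), response.shape)
    mask = np.ones_like(response, dtype=bool)
    mask[max(0, py - exclude):py + exclude + 1,
         max(0, px - exclude):px + exclude + 1] = False
    sidelobe = response[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-8)

def update_filter(num, den, new_num, new_den, eta=0.02):
    """Interpolate the filter's numerator and denominator separately.

    Updating the two parts independently (rather than interpolating the
    final filter num/den) keeps contributions from all past appearances.
    eta is an assumed learning rate.
    """
    num = (1.0 - eta) * num + eta * new_num
    den = (1.0 - eta) * den + eta * new_den
    return num, den
```

A tracker would then apply `update_filter` only when `peak_to_sidelobe_ratio(response)` exceeds a chosen threshold, which is how the confidence-gated update in the summary can be realized.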
ISSN:2169-3536