Underwater Target Tracking Using Forward-Looking Sonar for Autonomous Underwater Vehicles

Bibliographic Details
Main Authors: Tiedong Zhang, Shuwei Liu, Xiao He, Hai Huang, Kangda Hao
Format: Article
Language: English
Published: MDPI AG 2019-12-01
Series: Sensors
Subjects:
AUV
Online Access: https://www.mdpi.com/1424-8220/20/1/102
Description
Summary: When autonomous underwater vehicles (AUVs) carry out tasks, the positions of moving underwater targets must be estimated reliably. While cameras offer only low-precision visibility within a limited field of view, forward-looking sonar remains an attractive method for underwater sensing and is especially effective for long-range tracking. This paper describes an online processing framework based on forward-looking-sonar (FLS) images and presents a novel tracking approach based on a Gaussian particle filter (GPF) to achieve persistent multiple-target tracking in cluttered environments. First, accounting for the characteristics of acoustic-vision images, median filtering and region-growing segmentation are modified to improve image-processing results. Second, a generalized regression neural network is adopted to evaluate multiple features of target regions, and a representation of feature subsets is created to improve tracking performance. An adaptive fusion strategy then integrates these feature cues into the observation model, and the complete GPF-based underwater-target-tracking procedure is presented. Results obtained on a real acoustic-vision AUV platform during sea trials are shown and discussed; they demonstrate that the proposed method is feasible and effective for tracking targets in complex underwater environments.
ISSN: 1424-8220
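
To make the tracking step in the summary concrete, the Python sketch below illustrates one predict/update cycle of a generic Gaussian particle filter in sonar image coordinates. It is a minimal illustration under stated assumptions, not the authors' implementation: the constant-velocity motion model, the centroid-only measurement, the noise levels, and the function name gpf_step are all illustrative choices, and the paper's adaptive multi-feature fusion is reduced here to a single Gaussian likelihood.

import numpy as np

rng = np.random.default_rng(0)

def gpf_step(mean, cov, z, n_particles=500,
             q=1.0,    # assumed process-noise std (pixels); not from the paper
             r=3.0):   # assumed measurement-noise std (pixels); not from the paper
    """One GPF predict/update cycle; state = [x, y, vx, vy]."""
    dt = 1.0
    # Constant-velocity motion model (an assumption for this sketch).
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    # Measurement picks out the target centroid position only.
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)

    # Predict: propagate the Gaussian posterior and draw particles from it.
    pred_mean = F @ mean
    pred_cov = F @ cov @ F.T + (q ** 2) * np.eye(4)
    particles = rng.multivariate_normal(pred_mean, pred_cov, n_particles)

    # Update: weight particles by an isotropic Gaussian likelihood around
    # the detected centroid (the paper fuses several feature cues instead).
    residual = z - particles @ H.T
    log_w = -0.5 * np.sum(residual ** 2, axis=1) / r ** 2
    w = np.exp(log_w - log_w.max())
    w /= w.sum()

    # GPF step proper: refit a single Gaussian to the weighted particles
    # rather than resampling them.
    new_mean = w @ particles
    centered = particles - new_mean
    new_cov = (centered * w[:, None]).T @ centered
    return new_mean, new_cov

# Hypothetical usage: one call per sonar frame, with z taken from the
# segmented target region's centroid.
mean = np.array([100.0, 50.0, 0.0, 0.0])
cov = np.diag([10.0, 10.0, 4.0, 4.0])
z = np.array([103.0, 52.0])
mean, cov = gpf_step(mean, cov, z)

Refitting a Gaussian each cycle instead of resampling is the defining feature of the GPF and is what keeps the filter lightweight enough for online processing; everything else above (models, parameters, helper names) should be read as placeholder assumptions.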