Real-Time Tracking of Guidewire Robot Tips Using Deep Convolutional Neural Networks on Successive Localized Frames
Studies are underway to stabilize cardiac surgery using thin micro-guidewires and catheter robots. To control such a robot to a desired position and pose, the robot tip must be tracked accurately in real time, but tracking and precisely delineating the thin, small tip is challenging. To address this problem, this paper proposes a novel image analysis-based tracking method using deep convolutional neural networks (CNNs). The proposed tracker consists of two parts: (1) a detection network that roughly detects the tip position and (2) a segmentation network that accurately delineates the tip near that position. To learn a robust real-time tracker, we extract small image patches containing the tip in successive frames and then learn informative spatial and motion features for the segmentation network. During inference, the tip bounding box is first estimated in the initial frame via the detection network; thereafter, tip delineation is performed consecutively through the segmentation network in the following frames. The proposed method delineates the tip accurately in real time and automatically restarts tracking via the detection network when tracking fails in challenging frames. Experimental results show that the proposed method achieves better tracking accuracy than existing methods, at a real-time speed of 19 ms.
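The abstract outlines a two-stage pipeline: a detection network roughly localizes the tip, and a segmentation network then delineates it within a small patch around the previous position in each subsequent frame, with re-detection whenever tracking fails. The Python sketch below illustrates one way such a loop could be organized; the patch size, the failure threshold, and the detect_tip/segment_patch callables are illustrative assumptions standing in for the paper's trained CNNs, not details taken from this record.

```python
# A minimal sketch of the detection-then-segmentation tracking loop described
# in the abstract. detect_tip and segment_patch are hypothetical placeholders
# for the trained detection and segmentation networks.
import numpy as np

PATCH = 64          # assumed square patch size around the tip (not given in the record)
MIN_FG_PIXELS = 10  # assumed failure threshold: too few foreground pixels in the mask

def crop_patch(frame, center, size=PATCH):
    """Crop a square patch around `center`, clamped to the frame borders.

    Assumes the frame is at least `size` pixels in each dimension.
    """
    h, w = frame.shape[:2]
    cy, cx = center
    y0 = int(np.clip(cy - size // 2, 0, h - size))
    x0 = int(np.clip(cx - size // 2, 0, w - size))
    return frame[y0:y0 + size, x0:x0 + size], (y0, x0)

def track(frames, detect_tip, segment_patch):
    """Track the guidewire tip over a sequence of frames.

    detect_tip(frame)    -> (row, col) rough tip position   [detection network]
    segment_patch(patch) -> binary tip mask for the patch   [segmentation network]
    Both callables are assumptions, not the authors' actual models.
    """
    center = detect_tip(frames[0])            # rough localization in the initial frame
    results = []
    for frame in frames:
        patch, (y0, x0) = crop_patch(frame, center)
        mask = segment_patch(patch)           # fine delineation near the last position
        ys, xs = np.nonzero(mask)
        if len(ys) < MIN_FG_PIXELS:           # tracking failure: restart via detection
            center = detect_tip(frame)
            patch, (y0, x0) = crop_patch(frame, center)
            mask = segment_patch(patch)
            ys, xs = np.nonzero(mask)
        if len(ys) > 0:
            # update the tip position from the mask centroid, in full-frame coordinates
            center = (y0 + ys.mean(), x0 + xs.mean())
        results.append((mask, (y0, x0)))
    return results
```

With dummy callables (for example, a fixed-position detector and a simple thresholding "segmenter"), the loop runs end to end, which makes the hand-off between detection, patch cropping, and re-detection easy to test before plugging in real models.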
Main Authors: Ihsan Ullah, Philip Chikontwe, Sang Hyun Park
Format: Article
Language: English
Published: IEEE, 2019-01-01
Series: IEEE Access
Subjects: Convolutional neural networks; micro-robot tracking; guidewire tracking; patch-wise segmentation
Online Access: https://ieeexplore.ieee.org/document/8886572/
id: doaj-2118b37075ae42e9a9804d2197ab16bc
record_format: Article
spelling: Ihsan Ullah (ORCID: 0000-0002-6314-7769), Philip Chikontwe, and Sang Hyun Park (ORCID: 0000-0001-7476-1046), "Real-Time Tracking of Guidewire Robot Tips Using Deep Convolutional Neural Networks on Successive Localized Frames," IEEE Access, vol. 7, pp. 159743-159753, 2019-01-01, doi: 10.1109/ACCESS.2019.2950263, IEEE article no. 8886572, ISSN 2169-3536. All three authors: Department of Robotics Engineering, Daegu Gyeongbuk Institute of Science and Technology, Daegu, South Korea. Record timestamp: 2021-03-30T00:42:46Z.
collection: DOAJ
language: English
format: Article
sources: DOAJ
author: Ihsan Ullah; Philip Chikontwe; Sang Hyun Park
title: Real-Time Tracking of Guidewire Robot Tips Using Deep Convolutional Neural Networks on Successive Localized Frames
publisher: IEEE
series: IEEE Access
issn: 2169-3536
publishDate: 2019-01-01
topic: Convolutional neural networks; micro-robot tracking; guidewire tracking; patch-wise segmentation
url: https://ieeexplore.ieee.org/document/8886572/