Monocular Visual-Inertial SLAM: Continuous Preintegration and Reliable Initialization

In this paper, we propose a new visual-inertial Simultaneous Localization and Mapping (SLAM) algorithm. With the tightly coupled sensor fusion of a global-shutter monocular camera and a low-cost Inertial Measurement Unit (IMU), the algorithm achieves robust, real-time estimates of the sensor poses in unknown environments. To address the real-time visual-inertial fusion problem, we present a parallel framework with a novel IMU initialization method. Our algorithm also benefits from a novel IMU factor, a continuous preintegration method, a vision factor based on directional error, a separability trick, and a robust initialization criterion, which together yield reliable estimates in real time on a modern Central Processing Unit (CPU). Extensive experiments validate the proposed algorithm and show that it is comparable to state-of-the-art methods.
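The abstract's core building block, IMU preintegration, summarizes all inertial samples between two keyframes into a single relative-motion constraint, so the back-end optimizer does not re-integrate raw measurements at every iteration. The sketch below shows only the standard discrete form of this idea as a point of reference; it is not the continuous preintegration or the specific IMU factor proposed in the paper, and all function names, the Euler integration scheme, and the bias handling are illustrative assumptions.

# Minimal sketch of standard (discrete) IMU preintegration between two
# keyframes. This is a generic illustration, not the paper's continuous
# preintegration; noise/covariance propagation is omitted.
import numpy as np


def skew(w):
    """Skew-symmetric matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])


def exp_so3(w):
    """Exponential map from a rotation vector to a rotation matrix."""
    theta = np.linalg.norm(w)
    if theta < 1e-8:
        return np.eye(3) + skew(w)
    a = w / theta
    K = skew(a)
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)


def preintegrate(imu_samples, gyro_bias, accel_bias):
    """Accumulate relative rotation, velocity and position increments
    (dR, dv, dp) from raw IMU samples. The result is independent of the
    global pose and of gravity, so it can be reused as a single factor
    between two keyframes during optimization.

    imu_samples: iterable of (gyro[3], accel[3], dt) tuples.
    """
    dR = np.eye(3)     # relative rotation increment
    dv = np.zeros(3)   # relative velocity increment
    dp = np.zeros(3)   # relative position increment
    for gyro, accel, dt in imu_samples:
        w = np.asarray(gyro) - gyro_bias    # bias-corrected angular rate
        a = np.asarray(accel) - accel_bias  # bias-corrected acceleration
        dp += dv * dt + 0.5 * (dR @ a) * dt * dt
        dv += (dR @ a) * dt
        dR = dR @ exp_so3(w * dt)           # integrate rotation last
    return dR, dv, dp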

Bibliographic Details
Main Authors: Yi Liu, Zhong Chen, Wenjuan Zheng, Hao Wang, Jianguo Liu
Format: Article
Language: English
Published: MDPI AG, 2017-11-01
Series: Sensors
ISSN: 1424-8220
DOI: 10.3390/s17112613
Subjects: sensor fusion; SLAM; computer vision; inertial navigation; tightly coupled
Source: DOAJ
Online Access: https://www.mdpi.com/1424-8220/17/11/2613
Author Affiliations:
Yi Liu: National Key Laboratory of Science and Technology on Multi-Spectral Information Processing, School of Automation, Huazhong University of Science and Technology, Wuhan 430074, China
Zhong Chen: National Key Laboratory of Science and Technology on Multi-Spectral Information Processing, School of Automation, Huazhong University of Science and Technology, Wuhan 430074, China
Wenjuan Zheng: Beijing Aerospace Automatic Control Institute, Beijing 100854, China
Hao Wang: Beijing Aerospace Automatic Control Institute, Beijing 100854, China
Jianguo Liu: National Key Laboratory of Science and Technology on Multi-Spectral Information Processing, School of Automation, Huazhong University of Science and Technology, Wuhan 430074, China