Accurate Initial State Estimation in a Monocular Visual–Inertial SLAM System

The fusion of monocular visual and inertial cues has become popular in robotics, unmanned vehicles and augmented reality. Recent results have shown that optimization-based fusion strategies outperform filtering strategies. Robust state estimation is the core capability of optimization-based visual–inertial Simultaneous Localization and Mapping (SLAM) systems. Because visual–inertial systems are highly nonlinear, their performance depends heavily on the accuracy of the initial values (visual scale, gravity, velocity and Inertial Measurement Unit (IMU) biases). This paper therefore proposes a more accurate initial state estimation method. Using the known gravity magnitude, we refine the estimated gravity vector by optimizing a two-dimensional (2D) error state on its tangent space, and then estimate the accelerometer bias separately, since it is difficult to distinguish from gravity under small rotation. Additionally, we propose an automatic termination criterion to determine when initialization has succeeded. Once the initial state estimation converges, the estimated values are used to launch the nonlinear, tightly coupled visual–inertial SLAM system. We tested our approaches on the public EuRoC dataset. Experimental results show that the proposed methods achieve good initial state estimation, the gravity refinement approach efficiently speeds up the convergence of the estimated gravity vector, and the termination criterion performs well.
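The gravity refinement described in the abstract keeps the gravity magnitude fixed and optimizes only a two-dimensional error state on the tangent space of the current estimate. The sketch below (Python/NumPy, for illustration only) shows the general shape of that parameterization; it is not the authors' implementation, and the helper `solve_2d_correction`, the iteration count, and the 9.81 m/s² magnitude are assumptions rather than details taken from the paper.

```python
import numpy as np

GRAVITY_MAGNITUDE = 9.81  # assumed value of the known gravity magnitude |g|

def tangent_basis(g_unit):
    """Return two unit vectors b1, b2 spanning the plane orthogonal to g_unit."""
    helper = np.array([1.0, 0.0, 0.0])
    if abs(g_unit @ helper) > 0.9:            # avoid a helper axis nearly parallel to g
        helper = np.array([0.0, 0.0, 1.0])
    b1 = np.cross(g_unit, helper)
    b1 /= np.linalg.norm(b1)
    b2 = np.cross(g_unit, b1)
    return b1, b2

def refine_gravity(g_est, solve_2d_correction, iterations=4):
    """Refine a gravity estimate while keeping its known magnitude fixed.

    solve_2d_correction(g) is a hypothetical placeholder for the caller's
    least-squares step: it should return the 2D correction (w1, w2) that best
    explains the visual-inertial alignment residuals for the current guess g.
    """
    g = GRAVITY_MAGNITUDE * np.asarray(g_est, dtype=float) / np.linalg.norm(g_est)
    for _ in range(iterations):
        g_unit = g / np.linalg.norm(g)
        b1, b2 = tangent_basis(g_unit)        # basis of the tangent space at g
        w1, w2 = solve_2d_correction(g)       # 2D error state on that tangent space
        g = g + w1 * b1 + w2 * b2             # perturb only within the tangent plane
        g = GRAVITY_MAGNITUDE * g / np.linalg.norm(g)  # re-impose the known |g|
    return g
```

Because the correction is parameterized in the tangent plane, each update changes only the direction of the gravity vector, so the known-magnitude constraint is never violated during refinement.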

Bibliographic Details
Main Authors: Xufu Mu, Jing Chen, Zixiang Zhou, Zhen Leng, Lei Fan (School of Optics and Photonics, Beijing Institute of Technology, Beijing 100081, China)
Format: Article
Language: English
Published: MDPI AG, 2018-02-01
Series: Sensors
ISSN: 1424-8220
DOI: 10.3390/s18020506
Subjects: visual–inertial SLAM, initial state estimation, termination criterion
Online Access: http://www.mdpi.com/1424-8220/18/2/506