VPS-SLAM: Visual Planar Semantic SLAM for Aerial Robotic Systems

Indoor environments contain an abundance of high-level semantic information which can provide robots with a better understanding of the environment and reduce the uncertainty in their pose estimates. Although semantic information has proven useful, accurately perceiving, extracting, and utilizing it remains a challenge for the research community. To address these challenges, in this paper we present a lightweight, real-time visual semantic SLAM framework running on board aerial robotic platforms. This novel method combines low-level visual/visual-inertial odometry (VO/VIO) with geometric information corresponding to planar surfaces extracted from detected semantic objects. Extracting planar surfaces from selected semantic objects provides enhanced robustness, makes it possible to improve the metric estimates rapidly and precisely, and generalizes to several object instances irrespective of their shape and size. Our graph-based approach can integrate several state-of-the-art VO/VIO algorithms along with state-of-the-art object detectors in order to estimate the complete 6DoF pose of the robot while simultaneously creating a sparse semantic map of the environment. No prior knowledge of the objects is required, which is a significant advantage over other works. We test our approach on a standard RGB-D dataset, comparing its performance with state-of-the-art SLAM algorithms. We also perform several challenging indoor experiments validating our approach in the presence of distinct environmental conditions, and furthermore test it on board an aerial robot.

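The abstract above mentions that planar surfaces are extracted from detected semantic objects and used as landmarks. As a minimal illustrative sketch (not the authors' released implementation), one common way to obtain such a plane is to fit it with RANSAC to the 3-D points that fall inside an object detection; the function name fit_plane_ransac and the threshold values below are assumptions made purely for illustration.

# Minimal sketch, assuming the 3-D points back-projected from the depth pixels
# inside a detected object's bounding box are available as an (N, 3) array.
import numpy as np

def fit_plane_ransac(points, iters=200, inlier_thresh=0.02, rng=None):
    """Fit a plane n . p + d = 0 to (N, 3) points; returns (normal, d, inlier mask)."""
    rng = np.random.default_rng() if rng is None else rng
    best_inliers, best_plane = None, None
    for _ in range(iters):
        # Sample three points and take the normal of the triangle they span.
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:            # degenerate (collinear) sample, try again
            continue
        n = n / norm
        d = -n.dot(sample[0])
        dist = np.abs(points @ n + d)          # point-to-plane distances
        inliers = dist < inlier_thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (n, d)
    if best_plane is None:
        raise ValueError("all RANSAC samples were degenerate")
    return best_plane[0], best_plane[1], best_inliers

The resulting (normal, offset) pair is the kind of planar measurement that can then be attached to the pose graph as a landmark observation.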

Bibliographic Details
Main Authors: Hriday Bavle, Paloma De La Puente, Jonathan P. How, Pascual Campoy
Format: Article
Language: English
Published: IEEE 2020-01-01
Series: IEEE Access
Subjects: SLAM, visual SLAM, visual semantic SLAM, autonomous aerial robots, UAVs
Online Access: https://ieeexplore.ieee.org/document/9045978/
id doaj-2c6e676e5e674f04bb670adb3b2a1698
record_format Article
spelling doaj-2c6e676e5e674f04bb670adb3b2a1698
  record updated: 2021-03-30T01:30:01Z
  language: eng
  publisher: IEEE
  journal: IEEE Access, ISSN 2169-3536
  published: 2020-01-01, vol. 8, pp. 60704-60718
  doi: 10.1109/ACCESS.2020.2983121
  article number: 9045978
  title: VPS-SLAM: Visual Planar Semantic SLAM for Aerial Robotic Systems
  authors:
    Hriday Bavle (https://orcid.org/0000-0002-1732-0647), Centre for Automation and Robotics, Computer Vision and Aerial Robotics Group, Universidad Politécnica de Madrid (UPM-CSIC), Madrid, Spain
    Paloma De La Puente (https://orcid.org/0000-0002-8652-0300), Centre for Automation and Robotics, Computer Vision and Aerial Robotics Group, Universidad Politécnica de Madrid (UPM-CSIC), Madrid, Spain
    Jonathan P. How (https://orcid.org/0000-0001-8576-1930), Aerospace Controls Laboratory, Massachusetts Institute of Technology (MIT), Cambridge, MA, USA
    Pascual Campoy (https://orcid.org/0000-0002-9894-2009), Centre for Automation and Robotics, Computer Vision and Aerial Robotics Group, Universidad Politécnica de Madrid (UPM-CSIC), Madrid, Spain
  abstract: see the description field below
  video: https://vimeo.com/368217703
  released code: https://bitbucket.org/hridaybavle/semantic_slam.git
  online access: https://ieeexplore.ieee.org/document/9045978/
  keywords: SLAM; visual SLAM; visual semantic SLAM; autonomous aerial robots; UAVs
collection DOAJ
language English
format Article
sources DOAJ
author Hriday Bavle
Paloma De La Puente
Jonathan P. How
Pascual Campoy
spellingShingle Hriday Bavle
Paloma De La Puente
Jonathan P. How
Pascual Campoy
VPS-SLAM: Visual Planar Semantic SLAM for Aerial Robotic Systems
IEEE Access
SLAM
visual SLAM
visual semantic SLAM
autonomous aerial robots
UAVs
author_facet Hriday Bavle
Paloma De La Puente
Jonathan P. How
Pascual Campoy
author_sort Hriday Bavle
title VPS-SLAM: Visual Planar Semantic SLAM for Aerial Robotic Systems
title_short VPS-SLAM: Visual Planar Semantic SLAM for Aerial Robotic Systems
title_full VPS-SLAM: Visual Planar Semantic SLAM for Aerial Robotic Systems
title_fullStr VPS-SLAM: Visual Planar Semantic SLAM for Aerial Robotic Systems
title_full_unstemmed VPS-SLAM: Visual Planar Semantic SLAM for Aerial Robotic Systems
title_sort vps-slam: visual planar semantic slam for aerial robotic systems
publisher IEEE
series IEEE Access
issn 2169-3536
publishDate 2020-01-01
description Indoor environments contain an abundance of high-level semantic information which can provide robots with a better understanding of the environment and reduce the uncertainty in their pose estimates. Although semantic information has proven useful, accurately perceiving, extracting, and utilizing it remains a challenge for the research community. To address these challenges, in this paper we present a lightweight, real-time visual semantic SLAM framework running on board aerial robotic platforms. This novel method combines low-level visual/visual-inertial odometry (VO/VIO) with geometric information corresponding to planar surfaces extracted from detected semantic objects. Extracting planar surfaces from selected semantic objects provides enhanced robustness, makes it possible to improve the metric estimates rapidly and precisely, and generalizes to several object instances irrespective of their shape and size. Our graph-based approach can integrate several state-of-the-art VO/VIO algorithms along with state-of-the-art object detectors in order to estimate the complete 6DoF pose of the robot while simultaneously creating a sparse semantic map of the environment. No prior knowledge of the objects is required, which is a significant advantage over other works. We test our approach on a standard RGB-D dataset, comparing its performance with state-of-the-art SLAM algorithms. We also perform several challenging indoor experiments validating our approach in the presence of distinct environmental conditions, and furthermore test it on board an aerial robot. Video: https://vimeo.com/368217703 Released code: https://bitbucket.org/hridaybavle/semantic_slam.git
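As a rough illustration of how a graph-based back end of the kind described above can tie VO/VIO odometry to planar landmarks, the sketch below writes out the two residuals a generic least-squares pose-graph optimizer would minimize: a relative-pose error for each VO/VIO edge and a plane re-projection error for each detected object matched to a plane landmark. This is a hedged sketch under simplifying assumptions, not the code released at the Bitbucket link above; the function names, the (n, d) plane convention, and the use of SciPy are assumptions made here.

# Minimal sketch: poses are 4x4 homogeneous transforms T_wb (body to world),
# VO/VIO supplies relative-pose measurements between consecutive poses, and each
# detected object contributes a plane (n, d), with n . p + d = 0, observed in the
# body frame and matched to a plane landmark in the world frame. A graph optimizer
# (e.g. g2o or GTSAM) would stack and minimize these residuals.
import numpy as np
from scipy.spatial.transform import Rotation

def odometry_residual(T_i, T_j, T_ij_meas):
    """Error between the measured VO/VIO relative pose and the one implied by the graph."""
    T_err = np.linalg.inv(T_ij_meas) @ (np.linalg.inv(T_i) @ T_j)
    rot_err = Rotation.from_matrix(T_err[:3, :3]).as_rotvec()   # axis-angle rotation error
    trans_err = T_err[:3, 3]
    return np.concatenate([rot_err, trans_err])                  # 6-vector

def plane_residual(T_wb, plane_world, plane_meas):
    """Error between a world plane landmark re-projected into the body frame
    and the plane measured from the detected object, both as (n, d)."""
    n_w, d_w = plane_world
    R, t = T_wb[:3, :3], T_wb[:3, 3]
    n_b = R.T @ n_w                     # landmark normal expressed in the body frame
    d_b = n_w @ t + d_w                 # landmark offset seen from the body frame
    n_m, d_m = plane_meas
    return np.concatenate([n_b - n_m, [d_b - d_m]])              # 4-vector

In this formulation the 6DoF robot poses and the plane landmark parameters are the variables being optimized, which matches the abstract's description of jointly estimating the robot pose and a sparse semantic map; the exact parameterization used by the released code may differ.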
topic SLAM
visual SLAM
visual semantic SLAM
autonomous aerial robots
UAVs
url https://ieeexplore.ieee.org/document/9045978/
work_keys_str_mv AT hridaybavle vpsslamvisualplanarsemanticslamforaerialroboticsystems
AT palomadelapuente vpsslamvisualplanarsemanticslamforaerialroboticsystems
AT jonathanphow vpsslamvisualplanarsemanticslamforaerialroboticsystems
AT pascualcampoy vpsslamvisualplanarsemanticslamforaerialroboticsystems
_version_ 1724186878029070336