RGB–D terrain perception and dense mapping for legged robots

This paper addresses the issues of unstructured terrain modeling for the purpose of navigation with legged robots. We present an improved elevation grid concept adapted to the specific requirements of a small legged robot with limited perceptual capabilities. We propose an extension of the elevation grid update mechanism by incorporating a formal treatment of the spatial uncertainty. Moreover, this paper presents uncertainty models for a structured light RGB-D sensor and a stereo vision camera used to produce a dense depth map. The model for the uncertainty of the stereo vision camera is based on uncertainty propagation from calibration, through undistortion and rectification algorithms, allowing calculation of the uncertainty of measured 3D point coordinates. The proposed uncertainty models were used for the construction of a terrain elevation map using the Videre Design STOC stereo vision camera and Kinect-like range sensors. We provide experimental verification of the proposed mapping method, and a comparison with another recently published terrain mapping method for walking robots.
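
To make the abstract's two key ideas concrete, the sketch below illustrates (i) first-order propagation of disparity noise to depth for a rectified stereo camera (z = f*b/d) and (ii) a one-dimensional Kalman-style fusion of a height measurement into an elevation-grid cell. This is a minimal Python illustration under standard textbook assumptions, not the uncertainty model or update rule published in the paper; all function names and numeric values (focal length, baseline, disparity noise) are hypothetical.

def stereo_depth_uncertainty(d, f, b, sigma_d):
    # Rectified stereo model: depth z = f * b / d (hypothetical parameters).
    # First-order propagation: dz/dd = -f*b/d**2, so sigma_z = (z**2 / (f*b)) * sigma_d.
    z = f * b / d
    sigma_z = (z ** 2 / (f * b)) * sigma_d
    return z, sigma_z

def fuse_cell(h_cell, var_cell, h_meas, var_meas):
    # One-dimensional Kalman-style fusion of a height measurement into a grid cell
    # (a generic update, not necessarily the paper's exact mechanism).
    k = var_cell / (var_cell + var_meas)      # gain: trust the measurement more when cell variance is large
    h_new = h_cell + k * (h_meas - h_cell)    # updated height estimate
    var_new = (1.0 - k) * var_cell            # variance shrinks after fusion
    return h_new, var_new

# Hypothetical usage: focal length 570 px, baseline 0.09 m, 0.5 px disparity noise.
z, sigma_z = stereo_depth_uncertainty(d=30.0, f=570.0, b=0.09, sigma_d=0.5)
h, var = fuse_cell(h_cell=0.10, var_cell=0.02, h_meas=0.12, var_meas=sigma_z ** 2)
print(f"z = {z:.3f} m, sigma_z = {sigma_z:.3f} m, fused cell height = {h:.3f} m")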

Bibliographic Details
Main Authors: Belter Dominik, Łabecki Przemysław, Fankhauser Péter, Siegwart Roland
Format: Article
Language: English
Published: Sciendo, 2016-03-01
Series:International Journal of Applied Mathematics and Computer Science
Subjects: RGB-D perception, elevation mapping, uncertainty, legged robots
Online Access:https://doi.org/10.1515/amcs-2016-0006
ISSN: 2083-8492
Volume/Issue: Vol. 26, No. 1 (2016), pp. 81-97
Record ID: doaj-a671491be8024ece8bb7e5a3933c7c0f
Author Affiliations:
Belter Dominik, Łabecki Przemysław: Institute of Control and Information Engineering, Poznań University of Technology, ul. Piotrowo 3A, 60-965 Poznań, Poland
Fankhauser Péter, Siegwart Roland: Autonomous Systems Lab, ETH Zurich, LEE J 201, Leonhardstrasse 21, 8092 Zurich, Switzerland