Sparse-to-Dense: Depth Prediction from Sparse Depth Samples and a Single Image

Bibliographic Details
Main Authors: Ma, Fangchang (Author), Karaman, Sertac (Author)
Other Authors: Massachusetts Institute of Technology. Laboratory for Information and Decision Systems (Contributor), Massachusetts Institute of Technology. Department of Aeronautics and Astronautics (Contributor)
Format: Article
Language: English
Published: Institute of Electrical and Electronics Engineers (IEEE), 2018.
Description
Summary: © 2018 IEEE. We consider the problem of dense depth prediction from a sparse set of depth measurements and a single RGB image. Since depth estimation from monocular images alone is inherently ambiguous and unreliable, we introduce additional sparse depth samples, acquired either with a low-resolution depth sensor or computed via visual Simultaneous Localization and Mapping (SLAM) algorithms, to attain a higher level of robustness and accuracy. We propose the use of a single deep regression network to learn directly from the raw RGB-D data, and explore the impact of the number of depth samples on prediction accuracy. Our experiments show that, compared to using only RGB images, the addition of 100 spatially random depth samples reduces the prediction root-mean-square error by 50% on the NYU-Depth-v2 indoor dataset. It also boosts the percentage of reliable predictions from 59% to 92% on the KITTI dataset. We demonstrate two applications of the proposed algorithm: a plug-in module in SLAM to convert sparse maps to dense maps, and super-resolution for LiDARs. Software (https://github.com/fangchangma/sparse-to-dense) and a video demonstration (https://www.youtube.com/watch?v=vNIIT-M7x7Y) are publicly available.
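
For illustration, the following minimal Python/numpy sketch shows the kind of input construction the summary describes: drawing spatially random depth samples from a dense depth map, stacking them with the RGB image into a 4-channel RGB-D input, and evaluating root-mean-square error over valid pixels. The function names, shapes, and sampling details here are assumptions for illustration, not the authors' released implementation (see the linked repository for that).

import numpy as np

def sample_sparse_depth(depth, num_samples=100, rng=None):
    # Draw spatially random samples from a dense depth map; return a map of the
    # same shape that is zero everywhere except at the sampled valid pixels.
    rng = np.random.default_rng() if rng is None else rng
    sparse = np.zeros_like(depth)
    valid = np.argwhere(depth > 0)  # only sample pixels with valid depth
    idx = rng.choice(len(valid), size=min(num_samples, len(valid)), replace=False)
    rows, cols = valid[idx, 0], valid[idx, 1]
    sparse[rows, cols] = depth[rows, cols]
    return sparse

def rmse(pred, target):
    # Root-mean-square error computed over pixels with valid ground-truth depth.
    mask = target > 0
    return np.sqrt(np.mean((pred[mask] - target[mask]) ** 2))

# Example: form the 4-channel RGB-D input from an RGB image and sparse samples.
rgb = np.random.rand(480, 640, 3).astype(np.float32)       # stand-in RGB image
depth = np.random.rand(480, 640).astype(np.float32) * 10.0  # stand-in dense depth (meters)
sparse_depth = sample_sparse_depth(depth, num_samples=100)
rgbd_input = np.concatenate([rgb, sparse_depth[..., None]], axis=-1)  # H x W x 4

In this sketch, the 4-channel rgbd_input would be fed to a single deep regression network that predicts a dense depth map, and rmse would compare that prediction against ground-truth depth.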