Ego-motion Estimation Based on RGB-D Camera and Inertial Sensor
Main Authors:
Other Authors:
Format: Others
Language: en_US
Published: 2015
Online Access: http://ndltd.ncl.edu.tw/handle/01115649228044152260
Summary: Master's thesis === National Taiwan University === Graduate Institute of Networking and Multimedia === 103 === Ego-motion estimation has a wide variety of applications in robot control and automation. Accurate local estimation of ego-motion helps an autonomous robot recognize its surrounding environment and recover the trajectory it has traversed. In this thesis, we present a system that estimates ego-motion by fusing key-frame-based visual odometry with inertial measurements. The hardware of the system includes an RGB-D camera for capturing color and depth images and an Inertial Measurement Unit (IMU) for acquiring inertial measurements.
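Because the camera provides registered color and depth images, matched image features can be lifted to 3D points in the camera frame with the standard pinhole model before any motion is estimated. The following is a minimal sketch of that back-projection step; the intrinsic parameters and depth scale are placeholders, not the calibration used in the thesis:

```python
import numpy as np

# Assumed pinhole intrinsics (placeholders, not the thesis's calibration).
FX, FY = 525.0, 525.0    # focal lengths in pixels
CX, CY = 319.5, 239.5    # principal point in pixels
DEPTH_SCALE = 0.001      # raw depth units -> meters (sensor dependent)

def back_project(pixels, depth_image):
    """Back-project (u, v) pixel coordinates into 3D camera-frame points.

    pixels:      (N, 2) integer array of pixel coordinates (u, v)
    depth_image: (H, W) array of raw depth readings
    returns:     (N, 3) array of 3D points; rows with zero depth are invalid
    """
    u = pixels[:, 0].astype(float)
    v = pixels[:, 1].astype(float)
    z = depth_image[pixels[:, 1], pixels[:, 0]] * DEPTH_SCALE
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.stack([x, y, z], axis=1)
```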
The motion of the camera between two consecutive images is estimated by finding correspondences of visual features. Rigidity constraints are used to efficiently remove outliers from the set of initial correspondences. Moreover, we apply random sample consensus (RANSAC) to handle the effect of the remaining outliers in the motion estimation step. Together, these strategies ensure that the correspondences that take part in motion estimation consist almost entirely of inliers.
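The rigidity constraint exploits the fact that a rigid motion preserves pairwise distances between 3D points, so a match whose distances to other matched points change between frames is likely an outlier; RANSAC then fits a rigid transform to small random subsets and keeps the largest consensus set. Below is a minimal sketch of both ideas, assuming 3D point pairs obtained from the RGB-D data; the thresholds and the SVD-based (Kabsch) rigid solver are illustrative choices, not necessarily those used in the thesis:

```python
import numpy as np

def rigidity_filter(P, Q, tol=0.02):
    """Keep matches whose pairwise 3D distances are preserved between frames.

    P, Q: (N, 3) corresponding points in the previous and current frame.
    A match is kept if its distance to most other matches changes by < tol.
    """
    dP = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=2)
    dQ = np.linalg.norm(Q[:, None, :] - Q[None, :, :], axis=2)
    consistent = np.abs(dP - dQ) < tol        # (N, N) pairwise consistency
    score = consistent.sum(axis=1)            # how many partners agree
    keep = score >= 0.5 * len(P)              # majority vote (heuristic)
    return P[keep], Q[keep]

def kabsch(P, Q):
    """Least-squares rigid transform (R, t) with Q ~ R @ P + t, via SVD."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

def ransac_rigid(P, Q, iters=200, inlier_thresh=0.03, rng=None):
    """RANSAC over minimal 3-point samples; returns (R, t, inlier mask)."""
    if rng is None:
        rng = np.random.default_rng(0)
    best_R, best_t = None, None
    best_inliers = np.zeros(len(P), dtype=bool)
    for _ in range(iters):
        idx = rng.choice(len(P), size=3, replace=False)
        R, t = kabsch(P[idx], Q[idx])
        residuals = np.linalg.norm((P @ R.T + t) - Q, axis=1)
        inliers = residuals < inlier_thresh
        if inliers.sum() > best_inliers.sum():
            best_R, best_t, best_inliers = R, t, inliers
    # Refit on the full consensus set for the final estimate.
    if best_inliers.sum() >= 3:
        best_R, best_t = kabsch(P[best_inliers], Q[best_inliers])
    return best_R, best_t, best_inliers
```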
Several experiments with different kinds of camera movements are performed to demonstrate the robustness and accuracy of the ego-motion estimation algorithm, as well as the ability of our system to handle real-scene data correctly.