Maximum-a-Posterior Stereo System Based on Sparse Gradient Point
Master's === National Cheng Kung University === Department of Computer Science and Information Engineering === 101 === With the rapid advancement of computer science and the recent rise of the Kinect, disparity-map applications have become a popular research topic in the past few years. Today, the prices of imaging equipment keep falling, and compared with the more expensive Kinect, stereo systems, which acquire depth information from two cameras, have drawn increasing attention. This thesis presents a fast stereo system that uses the information at sparse gradient points to find all possible matching candidates in the other image. The disparity map is then used to evaluate the probability of each candidate being the true corresponding point. Lastly, the depth information of neighboring points is used to remove noise and smooth the result. For the experiments, we use the left and right images and the ground truth provided by the Middlebury Stereo Datasets: the image pairs are processed with the system described above, and the result is compared with the ground truth to measure precision.
Main Authors: | Hua-Hsuan Hsu, 徐華煊 |
---|---|
Other Authors: | Jenn-Jier Lien, 連震杰 |
Format: | Others |
Language: | en_US |
Published: | 2013 |
Online Access: | http://ndltd.ncl.edu.tw/handle/37968400904719008104 |
id | ndltd-TW-101NCKU5392067 |
---|---|
record_format | oai_dc |
spelling | ndltd-TW-101NCKU53920672015-10-13T22:51:44Z http://ndltd.ncl.edu.tw/handle/37968400904719008104 Maximum-a-Posterior Stereo System Based on Sparse Gradient Point 建立在稀疏梯度點的最大化事後機率之立體匹配系統 Hua-HsuanHsu 徐華煊 Master's, National Cheng Kung University, Department of Computer Science and Information Engineering, 101. With the rapid advancement of computer science and the recent rise of the Kinect, disparity-map applications have become a popular research topic in the past few years. Today, the prices of imaging equipment keep falling, and compared with the more expensive Kinect, stereo systems, which acquire depth information from two cameras, have drawn increasing attention. This thesis presents a fast stereo system that uses the information at sparse gradient points to find all possible matching candidates in the other image. The disparity map is then used to evaluate the probability of each candidate being the true corresponding point. Lastly, the depth information of neighboring points is used to remove noise and smooth the result. For the experiments, we use the left and right images and the ground truth provided by the Middlebury Stereo Datasets: the image pairs are processed with the system described above, and the result is compared with the ground truth to measure precision. Jenn-Jier Lien 連震杰 2013 thesis 57 en_US |
collection | NDLTD |
language | en_US |
format | Others |
sources | NDLTD |
description | Master's === National Cheng Kung University === Department of Computer Science and Information Engineering === 101 === With the rapid advancement of computer science and the recent rise of the Kinect, disparity-map applications have become a popular research topic in the past few years. Today, the prices of imaging equipment keep falling, and compared with the more expensive Kinect, stereo systems, which acquire depth information from two cameras, have drawn increasing attention. This thesis presents a fast stereo system that uses the information at sparse gradient points to find all possible matching candidates in the other image. The disparity map is then used to evaluate the probability of each candidate being the true corresponding point. Lastly, the depth information of neighboring points is used to remove noise and smooth the result. For the experiments, we use the left and right images and the ground truth provided by the Middlebury Stereo Datasets: the image pairs are processed with the system described above, and the result is compared with the ground truth to measure precision. |
author2 | Jenn-Jier Lien |
author_facet | Jenn-Jier Lien Hua-HsuanHsu 徐華煊 |
author | Hua-HsuanHsu 徐華煊 |
spellingShingle | Hua-HsuanHsu 徐華煊 Maximum-a-Posterior Stereo System Based on Sparse Gradient Point |
author_sort | Hua-HsuanHsu |
title | Maximum-a-Posterior Stereo System Based on Sparse Gradient Point |
title_short | Maximum-a-Posterior Stereo System Based on Sparse Gradient Point |
title_full | Maximum-a-Posterior Stereo System Based on Sparse Gradient Point |
title_fullStr | Maximum-a-Posterior Stereo System Based on Sparse Gradient Point |
title_full_unstemmed | Maximum-a-Posterior Stereo System Based on Sparse Gradient Point |
title_sort | maximum-a-posterior stereo system based on sparse gradient point |
publishDate | 2013 |
url | http://ndltd.ncl.edu.tw/handle/37968400904719008104 |
work_keys_str_mv | AT huahsuanhsu maximumaposteriorstereosystembasedonsparsegradientpoint AT xúhuáxuān maximumaposteriorstereosystembasedonsparsegradientpoint AT huahsuanhsu jiànlìzàixīshūtīdùdiǎndezuìdàhuàshìhòujīlǜzhīlìtǐpǐpèixìtǒng AT xúhuáxuān jiànlìzàixīshūtīdùdiǎndezuìdàhuàshìhòujīlǜzhīlìtǐpǐpèixìtǒng |
_version_ | 1718081368831819776 |
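The pipeline outlined in the abstract (select sparse gradient points, gather candidate matches in the other image, score the probability of each candidate being the true correspondence, then smooth with neighboring depths) can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the thesis's actual implementation: the function name, the SSD-based likelihood, the exponential prior over disparities, and all parameter values are assumptions made for this sketch.

```python
import numpy as np

def sparse_gradient_map_stereo(left, right, max_disp=64,
                               grad_thresh=20.0, win=3, sigma=10.0):
    """Illustrative MAP stereo matcher on sparse gradient points.

    Parameter names and values are assumptions for this sketch,
    not the settings used in the thesis.
    """
    left = left.astype(np.float64)
    right = right.astype(np.float64)
    h, w = left.shape
    r = win // 2

    # Step 1: keep only "sparse gradient points" -- pixels whose
    # horizontal gradient magnitude exceeds a threshold.
    gx = np.abs(np.gradient(left, axis=1))

    disp = np.zeros((h, w))
    for y in range(r, h - r):
        for x in range(max_disp + r, w - r):
            if gx[y, x] <= grad_thresh:
                continue
            patch = left[y - r:y + r + 1, x - r:x + r + 1]
            # Step 2: likelihood of each candidate disparity d from the
            # SSD between the left patch and the shifted right patch.
            ssd = np.array([
                np.sum((patch -
                        right[y - r:y + r + 1, x - d - r:x - d + r + 1]) ** 2)
                for d in range(max_disp)
            ])
            likelihood = np.exp(-ssd / (2.0 * sigma ** 2 * patch.size))
            # Step 3: a simple prior favoring small disparities; the MAP
            # estimate maximizes likelihood * prior.
            prior = np.exp(-np.arange(max_disp) / max_disp)
            disp[y, x] = np.argmax(likelihood * prior)
    # Step 4 (smoothing with neighboring depths, as in the abstract) could
    # be a median filter over the sparse estimates; omitted for brevity.
    return disp
```

On a rectified pair where the right image is the left shifted by a constant disparity, the gradient points recover that shift; on real Middlebury pairs, the prior and the smoothing step would matter far more than in this toy setting.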