An Implementation of Gesture Recognition Using Android Smart phone

Master's thesis === Southern Taiwan University of Science and Technology (南台科技大學) === Department of Electronic Engineering === Academic year 101 === With recent advances in technology, electronic products that can be controlled by gestures, such as the Xbox Kinect game console, have been developed and have strongly influenced user-interface design in their product class. Therefore, using gestures to...


Bibliographic Details
Main Authors: HSU,YU-HAO, 許育豪
Other Authors: HSUEH,YUN-TAI
Format: Others
Language: zh-TW
Published: 102
Online Access:http://ndltd.ncl.edu.tw/handle/32253367400938265752
id ndltd-TW-101STUT8428016
record_format oai_dc
spelling ndltd-TW-101STUT84280162015-10-13T23:10:33Z http://ndltd.ncl.edu.tw/handle/32253367400938265752 An Implementation of Gesture Recognition Using Android Smart phone 使用Android手機實現手勢辨識 HSU,YU-HAO 許育豪 碩士 南台科技大學 電子工程系 101 HSUEH,YUN-TAI 薛雲太 102 學位論文 ; thesis 40 zh-TW
collection NDLTD
language zh-TW
format Others
sources NDLTD
description Master's thesis === Southern Taiwan University of Science and Technology (南台科技大學) === Department of Electronic Engineering === Academic year 101 ===

With recent advances in technology, electronic products that can be controlled by gestures, such as the Xbox Kinect game console, have been developed and have strongly influenced user-interface design in their product class. Gesture-based human-machine interaction has therefore become one of the most active research topics. So far, people are accustomed to controlling electronic products with buttons, keyboards, or touch panels; in the near future, products will use computer vision to recognize human gestures and interact with people directly.

In this thesis, we implement a gesture recognition system on an Android smartphone. The Android application is developed in the Eclipse integrated development environment, and the OpenCV library is used for image processing. The system is implemented in two ways.

In the first method, the image to be processed is captured by the smartphone's built-in camera. Edge detection is applied to the image, and the resulting edge image is compared with a standard gesture image library. In practice, five images are captured per second, and each image is processed immediately after it is taken. The processing steps are as follows: first, the image is converted into an OpenCV image buffer; second, the OpenCV cvCanny API is used to detect edges; third, the edge image is normalized so that it can be compared with the standard gesture library; finally, cross-correlation is used to measure the similarity between the edge image and each image in the library. The higher the cross-correlation, the greater the similarity, so the gesture can be recognized and used for human-machine interaction.

In the second method, the largest contour of the hand is found first; from it, the palm's center point and the angle of each finger are computed. The fingertips are then located and labeled with the names of the corresponding fingers, and finally lines are drawn from the center point to each fingertip, with the fingertips marked on the image, completing the gesture recognition.

In this study, sign language images are used as the standard gesture library, so the results can be applied both to gesture control and to communication with deaf and mute users, allowing them to control and use electronic products with the sign language they already know.
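The first method's pipeline (Canny edge detection, normalization, then cross-correlation against a gesture library) could be sketched as follows. This is a minimal sketch, not the thesis code: the thesis calls the older cvCanny C-style API, while the sketch assumes the OpenCV 4.x Java bindings; the class name, the Canny thresholds, and the 64x64 normalization size are illustrative assumptions.

```java
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.Size;
import org.opencv.imgproc.Imgproc;

import java.util.List;

/** Sketch of the first method: Canny edges + normalized cross-correlation
 *  against a library of pre-computed edge images of standard gestures.
 *  Thresholds and the 64x64 normalization size are assumed values. */
public class EdgeMatcher {

    /** Convert an RGBA camera frame to a normalized Canny edge image. */
    public static Mat toEdgeImage(Mat rgbaFrame) {
        Mat gray = new Mat();
        Imgproc.cvtColor(rgbaFrame, gray, Imgproc.COLOR_RGBA2GRAY);

        Mat edges = new Mat();
        Imgproc.Canny(gray, edges, 80, 160);                  // assumed thresholds

        Mat normalized = new Mat();
        Imgproc.resize(edges, normalized, new Size(64, 64));  // assumed common size
        return normalized;
    }

    /** Return the index of the library image with the highest
     *  normalized cross-correlation score against the edge image. */
    public static int bestMatch(Mat edgeImage, List<Mat> gestureLibrary) {
        int bestIndex = -1;
        double bestScore = -1.0;
        for (int i = 0; i < gestureLibrary.size(); i++) {
            Mat result = new Mat();
            // Both images have the same size, so the result is a single score.
            Imgproc.matchTemplate(edgeImage, gestureLibrary.get(i), result,
                                  Imgproc.TM_CCORR_NORMED);
            double score = Core.minMaxLoc(result).maxVal;
            if (score > bestScore) {
                bestScore = score;
                bestIndex = i;
            }
        }
        return bestIndex;  // higher cross-correlation = more similar gesture
    }
}
```

As the abstract notes, the library image with the highest cross-correlation score is taken as the recognized gesture; running this on five frames per second matches the capture rate described above.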
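The second method (largest contour, palm center, fingertips, lines from the center to each fingertip) could be sketched like this. The abstract does not name the exact OpenCV calls, so using image moments for the palm center and convex-hull convexity defects for fingertip candidates, as well as the depth threshold, are assumptions of this sketch (again with the OpenCV 4.x Java bindings). Labeling each fingertip with a finger name, as the abstract describes, would need an extra classification step not shown here.

```java
import org.opencv.core.Mat;
import org.opencv.core.MatOfInt;
import org.opencv.core.MatOfInt4;
import org.opencv.core.MatOfPoint;
import org.opencv.core.Point;
import org.opencv.core.Scalar;
import org.opencv.imgproc.Imgproc;
import org.opencv.imgproc.Moments;

import java.util.ArrayList;
import java.util.List;

/** Sketch of the second method: largest hand contour -> palm center ->
 *  fingertip candidates, then draw center-to-fingertip lines on the frame. */
public class FingertipDetector {

    public static void annotate(Mat binaryHandMask, Mat displayFrame) {
        // 1. Find the largest contour, assumed to be the hand.
        List<MatOfPoint> contours = new ArrayList<>();
        Imgproc.findContours(binaryHandMask, contours, new Mat(),
                             Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE);
        MatOfPoint hand = null;
        double maxArea = 0;
        for (MatOfPoint c : contours) {
            double area = Imgproc.contourArea(c);
            if (area > maxArea) { maxArea = area; hand = c; }
        }
        if (hand == null) return;

        // 2. Palm center from the image moments of the contour.
        Moments m = Imgproc.moments(hand);
        Point center = new Point(m.m10 / m.m00, m.m01 / m.m00);

        // 3. Fingertip candidates from the convex hull and its convexity defects.
        MatOfInt hullIdx = new MatOfInt();
        Imgproc.convexHull(hand, hullIdx);
        if (hullIdx.rows() < 3) return;            // need a proper hull for defects
        MatOfInt4 defects = new MatOfInt4();
        Imgproc.convexityDefects(hand, hullIdx, defects);

        Point[] pts = hand.toArray();
        int[] d = defects.toArray();               // start, end, far, depth per defect
        for (int i = 0; i < d.length; i += 4) {
            double depth = d[i + 3] / 256.0;       // fixed-point depth
            if (depth < 20) continue;              // assumed threshold: skip shallow defects
            Point fingertip = pts[d[i]];           // heuristic: defect start ~ fingertip
            // 4. Draw a line from the palm center to the fingertip and mark the tip.
            Imgproc.line(displayFrame, center, fingertip, new Scalar(0, 255, 0), 2);
            Imgproc.circle(displayFrame, fingertip, 8, new Scalar(255, 0, 0), 2);
        }
    }
}
```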
author2 HSUEH,YUN-TAI
author_facet HSUEH,YUN-TAI
HSU,YU-HAO
許育豪
author HSU,YU-HAO
許育豪
spellingShingle HSU,YU-HAO
許育豪
An Implementation of Gesture Recognition Using Android Smart phone
author_sort HSU,YU-HAO
title An Implementation of Gesture Recognition Using Android Smart phone
title_short An Implementation of Gesture Recognition Using Android Smart phone
title_full An Implementation of Gesture Recognition Using Android Smart phone
title_fullStr An Implementation of Gesture Recognition Using Android Smart phone
title_full_unstemmed An Implementation of Gesture Recognition Using Android Smart phone
title_sort implementation of gesture recognition using android smart phone
publishDate 102
url http://ndltd.ncl.edu.tw/handle/32253367400938265752
work_keys_str_mv AT hsuyuhao animplementationofgesturerecognitionusingandroidsmartphone
AT xǔyùháo animplementationofgesturerecognitionusingandroidsmartphone
AT hsuyuhao shǐyòngandroidshǒujīshíxiànshǒushìbiànshí
AT xǔyùháo shǐyòngandroidshǒujīshíxiànshǒushìbiànshí
AT hsuyuhao implementationofgesturerecognitionusingandroidsmartphone
AT xǔyùháo implementationofgesturerecognitionusingandroidsmartphone
_version_ 1718084714426793984