Using the Face Tracking Technology to Detect Head Action


Bibliographic Details
Main Authors: Chen-Yu Ko, 柯政宇
Other Authors: Jui-Fa Chen
Format: Others
Language: zh-TW
Published: 2013
Online Access: http://ndltd.ncl.edu.tw/handle/42696591685051689603
id ndltd-TW-101TKU05392013
record_format oai_dc
spelling ndltd-TW-101TKU053920132016-02-21T04:20:14Z http://ndltd.ncl.edu.tw/handle/42696591685051689603 Using the Face Tracking Technology to Detect Head Action 運用臉部追蹤系統於頭部動作偵測 Chen-Yu Ko 柯政宇 Master's === Tamkang University === Master's Program, Department of Computer Science and Information Engineering === 101 === Most current computer input devices are controlled by hand, a mode of operation that is quite inconvenient for people with disabilities, especially physical disabilities. Two kinds of computer accessibility technology are common for such users: eye tracking and head tracking. Eye tracking can express only a limited set of actions, while head tracking can be implemented in three ways: ultrasound, infrared, and face tracking. The ultrasound and infrared implementations both require the user to wear equipment, whereas face tracking does not, which makes it more comfortable for people with disabilities. In this thesis, we use the Active Shape Model (ASM) to track the user's facial feature points and to calculate the translation and rotation data of the head. We then apply an Exponentially Weighted Moving Average (EWMA) filter to smooth the jittery raw data caused by the user's unconscious head shaking, and use the Levenshtein distance algorithm with the translation data to detect which predefined action the user performed. Finally, we map the detected head action to its corresponding operation, giving people with disabilities a simple way to control a computer. Jui-Fa Chen 陳瑞發 2013 Thesis ; 60 zh-TW
collection NDLTD
language zh-TW
format Others
sources NDLTD
description Master's === Tamkang University === Master's Program, Department of Computer Science and Information Engineering === 101 === Most current computer input devices are controlled by hand, a mode of operation that is quite inconvenient for people with disabilities, especially physical disabilities. Two kinds of computer accessibility technology are common for such users: eye tracking and head tracking. Eye tracking can express only a limited set of actions, while head tracking can be implemented in three ways: ultrasound, infrared, and face tracking. The ultrasound and infrared implementations both require the user to wear equipment, whereas face tracking does not, which makes it more comfortable for people with disabilities. In this thesis, we use the Active Shape Model (ASM) to track the user's facial feature points and to calculate the translation and rotation data of the head. We then apply an Exponentially Weighted Moving Average (EWMA) filter to smooth the jittery raw data caused by the user's unconscious head shaking, and use the Levenshtein distance algorithm with the translation data to detect which predefined action the user performed. Finally, we map the detected head action to its corresponding operation, giving people with disabilities a simple way to control a computer.
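The processing pipeline the abstract describes, smoothing noisy head-motion data with an EWMA filter and then matching a movement sequence against predefined actions by Levenshtein distance, can be sketched as follows. This is a minimal illustration under stated assumptions, not the thesis's implementation: the smoothing factor `alpha`, the U/D/L/R direction-symbol encoding, and the action templates in `ACTIONS` are all hypothetical.

```python
def ewma(samples, alpha=0.3):
    """Exponentially weighted moving average: s_t = alpha*x_t + (1-alpha)*s_{t-1}.
    Damps the jitter from unconscious head shaking (alpha is an assumed value)."""
    smoothed, s = [], None
    for x in samples:
        s = x if s is None else alpha * x + (1 - alpha) * s
        smoothed.append(s)
    return smoothed

def levenshtein(a, b):
    """Classic dynamic-programming edit distance between two sequences."""
    m, n = len(a), len(b)
    dp = list(range(n + 1))           # distances for the empty prefix of a
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i        # prev holds dp[i-1][j-1]
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,                          # deletion
                        dp[j - 1] + 1,                      # insertion
                        prev + (a[i - 1] != b[j - 1]))      # substitution
            prev = cur
    return dp[n]

def match_action(observed, templates):
    """Return the predefined action whose symbol sequence has the smallest
    edit distance to the observed movement sequence."""
    return min(templates, key=lambda name: levenshtein(observed, templates[name]))

# Hypothetical usage: head translations quantized into U/D/L/R symbols.
ACTIONS = {"nod": "DUDU", "shake": "LRLR"}
print(ewma([0.0, 0.0, 10.0, 0.0], alpha=0.5))  # → [0.0, 0.0, 5.0, 2.5]
print(match_action("DUDDU", ACTIONS))          # → nod (noisy nod still matches)
```

Edit distance tolerates the extra or missing symbols that smoothing cannot remove entirely, which is presumably why a sequence-matching algorithm suits this detection step better than exact comparison.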
author2 Jui-Fa Chen
author_facet Jui-Fa Chen
Chen-Yu Ko
柯政宇
author Chen-Yu Ko
柯政宇
spellingShingle Chen-Yu Ko
柯政宇
Using the Face Tracking Technology to Detect Head Action
author_sort Chen-Yu Ko
title Using the Face Tracking Technology to Detect Head Action
title_short Using the Face Tracking Technology to Detect Head Action
title_full Using the Face Tracking Technology to Detect Head Action
title_fullStr Using the Face Tracking Technology to Detect Head Action
title_full_unstemmed Using the Face Tracking Technology to Detect Head Action
title_sort using the face tracking technology to detect head action
publishDate 2013
url http://ndltd.ncl.edu.tw/handle/42696591685051689603
work_keys_str_mv AT chenyuko usingthefacetrackingtechnologytodetectheadaction
AT kēzhèngyǔ usingthefacetrackingtechnologytodetectheadaction
AT chenyuko yùnyòngliǎnbùzhuīzōngxìtǒngyútóubùdòngzuòzhēncè
AT kēzhèngyǔ yùnyòngliǎnbùzhuīzōngxìtǒngyútóubùdòngzuòzhēncè
_version_ 1718192959458902016