Real Time Performer Positioning with Step and Direction Tracking using Wearable IMU Devices
Master's Thesis === National Chengchi University === Department of Computer Science === Academic Year 107 === Recently, improvisational performance using wearable devices combined with virtual reality (VR) or interactive technology has become a new type of digital art performance. Our previous research developed a platform that can “capture” body gestures...
Main Author: Tseng, Yao-Chang (曾珧彰)
Other Authors: Tsai, Tzu-Chieh (蔡子傑)
Format: Others
Language: en_US
Published: 2019
Online Access: http://ndltd.ncl.edu.tw/handle/8grjwe
id: ndltd-TW-107NCCU5394029
record_format: oai_dc
spelling: ndltd-TW-107NCCU5394029 2019-09-17T03:40:09Z http://ndltd.ncl.edu.tw/handle/8grjwe Real Time Performer Positioning with Step and Direction Tracking using Wearable IMU Devices 穿戴六軸感測裝置之展演者的即時步伐方向追蹤定位 Tseng, Yao-Chang 曾珧彰 Master's Thesis, National Chengchi University, Department of Computer Science, Academic Year 107. Tsai, Tzu-Chieh 蔡子傑 2019 Thesis (學位論文) 54 en_US
collection: NDLTD
language: en_US
format: Others
sources: NDLTD
description: Master's Thesis === National Chengchi University === Department of Computer Science === Academic Year 107 === Recently, improvisational performance using wearable devices combined with virtual reality (VR) or interactive technology has become a new type of digital art performance. Our previous research developed a platform that can “capture” body gestures using wearable devices and render the appearance of virtual objects for art performance. However, it still needs real-time position tracking of the performer to make the performance smooth and natural.

Previous work on positioning techniques has mostly focused on error distance; those methods cannot be directly adopted in practical performing arts because their real-time position tracking is unsatisfactory. The goal of this research is to achieve acceptable tracking performance using only wearable IMU sensors. Drawing inspiration from many existing methods and extensive experiments, we propose a real-time positioning algorithm with “foot-step” and “direction-judge” tracking to solve this problem. The experimental results are satisfactory and show very good feasibility. We hope the platform can enrich performing patterns in the digital arts and strengthen the cultural innovation and integration capability of Taiwan's software and hardware industries.
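The abstract describes a step-and-heading approach: detect each foot-step from the wearable IMU's accelerometer and judge the walking direction from its gyroscope, then advance the estimated position one stride at a time. Below is a minimal dead-reckoning sketch of that general idea, not the thesis's actual algorithm; the acceleration threshold, fixed stride length, and simple yaw integration are all illustrative assumptions.

```python
import math

# Illustrative sketch of step-and-heading dead reckoning with a 6-axis IMU.
# The thesis's actual "foot-step" and "direction-judge" algorithms are not
# published here; the constants and the update rule below are assumptions.

STEP_LENGTH_M = 0.7    # assumed average stride length (metres)
ACC_THRESHOLD = 11.0   # m/s^2; magnitude rising above ~gravity signals a step

class DeadReckoner:
    def __init__(self, x=0.0, y=0.0, heading_rad=0.0):
        self.x, self.y = x, y
        self.heading = heading_rad   # yaw, integrated from the z-axis gyro
        self._above = False          # state for rising-edge step detection

    def update(self, ax, ay, az, gz, dt):
        """Feed one IMU sample: accelerometer axes (m/s^2) and z-axis
        gyro rate (rad/s) over a dt-second interval. Returns True when
        a new step is detected and the position has been advanced."""
        # Direction-judge: integrate the turn rate to track heading.
        self.heading += gz * dt
        # Foot-step: rising edge of acceleration magnitude through threshold.
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        stepped = mag > ACC_THRESHOLD and not self._above
        self._above = mag > ACC_THRESHOLD
        if stepped:
            # Advance one stride along the current heading.
            self.x += STEP_LENGTH_M * math.cos(self.heading)
            self.y += STEP_LENGTH_M * math.sin(self.heading)
        return stepped
```

Because the position only updates once per detected step rather than per sample, the per-sample cost stays tiny, which is what makes this family of methods attractive for real-time tracking on stage.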
author2: Tsai, Tzu-Chieh
author_facet: Tsai, Tzu-Chieh; Tseng, Yao-Chang 曾珧彰
author: Tseng, Yao-Chang 曾珧彰
title: Real Time Performer Positioning with Step and Direction Tracking using Wearable IMU Devices
publishDate: 2019
url: http://ndltd.ncl.edu.tw/handle/8grjwe
_version_: 1719251010390589440