Human Robot Interactions Using Sensor Fusion Approach for Intelligent Service Robotics System

Bibliographic Details
Main Authors: Nai-Wen Chang, 張乃文
Other Authors: Ren-C Luo
Format: Others
Language: en_US
Published: 2009
Online Access: http://ndltd.ncl.edu.tw/handle/44087075635576032524
id ndltd-TW-097CCU05442113
record_format oai_dc
spelling ndltd-TW-097CCU05442113 2016-05-04T04:26:07Z http://ndltd.ncl.edu.tw/handle/44087075635576032524 Human Robot Interactions Using Sensor Fusion Approach for Intelligent Service Robotics System 多感測器融合人機互動應用於智慧服務型機器人系統之研究 Nai-Wen Chang 張乃文 Master's === National Chung Cheng University === Graduate Institute of Electrical Engineering === 97 === In recent years, mobile robots have gradually been integrated into our daily lives, and Human-Robot Interaction (HRI) has become an important research topic. Traditional HRI systems, such as voice-command and other human-computer interfaces, interact with the user passively. In this research, we aim to make a mobile robot interact with users in a more active manner. For active interaction, the automatic detection and following of users are essential, fundamental functions, and a robot with these capabilities can offer better service. The ability to track and follow a target person is therefore desirable for an intelligent service mobile robot. We propose a robust method for tracking and following a target person with a small mobile robot by integrating a single vision sensor and a laser range finder (LRF). Each sensor has its own drawbacks; to compensate for them, we present a complementary data fusion approach based on Covariance Intersection, which combines the uncertainty of each sensor's measurement and improves the reliability of the estimated position of the person. A Virtual Spring Model serves as the control rule that allows the mobile robot to track the target person smoothly. Experimental results validate the robustness of the method: the vision tracking system accurately detects the person's upper body, the LRF tracking system detects the person's legs, and the target person's position is estimated from these two systems. The results also show that the complementary data fusion approach reduces the uncertainty of the estimated position, and that controlling the robot with the Virtual Spring Model enables it to follow the target person smoothly. Ren-C Luo Huei-Yung Lin 羅仁權 林惠勇 2009 Degree thesis ; thesis 106 en_US
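The record names Covariance Intersection as the complementary fusion rule but does not spell out the update. The sketch below is a minimal illustration of the standard CI equations applied to two 2-D position estimates of the person (one from the vision tracker, one from the LRF leg detector). The function name, the grid search over the weight, the trace-minimization criterion, and all covariance values are assumptions made for illustration, not details taken from the thesis.

```python
# Minimal sketch of Covariance Intersection (CI) for fusing two 2-D position
# estimates of the tracked person: one from the vision tracker, one from the
# laser range finder. All numeric values below are illustrative assumptions.
import numpy as np

def covariance_intersection(x_a, P_a, x_b, P_b, n_steps=100):
    """Fuse two (mean, covariance) estimates with CI.

    The weight w in [0, 1] is chosen by a simple grid search that minimizes
    the trace of the fused covariance. CI yields a consistent fused estimate
    even when the cross-correlation between the two sensors is unknown.
    """
    Pa_inv, Pb_inv = np.linalg.inv(P_a), np.linalg.inv(P_b)
    best = None
    for w in np.linspace(0.0, 1.0, n_steps + 1):
        P_inv = w * Pa_inv + (1.0 - w) * Pb_inv
        P_f = np.linalg.inv(P_inv)
        x_f = P_f @ (w * Pa_inv @ x_a + (1.0 - w) * Pb_inv @ x_b)
        cost = np.trace(P_f)
        if best is None or cost < best[0]:
            best = (cost, x_f, P_f)
    return best[1], best[2]

# Example: the camera gives a good lateral estimate but poor depth,
# while the LRF leg detector is the opposite.
x_vision = np.array([2.1, 0.9])        # person position (m), camera estimate
P_vision = np.diag([0.30, 0.02])       # large depth variance, small lateral variance
x_lrf = np.array([1.9, 1.1])           # person position (m), leg-detection estimate
P_lrf = np.diag([0.02, 0.20])
x_fused, P_fused = covariance_intersection(x_vision, P_vision, x_lrf, P_lrf)
print(x_fused, np.trace(P_fused))      # fused covariance trace is smaller than either input's
```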
collection NDLTD
language en_US
format Others
sources NDLTD
description Master's === National Chung Cheng University === Graduate Institute of Electrical Engineering === 97 === In recent years, mobile robots have gradually been integrated into our daily lives, and Human-Robot Interaction (HRI) has become an important research topic. Traditional HRI systems, such as voice-command and other human-computer interfaces, interact with the user passively. In this research, we aim to make a mobile robot interact with users in a more active manner. For active interaction, the automatic detection and following of users are essential, fundamental functions, and a robot with these capabilities can offer better service. The ability to track and follow a target person is therefore desirable for an intelligent service mobile robot. We propose a robust method for tracking and following a target person with a small mobile robot by integrating a single vision sensor and a laser range finder (LRF). Each sensor has its own drawbacks; to compensate for them, we present a complementary data fusion approach based on Covariance Intersection, which combines the uncertainty of each sensor's measurement and improves the reliability of the estimated position of the person. A Virtual Spring Model serves as the control rule that allows the mobile robot to track the target person smoothly. Experimental results validate the robustness of the method: the vision tracking system accurately detects the person's upper body, the LRF tracking system detects the person's legs, and the target person's position is estimated from these two systems. The results also show that the complementary data fusion approach reduces the uncertainty of the estimated position, and that controlling the robot with the Virtual Spring Model enables it to follow the target person smoothly.
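The description states that a Virtual Spring Model is the control rule that lets the robot follow the person smoothly, without giving its form. Below is a minimal sketch of one common virtual-spring formulation for a differential-drive robot: a spring of rest length d0 pulls the robot toward the fused person estimate, and a damping term smooths the speed command. All gains, limits, and the damping choice are illustrative assumptions rather than the thesis's actual parameters.

```python
# Minimal sketch of a virtual-spring following rule: the robot is pulled toward
# the tracked person by a virtual spring of rest length d0, and a damping term
# keeps the motion smooth. Gains and limits are illustrative assumptions.
import math

def virtual_spring_control(distance, bearing, v_prev, dt,
                           d0=1.0, k_spring=0.8, k_damp=0.6,
                           k_turn=1.5, v_max=0.6, w_max=1.2):
    """Return (linear velocity v, angular velocity w) for a differential-drive robot.

    distance: range to the person (m), e.g. from the fused CI estimate
    bearing:  angle to the person (rad), 0 = straight ahead
    v_prev:   previous linear velocity command (m/s), used for damping
    """
    # Spring force along the line to the person, damped by the current speed.
    force = k_spring * (distance - d0) - k_damp * v_prev
    v = max(-v_max, min(v_max, v_prev + force * dt))
    # Turn so the virtual spring stays aligned with the robot's heading.
    w = max(-w_max, min(w_max, k_turn * bearing))
    return v, w

# Example: the person is 2.3 m away, 20 degrees to the left; robot currently at 0.2 m/s.
v, w = virtual_spring_control(2.3, math.radians(20), v_prev=0.2, dt=0.1)
print(round(v, 3), round(w, 3))
```

Because the command accelerates in proportion to the spring stretch and is damped by the current speed, the robot slows down as it approaches the rest distance instead of stopping abruptly, which is the "smooth following" behavior the abstract attributes to the Virtual Spring Model.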
author2 Ren-C Luo
author_facet Ren-C Luo
Nai-Wen Chang
張乃文
author Nai-Wen Chang
張乃文
spellingShingle Nai-Wen Chang
張乃文
Human Robot Interactions Using Sensor Fusion Approach for Intelligent Service Robotics System
author_sort Nai-Wen Chang
title Human Robot Interactions Using Sensor Fusion Approach for Intelligent Service Robotics System
title_short Human Robot Interactions Using Sensor Fusion Approach for Intelligent Service Robotics System
title_full Human Robot Interactions Using Sensor Fusion Approach for Intelligent Service Robotics System
title_fullStr Human Robot Interactions Using Sensor Fusion Approach for Intelligent Service Robotics System
title_full_unstemmed Human Robot Interactions Using Sensor Fusion Approach for Intelligent Service Robotics System
title_sort human robot interactions using sensor fusion approach for intelligent service robotics system
publishDate 2009
url http://ndltd.ncl.edu.tw/handle/44087075635576032524
work_keys_str_mv AT naiwenchang humanrobotinteractionsusingsensorfusionapproachforintelligentserviceroboticssystem
AT zhāngnǎiwén humanrobotinteractionsusingsensorfusionapproachforintelligentserviceroboticssystem
AT naiwenchang duōgǎncèqìrónghérénjīhùdòngyīngyòngyúzhìhuìfúwùxíngjīqìrénxìtǒngzhīyánjiū
AT zhāngnǎiwén duōgǎncèqìrónghérénjīhùdòngyīngyòngyúzhìhuìfúwùxíngjīqìrénxìtǒngzhīyánjiū
_version_ 1718258132729200640