An EEG/EMG/EOG-Based Multimodal Human-Machine Interface to Real-Time Control of a Soft Robot Hand

Brain-computer interface (BCI) technology shows potential for application to motor rehabilitation therapies that use neural plasticity to restore motor function and improve the quality of life of stroke survivors. However, it is often difficult for BCI systems to provide the variety of control commands necessary for natural, multi-task, real-time control of a soft robot. In this study, a novel multimodal human-machine interface (mHMI) system is developed using combinations of electrooculography (EOG), electroencephalography (EEG), and electromyography (EMG) to generate a large set of control instructions. We also explore subject acceptance of an affordable wearable soft robot that assists basic hand actions during robot-assisted movement.

Six healthy subjects separately performed left- and right-hand motor imagery, left and right eye movements, and different hand gestures in different modes to control a soft robot through a variety of actions. The results indicate that the number of mHMI control instructions is significantly greater than is achievable with any individual mode. Furthermore, the mHMI achieves an average classification accuracy of 93.83% with an average information transfer rate of 47.41 bits/min, equivalent to a control speed of 17 actions per minute. This study is expected to yield a more user-friendly mHMI for real-time control of a soft robot, helping healthy or disabled persons perform basic hand movements in a friendly and convenient way.
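The reported accuracy, information transfer rate, and selection speed can be related through the standard Wolpaw ITR formula. As a sanity check, the sketch below computes bits/min from the abstract's figures; note that the number of distinct commands N is not stated in this record, so N = 10 is purely an assumption (it happens to reproduce roughly the reported 47.41 bits/min at 17 selections per minute).

```python
import math

# Wolpaw information transfer rate, in bits per selection:
#   B = log2(N) + P*log2(P) + (1 - P)*log2((1 - P)/(N - 1))
# P = 0.9383 and 17 selections/min come from the abstract above;
# N = 10 commands is an assumption, not stated in this record.

def wolpaw_itr(n_classes: int, accuracy: float, selections_per_min: float) -> float:
    """Return the information transfer rate in bits/min for an N-class task."""
    p = accuracy
    bits = math.log2(n_classes)
    if 0.0 < p < 1.0:
        bits += p * math.log2(p)
        bits += (1.0 - p) * math.log2((1.0 - p) / (n_classes - 1))
    return bits * selections_per_min

print(round(wolpaw_itr(10, 0.9383, 17), 2))  # ~47.47 bits/min, close to the reported 47.41
```

Under these assumptions the computed rate (~47.5 bits/min) agrees with the abstract's 47.41 bits/min to within rounding, which is consistent with 17 selections per minute over roughly ten command classes.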

Bibliographic Details
Main Authors: Jinhua Zhang, Baozeng Wang, Cheng Zhang, Yanqing Xiao, Michael Yu Wang
Format: Article
Language: English
Published: Frontiers Media S.A., 2019-03-01
Series: Frontiers in Neurorobotics
ISSN: 1662-5218
Subjects: electroencephalogram (EEG); electromyogram (EMG); electrooculogram (EOG); multimodal human-machine interface (mHMI); soft robot hand
Online Access: https://www.frontiersin.org/article/10.3389/fnbot.2019.00007/full
Author Affiliations:
Jinhua Zhang, Baozeng Wang, Cheng Zhang, Michael Yu Wang: Key Laboratory of Education Ministry for Modern Design and Rotor-Bearing System, School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an, China
Yanqing Xiao: School of Biological Science and Medical Engineering, Beihang University, Beijing, China
Michael Yu Wang: Departments of Mechanical and Aerospace Engineering and Electronic and Computer Engineering, HKUST Robotics Institute, Hong Kong University of Science and Technology, Kowloon, Hong Kong