SeNic: An Open Source Dataset for sEMG-Based Gesture Recognition in Non-Ideal Conditions

To narrow the gap between the laboratory environment and actual daily-life use of human-machine interaction based on surface electromyogram (sEMG) intent recognition, this paper presents a benchmark dataset of sEMG in non-ideal conditions (SeNic).

Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Neural Systems and Rehabilitation Engineering
Main Authors: Bo Zhu, Daohui Zhang, Yaqi Chu, Yalun Gu, Xingang Zhao
Format: Article
Language: English
Published: IEEE 2022-01-01
Subjects:
Online Access: https://ieeexplore.ieee.org/document/9771219/
Other Bibliographic Details
Abstract: To narrow the gap between the laboratory environment and actual daily-life use of human-machine interaction based on surface electromyogram (sEMG) intent recognition, this paper presents a benchmark dataset of sEMG in non-ideal conditions (SeNic). The dataset mainly consists of 8-channel sEMG signals and electrode shifts measured with a 3D-printed annular ruler. A total of 36 subjects participated in data acquisition experiments covering 7 gestures in non-ideal conditions, where five non-ideal factors were deliberately involved: 1) electrode shifts, 2) individual differences, 3) muscle fatigue, 4) inter-day differences, and 5) arm postures. The sEMG signals are first validated in the temporal and frequency domains. Results of recognizing gestures in ideal conditions indicate the high quality of the dataset. The adverse impacts of the non-ideal conditions are further revealed in the signal amplitudes and recognition accuracies. In conclusion, SeNic is a benchmark dataset that introduces several non-ideal factors which often degrade the robustness of sEMG-based systems, and it can serve as a freely available dataset and common platform for researchers in the sEMG-based recognition community. SeNic is available online at https://github.com/bozhubo/SeNic and https://gitee.com/bozhubo/SeNic.
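The abstract describes 8-channel sEMG recordings of 7 gestures. The exact file layout of the SeNic release is not given here, so the sketch below uses simulated data; the sampling rate, window lengths, and the `segment` helper are all assumptions, not part of the dataset's documented API. It illustrates the generic sliding-window segmentation that typically precedes sEMG gesture classification.

```python
import numpy as np

FS = 1000          # assumed sampling rate in Hz; check the SeNic documentation
N_CHANNELS = 8     # the dataset provides 8-channel sEMG signals

def segment(emg: np.ndarray, win_ms: int = 200, step_ms: int = 100,
            fs: int = FS) -> np.ndarray:
    """Slice a (samples, channels) sEMG array into overlapping windows."""
    win = int(win_ms * fs / 1000)
    step = int(step_ms * fs / 1000)
    starts = range(0, emg.shape[0] - win + 1, step)
    # Result shape: (n_windows, window_samples, channels)
    return np.stack([emg[s:s + win] for s in starts])

# Simulated stand-in for one 2-second SeNic trial (real data would be loaded
# from the repository's files instead).
rng = np.random.default_rng(0)
emg = rng.standard_normal((2 * FS, N_CHANNELS))
windows = segment(emg)
print(windows.shape)  # (19, 200, 8)
```

Each 200 ms window (with 100 ms overlap) would then be reduced to features such as mean absolute value or waveform length before being fed to a classifier.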
ISSN:1558-0210