The Evaluation of Classifier Performance during Fitting Wrist and Finger Movement Task Based on Forearm HD-sEMG


Bibliographic Details
Main Authors: Chen, W. (Author), Dai, C. (Author), Duan, H. (Author)
Format: Article
Language: English
Published: Hindawi Limited 2022
Description
Summary: The transmission of human body movement signals to other devices through wearable smart bracelets has attracted increasing attention in the field of human-machine interfaces. However, owing to the limited data collection range of wearable bracelets, it is necessary to study the relationship between the superposition of wrist and finger movements and their cooperative motions in order to simplify the data collection systems of such devices. Multichannel high-density surface electromyogram (HD-sEMG) signals exhibit high spatial resolution and can help improve the accuracy of multichannel fitting. In this study, we quantified the 256-channel HD-sEMG forearm spatial activation features of hand movements and performed a linear fitting of the data obtained for finger and wrist movements to verify the linear superposition relationship between the cooperative and independent movements of the wrist and fingers. The study classifies the fitted and measured cooperative actions of the fingers and wrist using four commonly adopted classifiers and evaluates the performance of these classifiers in gesture fitting. The results indicated that linear discriminant analysis achieved the highest classification performance, whereas the random forest method achieved the worst. This study can serve as a guide for gesture signal simplification in the future. © 2022 Haiqiang Duan et al.
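The pipeline described in the abstract (linear superposition fitting of independent-movement activation maps, followed by classifier comparison on fitted versus measured cooperative movements) can be sketched on synthetic data. This is a minimal illustrative reconstruction, not the authors' code: the channel count (256), the superposition weights, the noise level, and the choice of two of the four classifiers (LDA and random forest, the best and worst performers reported) are all assumptions for demonstration.

```python
import numpy as np
from numpy.linalg import lstsq
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_channels = 256   # HD-sEMG channel count from the study
n_trials = 40      # hypothetical number of trials per movement

# Synthetic per-channel activation maps (e.g. RMS amplitude) for the
# independent finger and wrist movements -- stand-ins for real HD-sEMG features.
finger = rng.random((n_trials, n_channels))
wrist = rng.random((n_trials, n_channels))

# Measured cooperative movement, modeled here as a noisy linear
# superposition of the independent movements (assumed weights 0.6 / 0.4).
coop = 0.6 * finger + 0.4 * wrist \
    + 0.05 * rng.standard_normal((n_trials, n_channels))

# Least-squares estimate of the superposition weights across all channels,
# then reconstruct the fitted cooperative activation maps.
A = np.stack([finger.ravel(), wrist.ravel()], axis=1)
coeffs, *_ = lstsq(A, coop.ravel(), rcond=None)
fitted = coeffs[0] * finger + coeffs[1] * wrist

# Compare classifiers on distinguishing fitted vs. measured maps,
# mirroring the paper's classifier-evaluation step.
X = np.vstack([fitted, coop])
y = np.array([0] * n_trials + [1] * n_trials)
lda_acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()
rf_acc = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5).mean()
print(f"fit weights: {coeffs.round(3)}  LDA acc: {lda_acc:.2f}  RF acc: {rf_acc:.2f}")
```

With a genuinely linear superposition, the recovered weights should land close to the generating ones, which is the sense in which the fit "verifies" the superposition relationship; the cross-validated accuracies then quantify how separable the fitted and measured gestures remain for each classifier.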
ISSN: 1024-123X
DOI:10.1155/2022/9594521