Comparison of Modern Highly Interactive Flicker-Free Steady State Motion Visual Evoked Potentials for Practical Brain–Computer Interfaces

Bibliographic Details
Main Authors: Piotr Stawicki, Ivan Volosyak
Format: Article
Language: English
Published: MDPI AG, 2020-09-01
Series: Brain Sciences
Online Access: https://www.mdpi.com/2076-3425/10/10/686
Description
Summary: Motion-based visual evoked potentials (mVEP) are an emerging trend in the field of steady-state visual evoked potential (SSVEP)-based brain–computer interfaces (BCI). In this paper, we introduce different movement-based stimulus patterns (steady-state motion visual evoked potentials, SSMVEP) that do not employ the typical flickering. The tested movement patterns for the visual stimuli included a pendulum-like movement, a flipping illusion, a checkerboard pulsation, checkerboard inverse arc pulsations, and reverse arc rotations, all with a spelling task consisting of 18 trials. In an online experiment with nine participants, the movement-based BCI systems were evaluated with a four-target BCI speller in which each letter could be selected in three steps (three trials). For classification, the minimum energy combination and a filter bank approach were used. The stimulation frequencies 7.06 Hz, 7.50 Hz, 8.00 Hz, and 8.57 Hz were used, yielding average accuracies between 97.22% and 100% and average information transfer rates (ITR) between 15.42 bits/min and 33.92 bits/min. All participants successfully used the SSMVEP-based speller with all stimulation patterns. The most successful stimulus was SSMVEP1 (the pendulum-like movement), reaching an average of 100% accuracy and an ITR of 33.92 bits/min.
ISSN: 2076-3425
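
The ITR figures quoted in the summary can be reproduced with the standard Wolpaw information transfer rate formula. The sketch below is not taken from the paper; the selection time of about 3.54 s is a hypothetical value chosen so that 4 targets at 100% accuracy give roughly the reported best ITR of 33.92 bits/min.

```python
import math

def itr_bits_per_min(n_targets: int, accuracy: float, seconds_per_selection: float) -> float:
    """Wolpaw ITR: bits carried by one selection, scaled to selections per minute."""
    n, p = n_targets, accuracy
    bits = math.log2(n)
    if 0.0 < p < 1.0:
        bits += p * math.log2(p) + (1.0 - p) * math.log2((1.0 - p) / (n - 1))
    elif p == 0.0:
        bits = 0.0  # treat chance-or-worse performance as carrying no information
    return bits * 60.0 / seconds_per_selection

# 4 targets at 100% accuracy carry log2(4) = 2 bits per selection; a hypothetical
# selection time of ~3.54 s then gives roughly the reported best ITR of ~33.92 bits/min.
print(round(itr_bits_per_min(4, 1.00, 3.54), 2))   # ~33.9 bits/min
# Lower accuracy and a slower (hypothetical) selection time reduce the ITR.
print(round(itr_bits_per_min(4, 0.9722, 7.0), 2))
```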