Truncation thresholds based empirical mode decomposition approach for classification performance of motor imagery BCI systems


DAĞDEVİR E., TOKMAKÇI M.

CHAOS SOLITONS & FRACTALS, vol.152, 2021 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 152
  • Publication Date: 2021
  • DOI: 10.1016/j.chaos.2021.111450
  • Journal Name: CHAOS SOLITONS & FRACTALS
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, INSPEC, zbMATH
  • Keywords: Motor Imagery, EEG, BCI, Signal processing, Classification performance
  • Erciyes University Affiliated: Yes

Abstract

The classification of electroencephalogram (EEG) signals, which is important for brain-computer interface (BCI) systems, is extremely difficult due to the inherent complexity of the signals and their susceptibility to artifacts. In this paper, a novel methodology combining a Truncation Thresholds (TT) method, Empirical Mode Decomposition (EMD), and statistical Common Spatial Pattern (CSP) feature extraction is proposed to classify left- and right-hand imaginary movements from EEG signals. The TT method modifies the local maximum and minimum points selected by EMD, both to distinguish more accurately the hidden motor imagery information contained in the frequency-domain sub-bands and to remove eye-blink electrooculography (EOG) artifacts. The TT method is applied to the raw EEG signals. Then, statistical spatial features are extracted with the CSP method from each Intrinsic Mode Function (IMF) obtained by decomposing the EEG signals with the EMD method. Finally, the extracted features are fed to three different classifiers: SVM, KNN, and LDA. The proposed methodology is applied to our own dataset and to the public BCI Competition IV-2b dataset. The results show that the proposed methodology achieves accuracies of 97% and 94%, using the LDA classifier on our dataset and the KNN classifier on the BCI Competition IV-2b dataset, respectively. (c) 2021 Elsevier Ltd. All rights reserved.
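As a rough illustration of the pipeline the abstract describes, the sketch below decomposes each EEG channel with EMD, extracts log-variance CSP features from each IMF, and cross-validates an LDA classifier. It assumes the PyEMD, MNE, and scikit-learn libraries. The Truncation Thresholds step is the paper's novel contribution and its details are not given in the abstract, so `truncation_thresholds` below is a hypothetical identity placeholder, not the authors' method.

```python
# Minimal sketch of an EMD + per-IMF CSP + LDA motor imagery pipeline,
# assuming PyEMD, MNE, and scikit-learn. The TT preprocessing step is
# represented only by a placeholder (see the paper for the actual method).
import numpy as np
from PyEMD import EMD
from mne.decoding import CSP
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def truncation_thresholds(signal):
    """Hypothetical placeholder for the TT method, which adjusts the
    local extrema used by EMD; the real procedure is in the paper."""
    return signal  # identity here

def decompose_trials(X, n_imfs=3):
    """Decompose each channel of each trial into its first n_imfs IMFs.

    X: array (n_trials, n_channels, n_samples)
    Returns: array (n_imfs, n_trials, n_channels, n_samples)
    """
    emd = EMD()
    out = np.zeros((n_imfs,) + X.shape)
    for t in range(X.shape[0]):
        for c in range(X.shape[1]):
            imfs = emd(truncation_thresholds(X[t, c]))
            k = min(n_imfs, imfs.shape[0])
            out[:k, t, c] = imfs[:k]
    return out

def imf_csp_features(imf_trials, y, n_components=2):
    """Fit one CSP per IMF and concatenate the log-variance features."""
    feats = []
    for imf in imf_trials:  # imf: (n_trials, n_channels, n_samples)
        csp = CSP(n_components=n_components, log=True)
        feats.append(csp.fit_transform(imf, y))
    return np.hstack(feats)

# Synthetic data shaped like a two-class, three-channel (e.g. C3, Cz, C4)
# motor imagery recording, for demonstration only.
rng = np.random.default_rng(0)
X = rng.standard_normal((40, 3, 256))  # 40 trials, 3 channels, 256 samples
y = np.repeat([0, 1], 20)              # left- vs. right-hand imagery labels
features = imf_csp_features(decompose_trials(X), y)
scores = cross_val_score(LinearDiscriminantAnalysis(), features, y, cv=5)
print("CV accuracy: %.2f" % scores.mean())
```

Note that for brevity the CSP filters are fit on the full set before cross-validation; a faithful evaluation would fit them inside each fold, and on real data one would substitute the actual TT extrema adjustment and the recorded EEG trials.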