Matches in SemOpenAlex for { <https://semopenalex.org/work/W4385780126> ?p ?o ?g. }
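The matches below can be reproduced programmatically. The snippet that follows is a minimal sketch, assuming the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql and standard SPARQL JSON results; it queries only `?p ?o` for simplicity, while the pattern above additionally binds a graph variable `?g` (reported as `@default` in each line of the listing).

```python
import requests

# Minimal sketch (not necessarily the exact query behind this listing): fetch all
# predicate/object pairs for the work, assuming the public SemOpenAlex SPARQL
# endpoint at https://semopenalex.org/sparql accepts GET queries with JSON results.
ENDPOINT = "https://semopenalex.org/sparql"
QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W4385780126> ?p ?o .
}
"""

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"])
```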
- W4385780126 endingPage "105312" @default.
- W4385780126 startingPage "105312" @default.
- W4385780126 abstract "Emotion recognition based on electroencephalography (EEG) is of great importance in the field of human–computer interaction. In recent years, deep learning methods, especially convolutional neural networks (CNNs), have shown great potential for emotion recognition. However, full-channel EEG signals can introduce redundant data and hardware complexity, and the inherently local nature of CNNs ignores the relationships among different channels. To address these issues, this work combines channel selection with a new network. For channel selection, time-domain and frequency-domain features are first extracted, and the ReliefF algorithm is used to select initial channels with a high contribution to the emotion recognition task based on these features. Because ReliefF considers only each channel's contribution, redundancy among channels may be overlooked, so the max-relevance and min-redundancy (mRMR) algorithm is then employed to reduce this redundancy and obtain the final channels. This combined procedure is named ReliefF-mRMR. Inspired by EEGNet and the capsule network (CapsNet), we combine their advantages and propose Caps-EEGNet, which fully exploits frequency and spatial information and models the relationships among channels. The channels selected by ReliefF-mRMR are mostly located over the frontal area, which is consistent with findings in related studies. Caps-EEGNet achieves average accuracies of 96.67%, 96.75% and 96.64% on the valence, arousal and dominance dimensions of the DEAP dataset, and 91.12%, 92.6% and 93.74% on the corresponding dimensions of the DREAMER dataset, outperforming other state-of-the-art methods. Experiments with the 8 selected channels and with all channels show only a slight difference in accuracy, while computation with 8 channels is faster. In addition, the 8 channels selected by ReliefF-mRMR perform better than 8 randomly selected channels. These findings are valuable for practical EEG-based emotion recognition systems." @default. (A sketch of the ReliefF-mRMR channel selection described in this abstract appears after this listing.)
- W4385780126 created "2023-08-13" @default.
- W4385780126 creator A5026385971 @default.
- W4385780126 creator A5060107207 @default.
- W4385780126 creator A5080445836 @default.
- W4385780126 creator A5082139155 @default.
- W4385780126 creator A5091654425 @default.
- W4385780126 date "2023-09-01" @default.
- W4385780126 modified "2023-10-17" @default.
- W4385780126 title "A novel caps-EEGNet combined with channel selection for EEG-based emotion recognition" @default.
- W4385780126 cites W1500895378 @default.
- W4385780126 cites W1808644423 @default.
- W4385780126 cites W1947251450 @default.
- W4385780126 cites W1999829689 @default.
- W4385780126 cites W2002055708 @default.
- W4385780126 cites W2015076179 @default.
- W4385780126 cites W2032254851 @default.
- W4385780126 cites W2036309320 @default.
- W4385780126 cites W2043181832 @default.
- W4385780126 cites W2059923560 @default.
- W4385780126 cites W2068946839 @default.
- W4385780126 cites W2100409538 @default.
- W4385780126 cites W2101972387 @default.
- W4385780126 cites W2131987814 @default.
- W4385780126 cites W2139564752 @default.
- W4385780126 cites W2155131749 @default.
- W4385780126 cites W2464929676 @default.
- W4385780126 cites W2523577021 @default.
- W4385780126 cites W2599124244 @default.
- W4385780126 cites W2610184357 @default.
- W4385780126 cites W2623355927 @default.
- W4385780126 cites W2734337238 @default.
- W4385780126 cites W2790404832 @default.
- W4385780126 cites W2790471600 @default.
- W4385780126 cites W2896297654 @default.
- W4385780126 cites W2901337091 @default.
- W4385780126 cites W2903462437 @default.
- W4385780126 cites W2913846632 @default.
- W4385780126 cites W2934123712 @default.
- W4385780126 cites W2960600329 @default.
- W4385780126 cites W2970007912 @default.
- W4385780126 cites W2994828478 @default.
- W4385780126 cites W3006383880 @default.
- W4385780126 cites W3006715241 @default.
- W4385780126 cites W3014163061 @default.
- W4385780126 cites W3027581678 @default.
- W4385780126 cites W3043664281 @default.
- W4385780126 cites W3086601978 @default.
- W4385780126 cites W3091907675 @default.
- W4385780126 cites W3102455230 @default.
- W4385780126 cites W3108087271 @default.
- W4385780126 cites W3119911037 @default.
- W4385780126 cites W3150499614 @default.
- W4385780126 cites W3172153656 @default.
- W4385780126 cites W3186240196 @default.
- W4385780126 cites W4200175924 @default.
- W4385780126 cites W4285102346 @default.
- W4385780126 doi "https://doi.org/10.1016/j.bspc.2023.105312" @default.
- W4385780126 hasPublicationYear "2023" @default.
- W4385780126 type Work @default.
- W4385780126 citedByCount "0" @default.
- W4385780126 crossrefType "journal-article" @default.
- W4385780126 hasAuthorship W4385780126A5026385971 @default.
- W4385780126 hasAuthorship W4385780126A5060107207 @default.
- W4385780126 hasAuthorship W4385780126A5080445836 @default.
- W4385780126 hasAuthorship W4385780126A5082139155 @default.
- W4385780126 hasAuthorship W4385780126A5091654425 @default.
- W4385780126 hasConcept C111919701 @default.
- W4385780126 hasConcept C118552586 @default.
- W4385780126 hasConcept C119857082 @default.
- W4385780126 hasConcept C121332964 @default.
- W4385780126 hasConcept C138885662 @default.
- W4385780126 hasConcept C152124472 @default.
- W4385780126 hasConcept C153180895 @default.
- W4385780126 hasConcept C154945302 @default.
- W4385780126 hasConcept C15744967 @default.
- W4385780126 hasConcept C168900304 @default.
- W4385780126 hasConcept C2779808786 @default.
- W4385780126 hasConcept C28490314 @default.
- W4385780126 hasConcept C41008148 @default.
- W4385780126 hasConcept C41895202 @default.
- W4385780126 hasConcept C522805319 @default.
- W4385780126 hasConcept C62520636 @default.
- W4385780126 hasConcept C81363708 @default.
- W4385780126 hasConceptScore W4385780126C111919701 @default.
- W4385780126 hasConceptScore W4385780126C118552586 @default.
- W4385780126 hasConceptScore W4385780126C119857082 @default.
- W4385780126 hasConceptScore W4385780126C121332964 @default.
- W4385780126 hasConceptScore W4385780126C138885662 @default.
- W4385780126 hasConceptScore W4385780126C152124472 @default.
- W4385780126 hasConceptScore W4385780126C153180895 @default.
- W4385780126 hasConceptScore W4385780126C154945302 @default.
- W4385780126 hasConceptScore W4385780126C15744967 @default.
- W4385780126 hasConceptScore W4385780126C168900304 @default.
- W4385780126 hasConceptScore W4385780126C2779808786 @default.
- W4385780126 hasConceptScore W4385780126C28490314 @default.
- W4385780126 hasConceptScore W4385780126C41008148 @default.
- W4385780126 hasConceptScore W4385780126C41895202 @default.
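The abstract above describes a two-stage channel selection: ReliefF pre-selects high-contribution channels, then mRMR removes redundant ones. The sketch below illustrates that idea only; it is not the authors' implementation. It uses a simplified ReliefF (Manhattan distance, no class-prior weighting) and a greedy mRMR built on scikit-learn's mutual-information estimators; the variable names (`features`, `labels`), the one-feature-per-channel layout, and the hyperparameters (k neighbours, 16 candidates, 8 final channels) are illustrative assumptions.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression


def relieff_scores(X, y, k=10):
    """Simplified ReliefF: reward features whose values separate nearest misses
    from nearest hits (no class-prior weighting, Manhattan distance)."""
    n, d = X.shape
    scores = np.zeros(d)
    for i in range(n):
        diffs = np.abs(X - X[i])          # per-feature distances to sample i
        dist = diffs.sum(axis=1)
        dist[i] = np.inf                  # exclude the sample itself
        same = np.where(y == y[i])[0]
        same = same[same != i]
        other = np.where(y != y[i])[0]
        hits = same[np.argsort(dist[same])[:k]]
        misses = other[np.argsort(dist[other])[:k]]
        scores += diffs[misses].mean(axis=0) - diffs[hits].mean(axis=0)
    return scores / n


def mrmr_select(X, y, candidates, n_select=8):
    """Greedy max-relevance / min-redundancy selection over the candidate columns."""
    relevance = mutual_info_classif(X[:, candidates], y, random_state=0)
    selected, remaining = [], list(range(len(candidates)))
    while remaining and len(selected) < n_select:
        best, best_score = None, -np.inf
        for j in remaining:
            redundancy = 0.0
            if selected:
                redundancy = np.mean([
                    mutual_info_regression(
                        X[:, [candidates[s]]], X[:, candidates[j]], random_state=0
                    )[0]
                    for s in selected
                ])
            score = relevance[j] - redundancy   # mRMR criterion: relevance minus redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
        remaining.remove(best)
    return [candidates[j] for j in selected]


# Toy usage: random data standing in for one aggregated time/frequency feature
# per EEG channel per trial (shapes and label coding are illustrative).
rng = np.random.default_rng(0)
features = rng.standard_normal((200, 32))            # 200 trials x 32 channels
labels = rng.integers(0, 2, size=200)                # e.g. low/high valence

top16 = np.argsort(relieff_scores(features, labels))[::-1][:16]    # ReliefF stage
channels = mrmr_select(features, labels, list(top16), n_select=8)  # mRMR stage
print("selected channel indices:", channels)
```

On real data, `features` would hold the extracted time- and frequency-domain descriptors per channel rather than random numbers, and the selected indices would map back to electrode names.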