Matches in SemOpenAlex for { <https://semopenalex.org/work/W4381942695> ?p ?o ?g. }
- W4381942695 endingPage "2232" @default.
- W4381942695 startingPage "2232" @default.
- W4381942695 abstract "Automatic emotion recognition from electroencephalogram (EEG) signals can be considered a core component of brain–computer interface (BCI) systems. In recent years, many researchers have presented algorithms for the automatic classification of emotions from EEG signals and have achieved promising results; however, lack of stability, high error, and low accuracy remain the central gaps in this line of research. A model that is stable, highly accurate, and low in error is therefore essential for the automatic classification of emotions. In this research, a model based on Deep Convolutional Neural Networks (DCNNs) is presented that can classify three emotions (positive, negative, and neutral) from EEG signals recorded under musical stimuli with high reliability. For this purpose, a comprehensive database of EEG signals was collected while volunteers listened to positive and negative music intended to stimulate the corresponding emotional states. The architecture of the proposed model consists of six convolutional layers and two fully connected layers. Different feature-learning and hand-crafted feature selection/extraction algorithms were investigated and compared for the emotion classification task. The proposed model achieved 98% accuracy for two classes (positive and negative) and 96% for three classes (positive, neutral, and negative), which is very promising compared with the results of previous research. For a fuller evaluation, the proposed model was also tested in noisy environments; across a wide range of SNRs, the classification accuracy remained greater than 90%. Given its high performance, the proposed model can be used in brain–computer interface environments." @default.
- W4381942695 created "2023-06-26" @default.
- W4381942695 creator A5029607373 @default.
- W4381942695 creator A5048822873 @default.
- W4381942695 creator A5054320766 @default.
- W4381942695 creator A5063416360 @default.
- W4381942695 date "2023-05-14" @default.
- W4381942695 modified "2023-10-16" @default.
- W4381942695 title "Customized 2D CNN Model for the Automatic Emotion Recognition Based on EEG Signals" @default.
- W4381942695 cites W2054341704 @default.
- W4381942695 cites W2126707560 @default.
- W4381942695 cites W2129225251 @default.
- W4381942695 cites W2165880777 @default.
- W4381942695 cites W2513975684 @default.
- W4381942695 cites W2613375858 @default.
- W4381942695 cites W2784665486 @default.
- W4381942695 cites W2790814155 @default.
- W4381942695 cites W2792191740 @default.
- W4381942695 cites W2799331981 @default.
- W4381942695 cites W2810418809 @default.
- W4381942695 cites W2922265930 @default.
- W4381942695 cites W2946526173 @default.
- W4381942695 cites W2963873807 @default.
- W4381942695 cites W2969889150 @default.
- W4381942695 cites W3000232078 @default.
- W4381942695 cites W3001587372 @default.
- W4381942695 cites W3008290004 @default.
- W4381942695 cites W3012159372 @default.
- W4381942695 cites W3045665366 @default.
- W4381942695 cites W3111259518 @default.
- W4381942695 cites W3114912940 @default.
- W4381942695 cites W3124539583 @default.
- W4381942695 cites W3153380589 @default.
- W4381942695 cites W3193300679 @default.
- W4381942695 cites W4214839300 @default.
- W4381942695 cites W4292338326 @default.
- W4381942695 cites W4300862723 @default.
- W4381942695 cites W4306149369 @default.
- W4381942695 cites W4308513655 @default.
- W4381942695 cites W4313573003 @default.
- W4381942695 cites W4315778234 @default.
- W4381942695 cites W4327955635 @default.
- W4381942695 cites W4362638295 @default.
- W4381942695 doi "https://doi.org/10.3390/electronics12102232" @default.
- W4381942695 hasPublicationYear "2023" @default.
- W4381942695 type Work @default.
- W4381942695 citedByCount "4" @default.
- W4381942695 countsByYear W43819426952023 @default.
- W4381942695 crossrefType "journal-article" @default.
- W4381942695 hasAuthorship W4381942695A5029607373 @default.
- W4381942695 hasAuthorship W4381942695A5048822873 @default.
- W4381942695 hasAuthorship W4381942695A5054320766 @default.
- W4381942695 hasAuthorship W4381942695A5063416360 @default.
- W4381942695 hasBestOaLocation W43819426951 @default.
- W4381942695 hasConcept C112972136 @default.
- W4381942695 hasConcept C118552586 @default.
- W4381942695 hasConcept C119857082 @default.
- W4381942695 hasConcept C138885662 @default.
- W4381942695 hasConcept C153180895 @default.
- W4381942695 hasConcept C154945302 @default.
- W4381942695 hasConcept C15744967 @default.
- W4381942695 hasConcept C173201364 @default.
- W4381942695 hasConcept C206310091 @default.
- W4381942695 hasConcept C2776401178 @default.
- W4381942695 hasConcept C28490314 @default.
- W4381942695 hasConcept C41008148 @default.
- W4381942695 hasConcept C41895202 @default.
- W4381942695 hasConcept C522805319 @default.
- W4381942695 hasConcept C52622490 @default.
- W4381942695 hasConcept C81363708 @default.
- W4381942695 hasConceptScore W4381942695C112972136 @default.
- W4381942695 hasConceptScore W4381942695C118552586 @default.
- W4381942695 hasConceptScore W4381942695C119857082 @default.
- W4381942695 hasConceptScore W4381942695C138885662 @default.
- W4381942695 hasConceptScore W4381942695C153180895 @default.
- W4381942695 hasConceptScore W4381942695C154945302 @default.
- W4381942695 hasConceptScore W4381942695C15744967 @default.
- W4381942695 hasConceptScore W4381942695C173201364 @default.
- W4381942695 hasConceptScore W4381942695C206310091 @default.
- W4381942695 hasConceptScore W4381942695C2776401178 @default.
- W4381942695 hasConceptScore W4381942695C28490314 @default.
- W4381942695 hasConceptScore W4381942695C41008148 @default.
- W4381942695 hasConceptScore W4381942695C41895202 @default.
- W4381942695 hasConceptScore W4381942695C522805319 @default.
- W4381942695 hasConceptScore W4381942695C52622490 @default.
- W4381942695 hasConceptScore W4381942695C81363708 @default.
- W4381942695 hasIssue "10" @default.
- W4381942695 hasLocation W43819426951 @default.
- W4381942695 hasOpenAccess W4381942695 @default.
- W4381942695 hasPrimaryLocation W43819426951 @default.
- W4381942695 hasRelatedWork W2059299633 @default.
- W4381942695 hasRelatedWork W2128739463 @default.
- W4381942695 hasRelatedWork W2384010565 @default.
- W4381942695 hasRelatedWork W2546942002 @default.
- W4381942695 hasRelatedWork W2732542196 @default.
- W4381942695 hasRelatedWork W2760085659 @default.
- W4381942695 hasRelatedWork W2940977206 @default.
- W4381942695 hasRelatedWork W2977314777 @default.
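The triple pattern at the top of this listing (`<https://semopenalex.org/work/W4381942695> ?p ?o ?g.`) can be reproduced programmatically. The following is a minimal sketch, not taken from the source; it assumes SemOpenAlex exposes a public SPARQL service at `https://semopenalex.org/sparql` and simply lists predicate/object pairs for the work.

```python
# Minimal sketch: fetch all (predicate, object) pairs for the work shown above.
# The endpoint URL below is an assumption about where SemOpenAlex serves SPARQL.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed SemOpenAlex SPARQL endpoint

QUERY = """
SELECT ?p ?o
WHERE {
  <https://semopenalex.org/work/W4381942695> ?p ?o .
}
"""

def fetch_work_triples():
    """Return the SPARQL JSON result bindings for the work's triples."""
    response = requests.get(
        ENDPOINT,
        params={"query": QUERY},
        headers={"Accept": "application/sparql-results+json"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["results"]["bindings"]

if __name__ == "__main__":
    for binding in fetch_work_triples():
        print(binding["p"]["value"], binding["o"]["value"])
```

The listing above additionally carries a graph variable (`?g`); if the endpoint exposes named graphs, the pattern could be wrapped in a `GRAPH ?g { ... }` clause to recover it.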
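The abstract quoted above describes a 2D DCNN with six convolutional layers and two fully connected layers for two- or three-class emotion classification. The sketch below illustrates that high-level shape only; the filter counts, kernel sizes, pooling, and the 64×64 single-channel input representation are all assumptions, not details from the paper.

```python
# Illustrative sketch of a 2D CNN with six conv layers and two fully connected
# layers, matching only the abstract's high-level description. Hyperparameters
# and the input representation are assumed, not taken from the source.
import torch
import torch.nn as nn

class EmotionCNN(nn.Module):
    def __init__(self, in_channels: int = 1, num_classes: int = 3):
        super().__init__()
        channels = [16, 32, 64, 64, 128, 128]  # assumed filters per conv layer
        layers, prev = [], in_channels
        for out in channels:
            layers += [
                nn.Conv2d(prev, out, kernel_size=3, padding=1),
                nn.BatchNorm2d(out),
                nn.ReLU(inplace=True),
                nn.MaxPool2d(2),
            ]
            prev = out
        self.features = nn.Sequential(*layers)
        self.pool = nn.AdaptiveAvgPool2d(1)   # head becomes input-size independent
        self.classifier = nn.Sequential(       # the two fully connected layers
            nn.Linear(prev, 64),
            nn.ReLU(inplace=True),
            nn.Linear(64, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        x = self.pool(x).flatten(1)
        return self.classifier(x)

if __name__ == "__main__":
    # Example: batch of 8 assumed single-channel 64x64 "EEG images".
    model = EmotionCNN()
    logits = model(torch.randn(8, 1, 64, 64))
    print(logits.shape)  # torch.Size([8, 3])
```

Setting `num_classes=2` would correspond to the two-class (positive/negative) variant reported in the abstract.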