Matches in SemOpenAlex for { <https://semopenalex.org/work/W3156860372> ?p ?o ?g. }
Showing items 1 to 87 of 87, with 100 items per page.
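
These triples can also be retrieved programmatically. Below is a minimal sketch using the SPARQLWrapper Python library, assuming the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql (the `?g` named-graph variable from the pattern above is omitted for brevity):

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Assumed public SemOpenAlex SPARQL endpoint.
endpoint = SPARQLWrapper("https://semopenalex.org/sparql")
endpoint.setQuery("""
    SELECT ?p ?o WHERE {
        <https://semopenalex.org/work/W3156860372> ?p ?o .
    }
""")
endpoint.setReturnFormat(JSON)

results = endpoint.query().convert()
# Each binding mirrors one "- W3156860372 <predicate> <object>" row below.
for binding in results["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```
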
- W3156860372 endingPage "2327" @default.
- W3156860372 startingPage "2315" @default.
- W3156860372 abstract "With the rapid development of deep learning and artificial intelligence, affective computing has attracted increasing research attention as a branch field. Human emotions are diverse and are directly expressed via both non-physiological indicators, such as facial expressions, and physiological indicators, such as electroencephalogram (EEG) signals. However, whether expression-based or EEG-based, these remain single-mode approaches to emotion recognition. Multi-mode fusion can improve recognition accuracy by exploiting feature diversity and cross-modal correlation. Therefore, three models were established: a single-mode EEG-long short-term memory (LSTM) model, a Facial-LSTM model that uses facial expressions to guide the processing of EEG data, and a multi-mode LSTM-convolutional neural network (CNN) model that combines expressions and EEG. Their average classification accuracies were 86.48%, 89.42%, and 93.13%, respectively. Compared with the EEG-LSTM model, the Facial-LSTM model improved accuracy by about 3%, indicating that the expression mode helped eliminate EEG segments that contained few or no emotional features. Compared with the Facial-LSTM model, the LSTM-CNN model improved classification accuracy by a further 3.7%, showing that facial-expression features complement the EEG features to a certain extent. Therefore, using features from multiple modalities conforms to the way humans express emotion and improves feature diversity, facilitating further emotion recognition research." @default. (a fusion-model sketch appears after this list)
- W3156860372 created "2021-04-26" @default.
- W3156860372 creator A5010168169 @default.
- W3156860372 creator A5063798691 @default.
- W3156860372 creator A5073361851 @default.
- W3156860372 creator A5084759684 @default.
- W3156860372 date "2021-01-01" @default.
- W3156860372 modified "2023-10-17" @default.
- W3156860372 title "Emotion Analysis: Bimodal Fusion of Facial Expressions and EEG" @default.
- W3156860372 cites W2079735306 @default.
- W3156860372 cites W2621864722 @default.
- W3156860372 cites W2755014019 @default.
- W3156860372 cites W2765856398 @default.
- W3156860372 cites W2970405962 @default.
- W3156860372 cites W2980035261 @default.
- W3156860372 cites W2983888960 @default.
- W3156860372 cites W2989989812 @default.
- W3156860372 cites W3007574032 @default.
- W3156860372 cites W3035471470 @default.
- W3156860372 cites W3080693455 @default.
- W3156860372 cites W3127179589 @default.
- W3156860372 cites W4206073726 @default.
- W3156860372 doi "https://doi.org/10.32604/cmc.2021.016832" @default.
- W3156860372 hasPublicationYear "2021" @default.
- W3156860372 type Work @default.
- W3156860372 sameAs 3156860372 @default.
- W3156860372 citedByCount "3" @default.
- W3156860372 countsByYear W31568603722022 @default.
- W3156860372 crossrefType "journal-article" @default.
- W3156860372 hasAuthorship W3156860372A5010168169 @default.
- W3156860372 hasAuthorship W3156860372A5063798691 @default.
- W3156860372 hasAuthorship W3156860372A5073361851 @default.
- W3156860372 hasAuthorship W3156860372A5084759684 @default.
- W3156860372 hasBestOaLocation W31568603721 @default.
- W3156860372 hasConcept C108583219 @default.
- W3156860372 hasConcept C138885662 @default.
- W3156860372 hasConcept C153180895 @default.
- W3156860372 hasConcept C154945302 @default.
- W3156860372 hasConcept C15744967 @default.
- W3156860372 hasConcept C169760540 @default.
- W3156860372 hasConcept C195704467 @default.
- W3156860372 hasConcept C206310091 @default.
- W3156860372 hasConcept C2776401178 @default.
- W3156860372 hasConcept C2777438025 @default.
- W3156860372 hasConcept C28490314 @default.
- W3156860372 hasConcept C41008148 @default.
- W3156860372 hasConcept C41895202 @default.
- W3156860372 hasConcept C522805319 @default.
- W3156860372 hasConcept C6438553 @default.
- W3156860372 hasConcept C81363708 @default.
- W3156860372 hasConceptScore W3156860372C108583219 @default.
- W3156860372 hasConceptScore W3156860372C138885662 @default.
- W3156860372 hasConceptScore W3156860372C153180895 @default.
- W3156860372 hasConceptScore W3156860372C154945302 @default.
- W3156860372 hasConceptScore W3156860372C15744967 @default.
- W3156860372 hasConceptScore W3156860372C169760540 @default.
- W3156860372 hasConceptScore W3156860372C195704467 @default.
- W3156860372 hasConceptScore W3156860372C206310091 @default.
- W3156860372 hasConceptScore W3156860372C2776401178 @default.
- W3156860372 hasConceptScore W3156860372C2777438025 @default.
- W3156860372 hasConceptScore W3156860372C28490314 @default.
- W3156860372 hasConceptScore W3156860372C41008148 @default.
- W3156860372 hasConceptScore W3156860372C41895202 @default.
- W3156860372 hasConceptScore W3156860372C522805319 @default.
- W3156860372 hasConceptScore W3156860372C6438553 @default.
- W3156860372 hasConceptScore W3156860372C81363708 @default.
- W3156860372 hasIssue "2" @default.
- W3156860372 hasLocation W31568603721 @default.
- W3156860372 hasOpenAccess W3156860372 @default.
- W3156860372 hasPrimaryLocation W31568603721 @default.
- W3156860372 hasRelatedWork W2899077601 @default.
- W3156860372 hasRelatedWork W2942994697 @default.
- W3156860372 hasRelatedWork W2990866961 @default.
- W3156860372 hasRelatedWork W3020496054 @default.
- W3156860372 hasRelatedWork W3033658423 @default.
- W3156860372 hasRelatedWork W3118234491 @default.
- W3156860372 hasRelatedWork W3169597903 @default.
- W3156860372 hasRelatedWork W3180630304 @default.
- W3156860372 hasRelatedWork W4214561993 @default.
- W3156860372 hasRelatedWork W4313182991 @default.
- W3156860372 hasVolume "68" @default.
- W3156860372 isParatext "false" @default.
- W3156860372 isRetracted "false" @default.
- W3156860372 magId "3156860372" @default.
- W3156860372 workType "article" @default.
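
The abstract above describes feature-level fusion of an LSTM branch for EEG sequences with a CNN branch for facial expressions. The sketch below illustrates that general pattern in PyTorch; the input shapes, layer sizes, and concatenation-based fusion are illustrative assumptions, not the architecture reported in the paper:

```python
import torch
import torch.nn as nn

class BimodalLSTMCNN(nn.Module):
    """Illustrative LSTM-CNN fusion classifier: an LSTM encodes EEG
    time series, a small CNN encodes face crops, and the two feature
    vectors are concatenated before a shared linear classifier.
    All dimensions here are assumptions for the sketch."""

    def __init__(self, eeg_channels=32, eeg_hidden=64, n_classes=2):
        super().__init__()
        # LSTM branch: input (batch, time, eeg_channels)
        self.eeg_lstm = nn.LSTM(eeg_channels, eeg_hidden, batch_first=True)
        # CNN branch: input (batch, 1, 48, 48) grayscale face crops
        self.face_cnn = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(32 * 12 * 12, 64), nn.ReLU(),
        )
        self.classifier = nn.Linear(eeg_hidden + 64, n_classes)

    def forward(self, eeg, face):
        _, (h_n, _) = self.eeg_lstm(eeg)   # h_n: (1, batch, eeg_hidden)
        eeg_feat = h_n[-1]                 # final hidden state per trial
        face_feat = self.face_cnn(face)    # (batch, 64)
        fused = torch.cat([eeg_feat, face_feat], dim=1)  # feature-level fusion
        return self.classifier(fused)

# Smoke test on random tensors (4 trials, 128 time steps, 32 EEG channels).
model = BimodalLSTMCNN()
eeg = torch.randn(4, 128, 32)
face = torch.randn(4, 1, 48, 48)
print(model(eeg, face).shape)  # torch.Size([4, 2])
```

Concatenating the two branch embeddings before a shared classifier is the simplest feature-level fusion; the stepwise accuracy gains reported in the abstract (86.48% → 89.42% → 93.13%) are consistent with each added modality contributing complementary features.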