Matches in SemOpenAlex for { <https://semopenalex.org/work/W4200537325> ?p ?o ?g. }
Showing items 1 to 81 of 81, with 100 items per page.
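The triple pattern in the header above can be fetched programmatically. A minimal sketch, assuming the public SemOpenAlex SPARQL endpoint lives at `https://semopenalex.org/sparql` and accepts GET requests with JSON results (the endpoint URL and the GRAPH rewriting of the quad pattern `?p ?o ?g` are assumptions, not confirmed by this record):

```python
from urllib.parse import urlencode

# Assumed endpoint; adjust if SemOpenAlex exposes SPARQL elsewhere.
SEMOPENALEX_SPARQL = "https://semopenalex.org/sparql"

def build_query_url(work_iri: str) -> str:
    """Build a GET URL asking for every (predicate, object, graph)
    stored for the given work, mirroring the { <work> ?p ?o ?g. }
    pattern shown above, expressed as a standard SPARQL GRAPH query."""
    query = (
        "SELECT ?p ?o ?g WHERE { "
        f"GRAPH ?g {{ <{work_iri}> ?p ?o }} "
        "} LIMIT 100"
    )
    params = urlencode({"query": query, "format": "json"})
    return f"{SEMOPENALEX_SPARQL}?{params}"

url = build_query_url("https://semopenalex.org/work/W4200537325")
```

Sending `url` with any HTTP client should return the 81 triples listed below as JSON bindings.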
- W4200537325 endingPage "592" @default.
- W4200537325 startingPage "581" @default.
- W4200537325 abstract "This paper addresses emotion recognition as a multimodal problem, applying machine learning techniques to acquire and analyse EEG and speech signals. Fusing EEG-based and speech-based emotion recognition allows different emotional states to be detected and recognized more accurately, and such a fused brain-signal and voice-signal model can be designed and implemented in a brain–computer interface (BCI) to capture emotions more naturally. These processing methods are explored as a fusion model for detecting emotional states accurately. According to the literature, recognizing different emotions generally requires many assumptions and algorithms, and most work focuses on a single modality such as voice, facial expression, or biosignals. Humans, however, combine numerous modalities to infer emotional states, since emotion affects almost every mode of expression; integrating multiple modalities therefore promises higher recognition rates and accuracy in less time. In recent years, researchers from different fields have taken a keen interest in the effective analysis and recognition of emotions, designing cognitive models with various machine learning techniques (MLT) based on facial expressions, voice, behaviour (gesture/posture), and biosensors (EEG, ECG, EMG, EOG, etc.). Emotions play a major role in identifying human health conditions. Recognizing emotions from a single modality is less accurate and can be futile, since humans may deliberately hide their real emotions. A cognitive model of emotion recognition that combines voice and EEG signals, responding with high accuracy to fluctuations in affective states, can therefore provide useful features of emotional states.
In this paper, emotion recognition based on multi-channel EEG signals and voice signals is reviewed following the standard emotion-identification pipeline, covering different feature extraction, feature selection, and machine learning techniques aimed at high classification accuracy." @default.
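The abstract describes a feature-level fusion pipeline: extract features from each modality, concatenate them, then classify. A minimal sketch of that idea on synthetic data; the feature functions (per-channel mean/variance for EEG, energy and zero-crossing rate for speech), the nearest-centroid classifier, and all dimensions are hypothetical placeholders, not the paper's actual method:

```python
import numpy as np

def eeg_features(eeg: np.ndarray) -> np.ndarray:
    """Per-channel mean and variance as stand-in EEG features
    (real pipelines would use e.g. band power or entropy)."""
    return np.concatenate([eeg.mean(axis=1), eeg.var(axis=1)])

def speech_features(speech: np.ndarray) -> np.ndarray:
    """Signal energy and zero-crossing rate as stand-in speech features
    (real pipelines would use e.g. MFCCs or pitch)."""
    energy = float(np.mean(speech ** 2))
    zcr = float(np.mean(np.abs(np.diff(np.sign(speech))) > 0))
    return np.array([energy, zcr])

def fuse(eeg: np.ndarray, speech: np.ndarray) -> np.ndarray:
    """Feature-level fusion: concatenate the two modality vectors."""
    return np.concatenate([eeg_features(eeg), speech_features(speech)])

class NearestCentroid:
    """Tiny classifier standing in for the paper's ML techniques."""
    def fit(self, X, y):
        self.centroids_ = {
            c: np.mean([x for x, t in zip(X, y) if t == c], axis=0)
            for c in set(y)
        }
        return self

    def predict(self, X):
        return [min(self.centroids_,
                    key=lambda c: np.linalg.norm(x - self.centroids_[c]))
                for x in X]

# Synthetic demo: 2-channel EEG epochs and 1-D speech frames for two emotions.
rng = np.random.default_rng(0)
X, y = [], []
for label, offset in [("calm", 0.0), ("excited", 3.0)]:
    for _ in range(20):
        eeg = rng.normal(offset, 1.0, size=(2, 128))
        speech = rng.normal(0.0, 1.0 + offset, size=256)
        X.append(fuse(eeg, speech))
        y.append(label)

clf = NearestCentroid().fit(X, y)
accuracy = sum(p == t for p, t in zip(clf.predict(X), y)) / len(y)
```

Swapping the placeholder features for real EEG band-power and speech MFCC extractors, and the centroid rule for any standard classifier, gives the shape of the pipeline the abstract reviews.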
- W4200537325 created "2021-12-31" @default.
- W4200537325 creator A5006518172 @default.
- W4200537325 creator A5071198389 @default.
- W4200537325 creator A5091534648 @default.
- W4200537325 date "2021-12-14" @default.
- W4200537325 modified "2023-09-30" @default.
- W4200537325 title "Designing Multimodal Cognitive Model of Emotion Recognition Using Voice and EEG Signal" @default.
- W4200537325 cites W1591198932 @default.
- W4200537325 cites W1601314345 @default.
- W4200537325 cites W2027519266 @default.
- W4200537325 cites W2083742331 @default.
- W4200537325 cites W2098844365 @default.
- W4200537325 cites W2114872534 @default.
- W4200537325 cites W2164777163 @default.
- W4200537325 cites W2485253924 @default.
- W4200537325 cites W2591423112 @default.
- W4200537325 cites W2613758800 @default.
- W4200537325 doi "https://doi.org/10.1007/978-981-16-2761-3_51" @default.
- W4200537325 hasPublicationYear "2021" @default.
- W4200537325 type Work @default.
- W4200537325 citedByCount "1" @default.
- W4200537325 countsByYear W42005373252022 @default.
- W4200537325 crossrefType "book-chapter" @default.
- W4200537325 hasAuthorship W4200537325A5006518172 @default.
- W4200537325 hasAuthorship W4200537325A5071198389 @default.
- W4200537325 hasAuthorship W4200537325A5091534648 @default.
- W4200537325 hasConcept C118552586 @default.
- W4200537325 hasConcept C144024400 @default.
- W4200537325 hasConcept C153180895 @default.
- W4200537325 hasConcept C154945302 @default.
- W4200537325 hasConcept C15744967 @default.
- W4200537325 hasConcept C169760540 @default.
- W4200537325 hasConcept C169900460 @default.
- W4200537325 hasConcept C195704467 @default.
- W4200537325 hasConcept C206310091 @default.
- W4200537325 hasConcept C207347870 @default.
- W4200537325 hasConcept C2777438025 @default.
- W4200537325 hasConcept C2779903281 @default.
- W4200537325 hasConcept C2780226545 @default.
- W4200537325 hasConcept C28490314 @default.
- W4200537325 hasConcept C36289849 @default.
- W4200537325 hasConcept C41008148 @default.
- W4200537325 hasConcept C522805319 @default.
- W4200537325 hasConcept C6438553 @default.
- W4200537325 hasConceptScore W4200537325C118552586 @default.
- W4200537325 hasConceptScore W4200537325C144024400 @default.
- W4200537325 hasConceptScore W4200537325C153180895 @default.
- W4200537325 hasConceptScore W4200537325C154945302 @default.
- W4200537325 hasConceptScore W4200537325C15744967 @default.
- W4200537325 hasConceptScore W4200537325C169760540 @default.
- W4200537325 hasConceptScore W4200537325C169900460 @default.
- W4200537325 hasConceptScore W4200537325C195704467 @default.
- W4200537325 hasConceptScore W4200537325C206310091 @default.
- W4200537325 hasConceptScore W4200537325C207347870 @default.
- W4200537325 hasConceptScore W4200537325C2777438025 @default.
- W4200537325 hasConceptScore W4200537325C2779903281 @default.
- W4200537325 hasConceptScore W4200537325C2780226545 @default.
- W4200537325 hasConceptScore W4200537325C28490314 @default.
- W4200537325 hasConceptScore W4200537325C36289849 @default.
- W4200537325 hasConceptScore W4200537325C41008148 @default.
- W4200537325 hasConceptScore W4200537325C522805319 @default.
- W4200537325 hasConceptScore W4200537325C6438553 @default.
- W4200537325 hasLocation W42005373251 @default.
- W4200537325 hasOpenAccess W4200537325 @default.
- W4200537325 hasPrimaryLocation W42005373251 @default.
- W4200537325 hasRelatedWork W2143350951 @default.
- W4200537325 hasRelatedWork W2899077601 @default.
- W4200537325 hasRelatedWork W2981754449 @default.
- W4200537325 hasRelatedWork W3156860372 @default.
- W4200537325 hasRelatedWork W3169597903 @default.
- W4200537325 hasRelatedWork W3188183700 @default.
- W4200537325 hasRelatedWork W3188190284 @default.
- W4200537325 hasRelatedWork W4200537325 @default.
- W4200537325 hasRelatedWork W4290996278 @default.
- W4200537325 hasRelatedWork W4366374509 @default.
- W4200537325 isParatext "false" @default.
- W4200537325 isRetracted "false" @default.
- W4200537325 workType "book-chapter" @default.