Matches in SemOpenAlex for { <https://semopenalex.org/work/W3165809733> ?p ?o ?g. }
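These matches can also be retrieved over the standard SPARQL protocol. Below is a minimal sketch, assuming SemOpenAlex's public endpoint at https://semopenalex.org/sparql and the Python requests library; the query mirrors the quad pattern above.

```python
# Minimal sketch: fetch the (?p, ?o, ?g) matches for this work via the SPARQL
# protocol.  The endpoint URL is an assumption (SemOpenAlex documents a public
# endpoint at https://semopenalex.org/sparql at the time of writing).
import requests

ENDPOINT = "https://semopenalex.org/sparql"
QUERY = """
SELECT ?p ?o ?g WHERE {
  GRAPH ?g { <https://semopenalex.org/work/W3165809733> ?p ?o . }
}
"""

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()

# Standard SPARQL JSON results: one binding dict per matched row.
for row in resp.json()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"])
```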
- W3165809733 endingPage "5135" @default.
- W3165809733 startingPage "5116" @default.
- W3165809733 abstract "Speech signal processing is an active area of research: speech is the most dominant means of exchanging information among human beings and the best way for human–computer interaction (HCI). Emotion recognition from a speech signal for human behavior assessment, known as speech emotion recognition (SER), is an emerging HCI area of exploration with various real-time applications. The performance of an efficient SER system depends on feature learning, which includes salient and discriminative information such as high-level deep features. In this paper, we propose a two-stream deep convolutional neural network with iterative neighborhood component analysis (INCA) to jointly learn spatial-spectral features and select the most discriminative optimal features for the final prediction. Our model is composed of two channels, and each channel is associated with a convolutional neural network structure to extract cues from the speech signals. The first channel extracts features from the spectral domain, and the second channel extracts features from the spatial domain; these are then fused and fed to the INCA to remove redundancy and select the optimal features for the final model training. The jointly refined features are passed through a fully connected network with a softmax classifier to yield the predictions of the different emotions. We trained the proposed system on three benchmarks, the EMO-DB, SAVEE, and RAVDESS emotional speech corpora, and tested its prediction performance, achieving recognition rates of 95%, 82%, and 85%, respectively. These results show the effectiveness and significance of the proposed system." @default.
- W3165809733 created "2021-06-07" @default.
- W3165809733 creator A5064081693 @default.
- W3165809733 creator A5075830673 @default.
- W3165809733 date "2021-05-26" @default.
- W3165809733 modified "2023-10-10" @default.
- W3165809733 title "Optimal feature selection based speech emotion recognition using two‐stream deep convolutional neural network" @default.
- W3165809733 cites W175750906 @default.
- W3165809733 cites W1966797434 @default.
- W3165809733 cites W2128171688 @default.
- W3165809733 cites W2144354855 @default.
- W3165809733 cites W2161073241 @default.
- W3165809733 cites W2183182206 @default.
- W3165809733 cites W2534377085 @default.
- W3165809733 cites W2599278603 @default.
- W3165809733 cites W2738561771 @default.
- W3165809733 cites W2766756589 @default.
- W3165809733 cites W2777468850 @default.
- W3165809733 cites W2795986449 @default.
- W3165809733 cites W2803193013 @default.
- W3165809733 cites W2810418809 @default.
- W3165809733 cites W2884739346 @default.
- W3165809733 cites W2885005742 @default.
- W3165809733 cites W2889717020 @default.
- W3165809733 cites W2901546194 @default.
- W3165809733 cites W2904938641 @default.
- W3165809733 cites W2905361499 @default.
- W3165809733 cites W2905917758 @default.
- W3165809733 cites W2910444986 @default.
- W3165809733 cites W2916113431 @default.
- W3165809733 cites W2919464470 @default.
- W3165809733 cites W2921024999 @default.
- W3165809733 cites W2936372954 @default.
- W3165809733 cites W2944697401 @default.
- W3165809733 cites W2951082691 @default.
- W3165809733 cites W2951123823 @default.
- W3165809733 cites W2959133507 @default.
- W3165809733 cites W2964370293 @default.
- W3165809733 cites W2967804986 @default.
- W3165809733 cites W2969889150 @default.
- W3165809733 cites W2970737019 @default.
- W3165809733 cites W2972811324 @default.
- W3165809733 cites W2980021808 @default.
- W3165809733 cites W2989473642 @default.
- W3165809733 cites W2994200508 @default.
- W3165809733 cites W2997399314 @default.
- W3165809733 cites W3008039831 @default.
- W3165809733 cites W3015449518 @default.
- W3165809733 cites W3019352575 @default.
- W3165809733 cites W3022013598 @default.
- W3165809733 cites W3022172938 @default.
- W3165809733 cites W3023726311 @default.
- W3165809733 cites W3025283630 @default.
- W3165809733 cites W3034330439 @default.
- W3165809733 cites W3038985617 @default.
- W3165809733 cites W3043308633 @default.
- W3165809733 cites W3084484668 @default.
- W3165809733 cites W3087484287 @default.
- W3165809733 cites W3087617596 @default.
- W3165809733 cites W3094173182 @default.
- W3165809733 cites W3095435288 @default.
- W3165809733 cites W3109961563 @default.
- W3165809733 cites W3118826611 @default.
- W3165809733 cites W3134450397 @default.
- W3165809733 cites W4254724182 @default.
- W3165809733 doi "https://doi.org/10.1002/int.22505" @default.
- W3165809733 hasPublicationYear "2021" @default.
- W3165809733 type Work @default.
- W3165809733 sameAs 3165809733 @default.
- W3165809733 citedByCount "50" @default.
- W3165809733 countsByYear W31658097332021 @default.
- W3165809733 countsByYear W31658097332022 @default.
- W3165809733 countsByYear W31658097332023 @default.
- W3165809733 crossrefType "journal-article" @default.
- W3165809733 hasAuthorship W3165809733A5064081693 @default.
- W3165809733 hasAuthorship W3165809733A5075830673 @default.
- W3165809733 hasBestOaLocation W31658097331 @default.
- W3165809733 hasConcept C108583219 @default.
- W3165809733 hasConcept C138885662 @default.
- W3165809733 hasConcept C148483581 @default.
- W3165809733 hasConcept C153180895 @default.
- W3165809733 hasConcept C154945302 @default.
- W3165809733 hasConcept C188441871 @default.
- W3165809733 hasConcept C2776401178 @default.
- W3165809733 hasConcept C2780719617 @default.
- W3165809733 hasConcept C28490314 @default.
- W3165809733 hasConcept C41008148 @default.
- W3165809733 hasConcept C41895202 @default.
- W3165809733 hasConcept C81363708 @default.
- W3165809733 hasConcept C95623464 @default.
- W3165809733 hasConcept C97931131 @default.
- W3165809733 hasConceptScore W3165809733C108583219 @default.
- W3165809733 hasConceptScore W3165809733C138885662 @default.
- W3165809733 hasConceptScore W3165809733C148483581 @default.
- W3165809733 hasConceptScore W3165809733C153180895 @default.
- W3165809733 hasConceptScore W3165809733C154945302 @default.
- W3165809733 hasConceptScore W3165809733C188441871 @default.
- W3165809733 hasConceptScore W3165809733C2776401178 @default.
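For context on the work itself: the abstract quoted above describes a two-stream deep CNN whose fused spectral and spatial features are pruned by INCA before a fully connected softmax classifier. The sketch below is a rough illustration only, not the authors' implementation; the input shapes, layer sizes, and the simple variance-based top-k selection standing in for INCA are all assumptions.

```python
# Illustrative sketch of a two-stream CNN front end with feature fusion and a
# placeholder feature-selection step standing in for INCA.  All shapes and
# hyperparameters are assumptions, not values from the paper.
import torch
import torch.nn as nn


class Stream(nn.Module):
    """One convolutional stream producing a fixed-size feature vector."""

    def __init__(self, in_channels: int, feat_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.proj = nn.Linear(64, feat_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.proj(self.net(x).flatten(1))


class TwoStreamSER(nn.Module):
    """Fuses spectral- and spatial-stream features, keeps top-k, classifies."""

    def __init__(self, num_emotions: int = 7, feat_dim: int = 128, k: int = 64):
        super().__init__()
        self.spectral = Stream(in_channels=1, feat_dim=feat_dim)
        self.spatial = Stream(in_channels=1, feat_dim=feat_dim)
        self.k = k
        self.classifier = nn.Sequential(
            nn.Linear(k, 64), nn.ReLU(), nn.Linear(64, num_emotions),
        )

    def forward(self, spec: torch.Tensor, spat: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.spectral(spec), self.spatial(spat)], dim=1)
        # Placeholder for INCA: keep the k highest-variance fused features.
        idx = fused.var(dim=0).topk(self.k).indices
        # Returns logits; softmax / cross-entropy would be applied outside.
        return self.classifier(fused[:, idx])


# Example forward pass on random spectrogram-like inputs for both streams.
model = TwoStreamSER()
logits = model(torch.randn(8, 1, 64, 64), torch.randn(8, 1, 64, 64))
print(logits.shape)  # torch.Size([8, 7])
```

In the paper the selection stage is an iterative neighborhood component analysis applied to the fused deep features; the variance-based top-k here only marks where that stage would sit in the pipeline.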