Matches in SemOpenAlex for { <https://semopenalex.org/work/W4313542967> ?p ?o ?g. }
Showing items 1 to 100 of 100, with 100 items per page.
- W4313542967 endingPage "18" @default.
- W4313542967 startingPage "1" @default.
- W4313542967 abstract "Humans express their emotions in a variety of ways, which has inspired research on multimodal fusion-based emotion recognition, where different modalities complement one another's information. However, extracting deep emotional features from different modalities and fusing them remains a challenging task. It is essential to exploit the advantages of different extraction and fusion approaches to capture the emotional information contained within and across modalities. In this paper, we present a novel multimodal emotion recognition framework called multimodal emotion recognition based on cascaded multichannel and hierarchical fusion (CMC-HF), in which visual, speech, and text signals are simultaneously utilized as multimodal inputs. First, three cascaded channels based on deep learning perform feature extraction for the three modalities separately, strengthening the extraction of deeper information within each modality and improving recognition performance. Second, an improved hierarchical fusion module promotes intermodality interactions among the three modalities and further improves recognition and classification accuracy. Finally, to validate the effectiveness of the designed CMC-HF model, experiments are conducted on two benchmark datasets, IEMOCAP and CMU-MOSI. The results show an increase of almost 2%∼3.2% in four-class accuracy on the IEMOCAP dataset and an improvement of 0.9%∼2.5% in average class accuracy on the CMU-MOSI dataset compared with existing state-of-the-art methods. The ablation results indicate that the cascaded feature extraction method and the hierarchical fusion method each contribute significantly to multimodal emotion recognition, suggesting that the three modalities carry deeper intermodality and intramodality information interactions. Hence, the proposed model has better overall performance, higher recognition efficiency, and better robustness." @default.
- W4313542967 created "2023-01-06" @default.
- W4313542967 creator A5003779651 @default.
- W4313542967 creator A5045005999 @default.
- W4313542967 creator A5090907346 @default.
- W4313542967 date "2023-01-05" @default.
- W4313542967 modified "2023-10-05" @default.
- W4313542967 title "Multimodal Emotion Recognition Based on Cascaded Multichannel and Hierarchical Fusion" @default.
- W4313542967 cites W2083543775 @default.
- W4313542967 cites W2099813784 @default.
- W4313542967 cites W2164471543 @default.
- W4313542967 cites W2600389231 @default.
- W4313542967 cites W2621864722 @default.
- W4313542967 cites W2658460662 @default.
- W4313542967 cites W2787581402 @default.
- W4313542967 cites W2889386526 @default.
- W4313542967 cites W2962736520 @default.
- W4313542967 cites W2962931510 @default.
- W4313542967 cites W2963702064 @default.
- W4313542967 cites W2963710346 @default.
- W4313542967 cites W2964051877 @default.
- W4313542967 cites W2964216663 @default.
- W4313542967 cites W3088631780 @default.
- W4313542967 cites W3107569919 @default.
- W4313542967 cites W3163470448 @default.
- W4313542967 cites W3174977508 @default.
- W4313542967 cites W3210120707 @default.
- W4313542967 cites W4229026033 @default.
- W4313542967 cites W4294170691 @default.
- W4313542967 cites W4297518389 @default.
- W4313542967 cites W4297899454 @default.
- W4313542967 cites W4309426823 @default.
- W4313542967 cites W4312015797 @default.
- W4313542967 doi "https://doi.org/10.1155/2023/9645611" @default.
- W4313542967 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/36643891" @default.
- W4313542967 hasPublicationYear "2023" @default.
- W4313542967 type Work @default.
- W4313542967 citedByCount "2" @default.
- W4313542967 countsByYear W43135429672023 @default.
- W4313542967 crossrefType "journal-article" @default.
- W4313542967 hasAuthorship W4313542967A5003779651 @default.
- W4313542967 hasAuthorship W4313542967A5045005999 @default.
- W4313542967 hasAuthorship W4313542967A5090907346 @default.
- W4313542967 hasBestOaLocation W43135429671 @default.
- W4313542967 hasConcept C119857082 @default.
- W4313542967 hasConcept C13280743 @default.
- W4313542967 hasConcept C138885662 @default.
- W4313542967 hasConcept C144024400 @default.
- W4313542967 hasConcept C153180895 @default.
- W4313542967 hasConcept C154945302 @default.
- W4313542967 hasConcept C185798385 @default.
- W4313542967 hasConcept C205649164 @default.
- W4313542967 hasConcept C2776401178 @default.
- W4313542967 hasConcept C2777438025 @default.
- W4313542967 hasConcept C2779903281 @default.
- W4313542967 hasConcept C2780226545 @default.
- W4313542967 hasConcept C28490314 @default.
- W4313542967 hasConcept C36289849 @default.
- W4313542967 hasConcept C41008148 @default.
- W4313542967 hasConcept C41895202 @default.
- W4313542967 hasConcept C52622490 @default.
- W4313542967 hasConceptScore W4313542967C119857082 @default.
- W4313542967 hasConceptScore W4313542967C13280743 @default.
- W4313542967 hasConceptScore W4313542967C138885662 @default.
- W4313542967 hasConceptScore W4313542967C144024400 @default.
- W4313542967 hasConceptScore W4313542967C153180895 @default.
- W4313542967 hasConceptScore W4313542967C154945302 @default.
- W4313542967 hasConceptScore W4313542967C185798385 @default.
- W4313542967 hasConceptScore W4313542967C205649164 @default.
- W4313542967 hasConceptScore W4313542967C2776401178 @default.
- W4313542967 hasConceptScore W4313542967C2777438025 @default.
- W4313542967 hasConceptScore W4313542967C2779903281 @default.
- W4313542967 hasConceptScore W4313542967C2780226545 @default.
- W4313542967 hasConceptScore W4313542967C28490314 @default.
- W4313542967 hasConceptScore W4313542967C36289849 @default.
- W4313542967 hasConceptScore W4313542967C41008148 @default.
- W4313542967 hasConceptScore W4313542967C41895202 @default.
- W4313542967 hasConceptScore W4313542967C52622490 @default.
- W4313542967 hasFunder F4320335777 @default.
- W4313542967 hasLocation W43135429671 @default.
- W4313542967 hasLocation W43135429672 @default.
- W4313542967 hasLocation W43135429673 @default.
- W4313542967 hasOpenAccess W4313542967 @default.
- W4313542967 hasPrimaryLocation W43135429671 @default.
- W4313542967 hasRelatedWork W1964120219 @default.
- W4313542967 hasRelatedWork W2000165426 @default.
- W4313542967 hasRelatedWork W2144059113 @default.
- W4313542967 hasRelatedWork W2146076056 @default.
- W4313542967 hasRelatedWork W2385132419 @default.
- W4313542967 hasRelatedWork W2546942002 @default.
- W4313542967 hasRelatedWork W2772780115 @default.
- W4313542967 hasRelatedWork W2811390910 @default.
- W4313542967 hasRelatedWork W3003836766 @default.
- W4313542967 hasRelatedWork W3176765321 @default.
- W4313542967 hasVolume "2023" @default.
- W4313542967 isParatext "false" @default.
- W4313542967 isRetracted "false" @default.
- W4313542967 workType "article" @default.
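A listing like the one above can be reproduced with a SPARQL query against the SemOpenAlex endpoint. This is a minimal sketch assuming the public endpoint at https://semopenalex.org/sparql; the exact graph-handling and paging behavior of the browsing interface that produced this page may differ.

```sparql
# Sketch: fetch all predicate/object pairs for work W4313542967,
# mirroring the pattern { <...W4313542967> ?p ?o ?g. } shown above.
SELECT ?p ?o
WHERE {
  <https://semopenalex.org/work/W4313542967> ?p ?o .
}
LIMIT 100
```

The `?g` in the original pattern suggests the interface also exposes the named graph of each triple; if the endpoint stores the data in named graphs, wrapping the pattern in `GRAPH ?g { ... }` and selecting `?g` as well would recover that column.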