Matches in SemOpenAlex for { <https://semopenalex.org/work/W4385597818> ?p ?o ?g. }
- W4385597818 endingPage "126649" @default.
- W4385597818 startingPage "126649" @default.
- W4385597818 abstract "With new developments in the field of human-computer interaction, researchers are now paying attention to emotion recognition, especially multimodal emotion recognition, as emotion is a multidimensional expression. In this study, we propose a multimodal fusion emotion recognition method (MTL-BAM) based on multitask learning and an attention mechanism to address two major problems in multimodal emotion recognition: the lack of consideration of emotional interactions among modalities, and the focus on emotional similarities among modalities while ignoring their differences. By improving the attention mechanism, the emotional contribution of each modality is further analyzed so that the emotional representations of the modalities can learn from and complement one another, achieving a better interactive fusion effect within a multitask learning framework. By introducing three monomodal emotion recognition tasks as auxiliary tasks, the model can detect emotional differences among modalities. A label generation unit is also introduced into the auxiliary tasks, obtaining monomodal emotion label values more accurately through two proportional formulas while avoiding the zero-value problem. Our results show that the proposed method outperforms selected state-of-the-art methods on four evaluation indexes of emotion classification (accuracy, F1 score, MAE, and Pearson correlation coefficient). It achieves accuracy rates of 85.36% and 84.61% on the public multimodal datasets CMU-MOSI and CMU-MOSEI, respectively, 2–6% higher than those of existing state-of-the-art models, demonstrating good multimodal emotion recognition performance and strong generalizability." @default.
- W4385597818 created "2023-08-05" @default.
- W4385597818 creator A5001105055 @default.
- W4385597818 creator A5011166976 @default.
- W4385597818 creator A5011922533 @default.
- W4385597818 creator A5025339842 @default.
- W4385597818 creator A5037251000 @default.
- W4385597818 creator A5041966171 @default.
- W4385597818 creator A5047428032 @default.
- W4385597818 date "2023-11-01" @default.
- W4385597818 modified "2023-10-16" @default.
- W4385597818 title "A multimodal fusion emotion recognition method based on multitask learning and attention mechanism" @default.
- W4385597818 cites W2061116763 @default.
- W4385597818 cites W2343758848 @default.
- W4385597818 cites W2584561145 @default.
- W4385597818 cites W2772633765 @default.
- W4385597818 cites W2787322890 @default.
- W4385597818 cites W2787581402 @default.
- W4385597818 cites W2962770129 @default.
- W4385597818 cites W2962931510 @default.
- W4385597818 cites W2963032608 @default.
- W4385597818 cites W2963104701 @default.
- W4385597818 cites W2964010806 @default.
- W4385597818 cites W2964051877 @default.
- W4385597818 cites W2997258743 @default.
- W4385597818 cites W3044854652 @default.
- W4385597818 cites W3089557188 @default.
- W4385597818 cites W3128412859 @default.
- W4385597818 cites W3135148659 @default.
- W4385597818 cites W3146366485 @default.
- W4385597818 cites W3198817488 @default.
- W4385597818 cites W4224916778 @default.
- W4385597818 doi "https://doi.org/10.1016/j.neucom.2023.126649" @default.
- W4385597818 hasPublicationYear "2023" @default.
- W4385597818 type Work @default.
- W4385597818 citedByCount "0" @default.
- W4385597818 crossrefType "journal-article" @default.
- W4385597818 hasAuthorship W4385597818A5001105055 @default.
- W4385597818 hasAuthorship W4385597818A5011166976 @default.
- W4385597818 hasAuthorship W4385597818A5011922533 @default.
- W4385597818 hasAuthorship W4385597818A5025339842 @default.
- W4385597818 hasAuthorship W4385597818A5037251000 @default.
- W4385597818 hasAuthorship W4385597818A5041966171 @default.
- W4385597818 hasAuthorship W4385597818A5047428032 @default.
- W4385597818 hasConcept C103278499 @default.
- W4385597818 hasConcept C111472728 @default.
- W4385597818 hasConcept C115961682 @default.
- W4385597818 hasConcept C119857082 @default.
- W4385597818 hasConcept C138496976 @default.
- W4385597818 hasConcept C138885662 @default.
- W4385597818 hasConcept C144024400 @default.
- W4385597818 hasConcept C154945302 @default.
- W4385597818 hasConcept C15744967 @default.
- W4385597818 hasConcept C162324750 @default.
- W4385597818 hasConcept C187736073 @default.
- W4385597818 hasConcept C27158222 @default.
- W4385597818 hasConcept C2777438025 @default.
- W4385597818 hasConcept C2779903281 @default.
- W4385597818 hasConcept C2780226545 @default.
- W4385597818 hasConcept C2780451532 @default.
- W4385597818 hasConcept C2780660688 @default.
- W4385597818 hasConcept C36289849 @default.
- W4385597818 hasConcept C41008148 @default.
- W4385597818 hasConcept C89611455 @default.
- W4385597818 hasConceptScore W4385597818C103278499 @default.
- W4385597818 hasConceptScore W4385597818C111472728 @default.
- W4385597818 hasConceptScore W4385597818C115961682 @default.
- W4385597818 hasConceptScore W4385597818C119857082 @default.
- W4385597818 hasConceptScore W4385597818C138496976 @default.
- W4385597818 hasConceptScore W4385597818C138885662 @default.
- W4385597818 hasConceptScore W4385597818C144024400 @default.
- W4385597818 hasConceptScore W4385597818C154945302 @default.
- W4385597818 hasConceptScore W4385597818C15744967 @default.
- W4385597818 hasConceptScore W4385597818C162324750 @default.
- W4385597818 hasConceptScore W4385597818C187736073 @default.
- W4385597818 hasConceptScore W4385597818C27158222 @default.
- W4385597818 hasConceptScore W4385597818C2777438025 @default.
- W4385597818 hasConceptScore W4385597818C2779903281 @default.
- W4385597818 hasConceptScore W4385597818C2780226545 @default.
- W4385597818 hasConceptScore W4385597818C2780451532 @default.
- W4385597818 hasConceptScore W4385597818C2780660688 @default.
- W4385597818 hasConceptScore W4385597818C36289849 @default.
- W4385597818 hasConceptScore W4385597818C41008148 @default.
- W4385597818 hasConceptScore W4385597818C89611455 @default.
- W4385597818 hasLocation W43855978181 @default.
- W4385597818 hasOpenAccess W4385597818 @default.
- W4385597818 hasPrimaryLocation W43855978181 @default.
- W4385597818 hasRelatedWork W2012466265 @default.
- W4385597818 hasRelatedWork W2613123485 @default.
- W4385597818 hasRelatedWork W2904518532 @default.
- W4385597818 hasRelatedWork W2914599329 @default.
- W4385597818 hasRelatedWork W2962931510 @default.
- W4385597818 hasRelatedWork W3200817606 @default.
- W4385597818 hasRelatedWork W4205137593 @default.
- W4385597818 hasRelatedWork W4236838349 @default.
- W4385597818 hasRelatedWork W4290996278 @default.
- W4385597818 hasRelatedWork W4380551887 @default.
- W4385597818 hasVolume "556" @default.
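The listing above can be reproduced programmatically against the public SemOpenAlex SPARQL endpoint. The sketch below is an assumption-laden illustration, not part of the source: the endpoint URL (https://semopenalex.org/sparql) and the helper names `build_work_query` and `fetch_triples` are chosen for this example, and the query is simplified to `?p ?o` (dropping the graph variable `?g` from the pattern shown at the top).

```python
# Sketch: fetching the predicate/object pairs for a SemOpenAlex work.
# Endpoint URL and helper names are assumptions for illustration only.
import json
import urllib.parse
import urllib.request


def build_work_query(work_uri: str) -> str:
    """Build a SPARQL query equivalent to the pattern in this listing,
    simplified to the default graph (no ?g variable)."""
    return f"SELECT ?p ?o WHERE {{ <{work_uri}> ?p ?o . }}"


def fetch_triples(work_uri: str,
                  endpoint: str = "https://semopenalex.org/sparql"):
    """POST the query and return (predicate, object) value pairs."""
    data = urllib.parse.urlencode(
        {"query": build_work_query(work_uri)}).encode()
    req = urllib.request.Request(
        endpoint,
        data=data,
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        results = json.load(resp)
    # Standard SPARQL JSON results layout: results -> bindings -> var -> value
    return [(b["p"]["value"], b["o"]["value"])
            for b in results["results"]["bindings"]]


# Usage (requires network access):
#   pairs = fetch_triples("https://semopenalex.org/work/W4385597818")
#   for p, o in pairs:
#       print(p, o)
```

The returned pairs correspond line-for-line to the `W4385597818 <predicate> <object>` entries above.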