Matches in SemOpenAlex for { <https://semopenalex.org/work/W4367312493> ?p ?o ?g. }
- W4367312493 endingPage "4373" @default.
- W4367312493 startingPage "4373" @default.
- W4367312493 abstract "Multimodal emotion recognition has gained much traction in the fields of affective computing, human-computer interaction (HCI), artificial intelligence (AI), and user experience (UX). There is a growing demand to automate the analysis of user emotion for HCI, AI, and UX evaluation applications that provide affective services. Emotions are increasingly obtained from videos, audio, text, or physiological signals, which has led to processing emotions from multiple modalities, usually combined through ensemble-based systems with static weights. Owing to limitations such as missing modality data, inter-class variations, and intra-class similarities, an effective weighting scheme is required to improve discrimination between modalities. This article takes into account the importance of the differences between multiple modalities and assigns dynamic weights to them by adopting a more efficient combination process based on generalized mixture (GM) functions. We therefore present a hybrid multimodal emotion recognition (H-MMER) framework that uses a multi-view learning approach for unimodal emotion recognition and introduces multimodal feature-level and decision-level fusion using GM functions. In an experimental study, we evaluated the ability of the proposed framework to model a set of four emotional states (Happiness, Neutral, Sadness, and Anger) and found that most of them can be modeled well with significantly high accuracy using GM functions. The experiments show that the proposed framework models emotional states with an average accuracy of 98.19%, a significant performance gain over traditional approaches. The overall evaluation results indicate that we can identify emotional states with high accuracy and increase the robustness of an emotion classification system required for UX measurement." @default.
- W4367312493 created "2023-04-29" @default.
- W4367312493 creator A5011141471 @default.
- W4367312493 creator A5015667504 @default.
- W4367312493 creator A5032252674 @default.
- W4367312493 creator A5037717724 @default.
- W4367312493 creator A5046166174 @default.
- W4367312493 creator A5065010121 @default.
- W4367312493 creator A5068838719 @default.
- W4367312493 creator A5074362717 @default.
- W4367312493 creator A5074874309 @default.
- W4367312493 date "2023-04-28" @default.
- W4367312493 modified "2023-09-30" @default.
- W4367312493 title "A Hybrid Multimodal Emotion Recognition Framework for UX Evaluation Using Generalized Mixture Functions" @default.
- W4367312493 cites W2077662469 @default.
- W4367312493 cites W2194775991 @default.
- W4367312493 cites W2290694504 @default.
- W4367312493 cites W2296349740 @default.
- W4367312493 cites W2417999172 @default.
- W4367312493 cites W2465165615 @default.
- W4367312493 cites W2518937691 @default.
- W4367312493 cites W2584561145 @default.
- W4367312493 cites W2619383789 @default.
- W4367312493 cites W2709480298 @default.
- W4367312493 cites W2745497104 @default.
- W4367312493 cites W2766272105 @default.
- W4367312493 cites W2766925079 @default.
- W4367312493 cites W2777460464 @default.
- W4367312493 cites W2783274584 @default.
- W4367312493 cites W2803193013 @default.
- W4367312493 cites W2804636961 @default.
- W4367312493 cites W2805766156 @default.
- W4367312493 cites W2889466822 @default.
- W4367312493 cites W2900270292 @default.
- W4367312493 cites W2917572534 @default.
- W4367312493 cites W2936772191 @default.
- W4367312493 cites W2938404524 @default.
- W4367312493 cites W2946490545 @default.
- W4367312493 cites W2957628672 @default.
- W4367312493 cites W2964346351 @default.
- W4367312493 cites W2972037886 @default.
- W4367312493 cites W2978360381 @default.
- W4367312493 cites W3003908700 @default.
- W4367312493 cites W3007783920 @default.
- W4367312493 cites W3009563099 @default.
- W4367312493 cites W3026901649 @default.
- W4367312493 cites W3043118946 @default.
- W4367312493 cites W3084624816 @default.
- W4367312493 cites W3159301005 @default.
- W4367312493 cites W3186192207 @default.
- W4367312493 cites W3216427062 @default.
- W4367312493 cites W4210391670 @default.
- W4367312493 cites W4220829848 @default.
- W4367312493 cites W4220887861 @default.
- W4367312493 cites W4224286641 @default.
- W4367312493 cites W4293386309 @default.
- W4367312493 cites W4296474212 @default.
- W4367312493 cites W4297841880 @default.
- W4367312493 cites W4318561946 @default.
- W4367312493 cites W4319455989 @default.
- W4367312493 cites W4320002812 @default.
- W4367312493 cites W4328007250 @default.
- W4367312493 doi "https://doi.org/10.3390/s23094373" @default.
- W4367312493 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/37177574" @default.
- W4367312493 hasPublicationYear "2023" @default.
- W4367312493 type Work @default.
- W4367312493 citedByCount "0" @default.
- W4367312493 crossrefType "journal-article" @default.
- W4367312493 hasAuthorship W4367312493A5011141471 @default.
- W4367312493 hasAuthorship W4367312493A5015667504 @default.
- W4367312493 hasAuthorship W4367312493A5032252674 @default.
- W4367312493 hasAuthorship W4367312493A5037717724 @default.
- W4367312493 hasAuthorship W4367312493A5046166174 @default.
- W4367312493 hasAuthorship W4367312493A5065010121 @default.
- W4367312493 hasAuthorship W4367312493A5068838719 @default.
- W4367312493 hasAuthorship W4367312493A5074362717 @default.
- W4367312493 hasAuthorship W4367312493A5074874309 @default.
- W4367312493 hasBestOaLocation W43673124931 @default.
- W4367312493 hasConcept C111919701 @default.
- W4367312493 hasConcept C118552586 @default.
- W4367312493 hasConcept C119857082 @default.
- W4367312493 hasConcept C126838900 @default.
- W4367312493 hasConcept C138885662 @default.
- W4367312493 hasConcept C144024400 @default.
- W4367312493 hasConcept C14855644 @default.
- W4367312493 hasConcept C154945302 @default.
- W4367312493 hasConcept C15744967 @default.
- W4367312493 hasConcept C169760540 @default.
- W4367312493 hasConcept C169900460 @default.
- W4367312493 hasConcept C183115368 @default.
- W4367312493 hasConcept C206310091 @default.
- W4367312493 hasConcept C2776401178 @default.
- W4367312493 hasConcept C2776502983 @default.
- W4367312493 hasConcept C2777438025 @default.
- W4367312493 hasConcept C2778999518 @default.
- W4367312493 hasConcept C2779302386 @default.
- W4367312493 hasConcept C2779812673 @default.
- W4367312493 hasConcept C2779903281 @default.
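The abstract above describes decision-level fusion in which modality weights are computed dynamically by generalized mixture (GM) functions rather than fixed in advance. The sketch below is a minimal illustration of that general idea, not the authors' exact formulation: the weight construction (deviation from the per-class minimum), the function name gm_fusion, and the example scores are all assumptions introduced here for illustration.

```python
import numpy as np

def gm_fusion(scores):
    """Fuse per-modality class probabilities with a GM-style function.

    A hypothetical sketch: weights are derived from the inputs themselves
    (one standard GM construction), so an uninformative modality receives
    less influence than a confident one.

    scores: (n_modalities, n_classes) array of class probabilities.
    Returns a fused (n_classes,) probability vector.
    """
    scores = np.asarray(scores, dtype=float)
    # Dynamic per-modality, per-class weight: reward scores that stand
    # out above the weakest modality for that class.
    dev = 1.0 + scores - scores.min(axis=0, keepdims=True)
    weights = dev / dev.sum(axis=0, keepdims=True)
    fused = (weights * scores).sum(axis=0)
    return fused / fused.sum()  # renormalize to a probability vector

# Hypothetical inputs: three unimodal classifiers scoring the four
# emotional states from the paper (Happiness, Neutral, Sadness, Anger).
face   = [0.70, 0.10, 0.10, 0.10]
speech = [0.40, 0.30, 0.20, 0.10]
text   = [0.25, 0.25, 0.25, 0.25]  # flat distribution carries little weight
print(gm_fusion([face, speech, text]))
```

Compared with a static-weight ensemble, a GM function of this kind adapts the combination per input, which is what makes it attractive when a modality is missing or momentarily unreliable.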