Matches in SemOpenAlex for { <https://semopenalex.org/work/W4200165771> ?p ?o ?g. }
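The pattern in the braces above is a quad pattern (subject, predicate, object, graph). As a minimal sketch, the matches listed below could be reproduced with a standard SPARQL query such as the following; the endpoint URL https://semopenalex.org/sparql is an assumption not confirmed by this listing, and the quad pattern is rewritten with a GRAPH clause because a basic graph pattern in plain SPARQL has no quad syntax.

```sparql
# Minimal sketch: fetch every predicate (?p), object (?o), and named graph (?g)
# recorded for work W4200165771 in SemOpenAlex.
# Assumed endpoint (not confirmed by this listing): https://semopenalex.org/sparql
SELECT ?p ?o ?g
WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W4200165771> ?p ?o .
  }
}
ORDER BY ?p
```

In the result listing below, identifiers are shown in shortened form (W… for works, A… for authors, C… for concepts), and `@default` appears to name the graph each quad belongs to.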
- W4200165771 abstract "People perceive emotions via multiple cues, predominantly speech and visual cues, and a number of emotion recognition systems utilize both audio and visual cues. Moreover, static aspects of emotion (the speaker's arousal level is high/low) and dynamic aspects of emotion (the speaker is becoming more aroused) might be perceived via different expressive cues, and the two aspects are integrated to provide a unified sense of emotion state. However, existing multimodal systems focus on only a single aspect of emotion perception, and the contributions of the different modalities toward modeling static and dynamic emotion aspects are not well explored. In this paper, we investigate the relative salience of the audio and video modalities for emotion state prediction and emotion change prediction using a Multimodal Markovian affect model. Experiments conducted on the RECOLA database showed that the audio modality is better at modeling the emotion state of arousal and video at modeling the emotion state of valence, whereas audio is clearly superior to video in modeling emotion changes for both arousal and valence." @default.
- W4200165771 created "2021-12-31" @default.
- W4200165771 creator A5028116210 @default.
- W4200165771 creator A5032689109 @default.
- W4200165771 creator A5046915220 @default.
- W4200165771 creator A5071116593 @default.
- W4200165771 date "2021-12-23" @default.
- W4200165771 modified "2023-10-18" @default.
- W4200165771 title "Multimodal Affect Models: An Investigation of Relative Salience of Audio and Visual Cues for Emotion Prediction" @default.
- W4200165771 cites W1963947028 @default.
- W4200165771 cites W1976235033 @default.
- W4200165771 cites W1993008008 @default.
- W4200165771 cites W2002658919 @default.
- W4200165771 cites W2024694940 @default.
- W4200165771 cites W2025905516 @default.
- W4200165771 cites W2032254851 @default.
- W4200165771 cites W2037789405 @default.
- W4200165771 cites W2045528981 @default.
- W4200165771 cites W2047221353 @default.
- W4200165771 cites W2055332436 @default.
- W4200165771 cites W2067906953 @default.
- W4200165771 cites W2069924379 @default.
- W4200165771 cites W2085662862 @default.
- W4200165771 cites W2102548748 @default.
- W4200165771 cites W2118789253 @default.
- W4200165771 cites W2123506034 @default.
- W4200165771 cites W2142384583 @default.
- W4200165771 cites W2149628368 @default.
- W4200165771 cites W2152627593 @default.
- W4200165771 cites W2158630797 @default.
- W4200165771 cites W2160304657 @default.
- W4200165771 cites W2163094209 @default.
- W4200165771 cites W2166118664 @default.
- W4200165771 cites W2239141610 @default.
- W4200165771 cites W2314395941 @default.
- W4200165771 cites W2346454595 @default.
- W4200165771 cites W2396294578 @default.
- W4200165771 cites W2403257023 @default.
- W4200165771 cites W2404446881 @default.
- W4200165771 cites W2592183475 @default.
- W4200165771 cites W2673304402 @default.
- W4200165771 cites W2794094995 @default.
- W4200165771 cites W2884739346 @default.
- W4200165771 cites W2889113107 @default.
- W4200165771 cites W2891187402 @default.
- W4200165771 cites W2900358852 @default.
- W4200165771 cites W2900865875 @default.
- W4200165771 cites W2901531394 @default.
- W4200165771 cites W2997399314 @default.
- W4200165771 cites W3001645704 @default.
- W4200165771 cites W3016153892 @default.
- W4200165771 cites W3136890664 @default.
- W4200165771 cites W3137028092 @default.
- W4200165771 cites W4230277160 @default.
- W4200165771 cites W4239181501 @default.
- W4200165771 doi "https://doi.org/10.3389/fcomp.2021.767767" @default.
- W4200165771 hasPublicationYear "2021" @default.
- W4200165771 type Work @default.
- W4200165771 citedByCount "1" @default.
- W4200165771 countsByYear W42001657712022 @default.
- W4200165771 crossrefType "journal-article" @default.
- W4200165771 hasAuthorship W4200165771A5028116210 @default.
- W4200165771 hasAuthorship W4200165771A5032689109 @default.
- W4200165771 hasAuthorship W4200165771A5046915220 @default.
- W4200165771 hasAuthorship W4200165771A5071116593 @default.
- W4200165771 hasBestOaLocation W42001657711 @default.
- W4200165771 hasConcept C108154423 @default.
- W4200165771 hasConcept C121332964 @default.
- W4200165771 hasConcept C126863065 @default.
- W4200165771 hasConcept C128534915 @default.
- W4200165771 hasConcept C144024400 @default.
- W4200165771 hasConcept C15744967 @default.
- W4200165771 hasConcept C168900304 @default.
- W4200165771 hasConcept C169760540 @default.
- W4200165771 hasConcept C180747234 @default.
- W4200165771 hasConcept C206310091 @default.
- W4200165771 hasConcept C26760741 @default.
- W4200165771 hasConcept C2776035688 @default.
- W4200165771 hasConcept C2776141551 @default.
- W4200165771 hasConcept C2777438025 @default.
- W4200165771 hasConcept C2779903281 @default.
- W4200165771 hasConcept C28490314 @default.
- W4200165771 hasConcept C36289849 @default.
- W4200165771 hasConcept C36951298 @default.
- W4200165771 hasConcept C41008148 @default.
- W4200165771 hasConcept C46312422 @default.
- W4200165771 hasConcept C62520636 @default.
- W4200165771 hasConcept C77805123 @default.
- W4200165771 hasConceptScore W4200165771C108154423 @default.
- W4200165771 hasConceptScore W4200165771C121332964 @default.
- W4200165771 hasConceptScore W4200165771C126863065 @default.
- W4200165771 hasConceptScore W4200165771C128534915 @default.
- W4200165771 hasConceptScore W4200165771C144024400 @default.
- W4200165771 hasConceptScore W4200165771C15744967 @default.
- W4200165771 hasConceptScore W4200165771C168900304 @default.
- W4200165771 hasConceptScore W4200165771C169760540 @default.
- W4200165771 hasConceptScore W4200165771C180747234 @default.
- W4200165771 hasConceptScore W4200165771C206310091 @default.
- W4200165771 hasConceptScore W4200165771C26760741 @default.
- W4200165771 hasConceptScore W4200165771C2776035688 @default.