Matches in SemOpenAlex for { <https://semopenalex.org/work/W2962499391> ?p ?o ?g. }
Showing items 1 to 87 of 87, with 100 items per page.
- W2962499391 abstract "Emotion forecasting is the task of predicting the future emotion of a speaker, i.e., the emotion label of the future speaking turn, based on the speaker's past and current audio-visual cues. Emotion forecasting systems require new problem formulations that differ from traditional emotion recognition systems. In this paper, we first explore two types of forecasting windows (i.e., analysis windows for which the speaker's emotion is being forecasted): utterance forecasting and time forecasting. Utterance forecasting is based on speaking turns and forecasts what the speaker's emotion will be after one, two, or three speaking turns. Time forecasting forecasts what the speaker's emotion will be after a certain range of time, such as 3-8, 8-13, and 13-18 seconds. We then investigate the benefit of using the past audio-visual cues in addition to the current utterance. We design emotion forecasting models using deep learning. We compare the performances of fully-connected deep neural network (FC-DNN), deep long short-term memory (D-LSTM), and deep bidirectional long short-term memory (D-BLSTM) recurrent neural networks (RNNs). This allows us to examine the benefit of modeling dynamic patterns in emotion forecasting tasks. Our experimental results on the IEMOCAP benchmark dataset demonstrate that D-BLSTM and D-LSTM outperform FC-DNN by up to 2.42% in unweighted recall. When using both the current and past utterances, deep dynamic models show an improvement of up to 2.39% compared to their performance when using only the current utterance. We further analyze the benefit of using current and past utterance information compared to using the current and randomly chosen utterance information, and we find the performance improvement rises to 7.53%. The novelty in this study comes from its formulation of emotion forecasting problems and the understanding of how current and past audio-visual cues reveal future emotional information." @default.
- W2962499391 created "2019-07-23" @default.
- W2962499391 creator A5056643162 @default.
- W2962499391 creator A5070976231 @default.
- W2962499391 date "2019-05-01" @default.
- W2962499391 modified "2023-10-18" @default.
- W2962499391 title "Audio-Visual Emotion Forecasting: Characterizing and Predicting Future Emotion Using Deep Learning" @default.
- W2962499391 cites W1988518729 @default.
- W2962499391 cites W2031984250 @default.
- W2962499391 cites W2064675550 @default.
- W2962499391 cites W2073419240 @default.
- W2962499391 cites W2093174546 @default.
- W2962499391 cites W2147615062 @default.
- W2962499391 cites W2148146486 @default.
- W2962499391 cites W2161459043 @default.
- W2962499391 cites W2167277498 @default.
- W2962499391 cites W2168416146 @default.
- W2962499391 cites W2184507896 @default.
- W2962499391 cites W2546702061 @default.
- W2962499391 cites W2729247037 @default.
- W2962499391 doi "https://doi.org/10.1109/fg.2019.8756599" @default.
- W2962499391 hasPublicationYear "2019" @default.
- W2962499391 type Work @default.
- W2962499391 sameAs 2962499391 @default.
- W2962499391 citedByCount "5" @default.
- W2962499391 countsByYear W29624993912020 @default.
- W2962499391 countsByYear W29624993912021 @default.
- W2962499391 countsByYear W29624993912022 @default.
- W2962499391 countsByYear W29624993912023 @default.
- W2962499391 crossrefType "proceedings-article" @default.
- W2962499391 hasAuthorship W2962499391A5056643162 @default.
- W2962499391 hasAuthorship W2962499391A5070976231 @default.
- W2962499391 hasConcept C100660578 @default.
- W2962499391 hasConcept C108583219 @default.
- W2962499391 hasConcept C119857082 @default.
- W2962499391 hasConcept C13280743 @default.
- W2962499391 hasConcept C147168706 @default.
- W2962499391 hasConcept C154945302 @default.
- W2962499391 hasConcept C15744967 @default.
- W2962499391 hasConcept C162324750 @default.
- W2962499391 hasConcept C180747234 @default.
- W2962499391 hasConcept C185798385 @default.
- W2962499391 hasConcept C187736073 @default.
- W2962499391 hasConcept C204321447 @default.
- W2962499391 hasConcept C205649164 @default.
- W2962499391 hasConcept C2775852435 @default.
- W2962499391 hasConcept C2777438025 @default.
- W2962499391 hasConcept C2780451532 @default.
- W2962499391 hasConcept C28490314 @default.
- W2962499391 hasConcept C41008148 @default.
- W2962499391 hasConcept C50644808 @default.
- W2962499391 hasConceptScore W2962499391C100660578 @default.
- W2962499391 hasConceptScore W2962499391C108583219 @default.
- W2962499391 hasConceptScore W2962499391C119857082 @default.
- W2962499391 hasConceptScore W2962499391C13280743 @default.
- W2962499391 hasConceptScore W2962499391C147168706 @default.
- W2962499391 hasConceptScore W2962499391C154945302 @default.
- W2962499391 hasConceptScore W2962499391C15744967 @default.
- W2962499391 hasConceptScore W2962499391C162324750 @default.
- W2962499391 hasConceptScore W2962499391C180747234 @default.
- W2962499391 hasConceptScore W2962499391C185798385 @default.
- W2962499391 hasConceptScore W2962499391C187736073 @default.
- W2962499391 hasConceptScore W2962499391C204321447 @default.
- W2962499391 hasConceptScore W2962499391C205649164 @default.
- W2962499391 hasConceptScore W2962499391C2775852435 @default.
- W2962499391 hasConceptScore W2962499391C2777438025 @default.
- W2962499391 hasConceptScore W2962499391C2780451532 @default.
- W2962499391 hasConceptScore W2962499391C28490314 @default.
- W2962499391 hasConceptScore W2962499391C41008148 @default.
- W2962499391 hasConceptScore W2962499391C50644808 @default.
- W2962499391 hasLocation W29624993911 @default.
- W2962499391 hasOpenAccess W2962499391 @default.
- W2962499391 hasPrimaryLocation W29624993911 @default.
- W2962499391 hasRelatedWork W2795261237 @default.
- W2962499391 hasRelatedWork W3014300295 @default.
- W2962499391 hasRelatedWork W3164822677 @default.
- W2962499391 hasRelatedWork W4223943233 @default.
- W2962499391 hasRelatedWork W4225161397 @default.
- W2962499391 hasRelatedWork W4312200629 @default.
- W2962499391 hasRelatedWork W4360585206 @default.
- W2962499391 hasRelatedWork W4364306694 @default.
- W2962499391 hasRelatedWork W4380075502 @default.
- W2962499391 hasRelatedWork W4380086463 @default.
- W2962499391 isParatext "false" @default.
- W2962499391 isRetracted "false" @default.
- W2962499391 magId "2962499391" @default.
- W2962499391 workType "article" @default.
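The listing above is the result of the basic graph pattern `{ <https://semopenalex.org/work/W2962499391> ?p ?o ?g. }` evaluated against SemOpenAlex. A minimal Python sketch of how one might retrieve the same triples programmatically is shown below, assuming SemOpenAlex exposes its public SPARQL endpoint at `https://semopenalex.org/sparql` (the endpoint URL and the JSON response shape are assumptions here, not taken from this listing); it uses only the standard library.

```python
# Hypothetical sketch: fetch all predicate/object pairs for one SemOpenAlex
# work via a SPARQL endpoint. Endpoint URL is an assumption.
import json
import urllib.parse
import urllib.request


def build_query(work_id: str) -> str:
    """Build the same basic graph pattern shown in the listing above,
    restricted to one work IRI."""
    return (
        "SELECT ?p ?o WHERE { "
        f"<https://semopenalex.org/work/{work_id}> ?p ?o . "
        "}"
    )


def fetch_triples(work_id: str,
                  endpoint: str = "https://semopenalex.org/sparql"):
    """POST the query and return the SPARQL JSON result bindings
    (a list of {"p": ..., "o": ...} dicts)."""
    body = urllib.parse.urlencode({"query": build_query(work_id)}).encode()
    req = urllib.request.Request(
        endpoint,
        data=body,
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["results"]["bindings"]


if __name__ == "__main__":
    # Print each predicate/object pair, mirroring the listing above.
    for row in fetch_triples("W2962499391"):
        print(row["p"]["value"], row["o"]["value"])
```

The query drops the `?g` (graph) variable from the original pattern since the default graph is queried; add `GRAPH ?g { ... }` around the pattern if per-graph provenance is needed.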