Matches in SemOpenAlex for { <https://semopenalex.org/work/W2798536775> ?p ?o ?g. }
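The triple pattern above can be retrieved programmatically from SemOpenAlex's public SPARQL service. The sketch below builds the query and shows one way to POST it; the endpoint URL (`https://semopenalex.org/sparql`) and the JSON results media type are assumptions based on common SPARQL 1.1 Protocol conventions, so verify them before relying on this.

```python
# Minimal sketch: query all (?p ?o ?g) bindings for a SemOpenAlex work IRI.
# Endpoint URL is an assumption; check SemOpenAlex's documentation.
import json
import urllib.parse
import urllib.request

SPARQL_ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint

def build_query(work_iri: str) -> str:
    """Build a SPARQL query matching every predicate/object pair for the
    given work, recording which named graph each triple comes from."""
    return (
        "SELECT ?p ?o ?g WHERE { "
        f"GRAPH ?g {{ <{work_iri}> ?p ?o . }} "
        "}"
    )

def fetch_triples(work_iri: str):
    """POST the query and return the parsed SPARQL JSON bindings
    (requires network access)."""
    data = urllib.parse.urlencode({"query": build_query(work_iri)}).encode()
    req = urllib.request.Request(
        SPARQL_ENDPOINT,
        data=data,
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["results"]["bindings"]

query = build_query("https://semopenalex.org/work/W2798536775")
print(query)
```

Each binding in the returned JSON would correspond to one line of the listing below (predicate, object, and the `@default` graph).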
- W2798536775 endingPage "929" @default.
- W2798536775 startingPage "907" @default.
- W2798536775 abstract "Automatic understanding of human affect using visual signals is of great importance in everyday human–machine interactions. Appraising human emotional states, behaviors and reactions displayed in real-world settings can be accomplished using latent continuous dimensions (e.g., the circumplex model of affect). Valence (i.e., how positive or negative an emotion is) and arousal (i.e., the power of the activation of the emotion) constitute popular and effective representations for affect. Nevertheless, the majority of datasets collected thus far, although containing naturalistic emotional states, have been captured in highly controlled recording conditions. In this paper, we introduce the Aff-Wild benchmark for training and evaluating affect recognition algorithms. We also report on the results of the First Affect-in-the-wild Challenge (Aff-Wild Challenge) that was recently organized in conjunction with CVPR 2017 on the Aff-Wild database, and was the first ever challenge on the estimation of valence and arousal in-the-wild. Furthermore, we design and extensively train an end-to-end deep neural architecture which performs prediction of continuous emotion dimensions based on visual cues. The proposed deep learning architecture, AffWildNet, includes convolutional and recurrent neural network layers, exploiting the invariant properties of convolutional features, while also modeling temporal dynamics that arise in human behavior via the recurrent layers. The AffWildNet produced state-of-the-art results on the Aff-Wild Challenge. We then exploit the Aff-Wild database for learning features, which can be used as priors for achieving the best performance for both dimensional and categorical emotion recognition on the RECOLA, AFEW-VA and EmotiW 2017 datasets, compared to all other methods designed for the same goal. The database and emotion recognition models are available at http://ibug.doc.ic.ac.uk/resources/first-affect-wild-challenge." @default.
- W2798536775 created "2018-05-07" @default.
- W2798536775 creator A5002732899 @default.
- W2798536775 creator A5017437981 @default.
- W2798536775 creator A5019651318 @default.
- W2798536775 creator A5029879679 @default.
- W2798536775 creator A5045277971 @default.
- W2798536775 creator A5046220058 @default.
- W2798536775 creator A5080553022 @default.
- W2798536775 creator A5082301986 @default.
- W2798536775 date "2019-02-13" @default.
- W2798536775 modified "2023-10-11" @default.
- W2798536775 title "Deep Affect Prediction in-the-Wild: Aff-Wild Database and Challenge, Deep Architectures, and Beyond" @default.
- W2798536775 cites W1849007038 @default.
- W2798536775 cites W1964469912 @default.
- W2798536775 cites W1965696296 @default.
- W2798536775 cites W1965947362 @default.
- W2798536775 cites W1976066595 @default.
- W2798536775 cites W1992227055 @default.
- W2798536775 cites W2002055708 @default.
- W2798536775 cites W2041075748 @default.
- W2798536775 cites W2054541702 @default.
- W2798536775 cites W2056403322 @default.
- W2798536775 cites W2064675550 @default.
- W2798536775 cites W2092206588 @default.
- W2798536775 cites W2101545465 @default.
- W2798536775 cites W2103943262 @default.
- W2798536775 cites W2106390385 @default.
- W2798536775 cites W2108598243 @default.
- W2798536775 cites W2122098299 @default.
- W2798536775 cites W2132555391 @default.
- W2798536775 cites W2143172006 @default.
- W2798536775 cites W2153597356 @default.
- W2798536775 cites W2156503193 @default.
- W2798536775 cites W2161634108 @default.
- W2798536775 cites W2194775991 @default.
- W2798536775 cites W2217426128 @default.
- W2798536775 cites W2311778361 @default.
- W2798536775 cites W2313339984 @default.
- W2798536775 cites W2325939864 @default.
- W2798536775 cites W2345305417 @default.
- W2798536775 cites W2346454595 @default.
- W2798536775 cites W2546649374 @default.
- W2798536775 cites W2587982884 @default.
- W2798536775 cites W2655404332 @default.
- W2798536775 cites W2713788831 @default.
- W2798536775 cites W2739449376 @default.
- W2798536775 cites W2767348466 @default.
- W2798536775 cites W2767618761 @default.
- W2798536775 cites W2912990735 @default.
- W2798536775 cites W2915606245 @default.
- W2798536775 doi "https://doi.org/10.1007/s11263-019-01158-4" @default.
- W2798536775 hasPublicationYear "2019" @default.
- W2798536775 type Work @default.
- W2798536775 sameAs 2798536775 @default.
- W2798536775 citedByCount "177" @default.
- W2798536775 countsByYear W27985367752018 @default.
- W2798536775 countsByYear W27985367752019 @default.
- W2798536775 countsByYear W27985367752020 @default.
- W2798536775 countsByYear W27985367752021 @default.
- W2798536775 countsByYear W27985367752022 @default.
- W2798536775 countsByYear W27985367752023 @default.
- W2798536775 crossrefType "journal-article" @default.
- W2798536775 hasAuthorship W2798536775A5002732899 @default.
- W2798536775 hasAuthorship W2798536775A5017437981 @default.
- W2798536775 hasAuthorship W2798536775A5019651318 @default.
- W2798536775 hasAuthorship W2798536775A5029879679 @default.
- W2798536775 hasAuthorship W2798536775A5045277971 @default.
- W2798536775 hasAuthorship W2798536775A5046220058 @default.
- W2798536775 hasAuthorship W2798536775A5080553022 @default.
- W2798536775 hasAuthorship W2798536775A5082301986 @default.
- W2798536775 hasBestOaLocation W27985367751 @default.
- W2798536775 hasConcept C108583219 @default.
- W2798536775 hasConcept C119857082 @default.
- W2798536775 hasConcept C121332964 @default.
- W2798536775 hasConcept C13280743 @default.
- W2798536775 hasConcept C154945302 @default.
- W2798536775 hasConcept C15744967 @default.
- W2798536775 hasConcept C168900304 @default.
- W2798536775 hasConcept C169760540 @default.
- W2798536775 hasConcept C185798385 @default.
- W2798536775 hasConcept C205649164 @default.
- W2798536775 hasConcept C2776035688 @default.
- W2798536775 hasConcept C36951298 @default.
- W2798536775 hasConcept C41008148 @default.
- W2798536775 hasConcept C46312422 @default.
- W2798536775 hasConcept C5274069 @default.
- W2798536775 hasConcept C62520636 @default.
- W2798536775 hasConcept C81363708 @default.
- W2798536775 hasConceptScore W2798536775C108583219 @default.
- W2798536775 hasConceptScore W2798536775C119857082 @default.
- W2798536775 hasConceptScore W2798536775C121332964 @default.
- W2798536775 hasConceptScore W2798536775C13280743 @default.
- W2798536775 hasConceptScore W2798536775C154945302 @default.
- W2798536775 hasConceptScore W2798536775C15744967 @default.
- W2798536775 hasConceptScore W2798536775C168900304 @default.
- W2798536775 hasConceptScore W2798536775C169760540 @default.
- W2798536775 hasConceptScore W2798536775C185798385 @default.