Matches in SemOpenAlex for { <https://semopenalex.org/work/W3087982620> ?p ?o ?g. }
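The quad pattern above asks for every predicate, object, and graph attached to the work <https://semopenalex.org/work/W3087982620>. As a minimal sketch of how such a listing could be reproduced programmatically, assuming the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql is reachable and dropping the named-graph variable ?g for simplicity:

```python
# Sketch: fetch all (?p, ?o) pairs for the work from SemOpenAlex.
# Assumptions: endpoint URL https://semopenalex.org/sparql, SPARQLWrapper installed.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://semopenalex.org/sparql")  # assumed endpoint
sparql.setQuery("""
    SELECT ?p ?o
    WHERE {
      <https://semopenalex.org/work/W3087982620> ?p ?o .
    }
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"])
```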
- W3087982620 endingPage "3550" @default.
- W3087982620 startingPage "3540" @default.
- W3087982620 abstract "Automated facial expression analysis from image sequences for continuous emotion recognition is a very challenging task due to the loss of three-dimensional information during the image formation process. State-of-the-art methods relied on estimating dynamic texture features and convolutional neural network features to derive spatio-temporal features. Despite their great success, such features are insensitive to micro facial muscle deformations and are affected by identity, face pose, illumination variation, and self-occlusion. In this work, we argue that retrieving, from image sequences, 3D facial spatio-temporal information, which describes the natural facial muscle deformation, provides a semantically meaningful and efficient representation that is useful for emotion recognition. In this paper, we propose a framework for extracting three-dimensional facial spatio-temporal features from monocular image sequences using an extended 3D Morphable Model (3DMM) that disentangles the identity factor from the facial expressions of a specific person. An LSTM model is used to evaluate the effectiveness of the proposed spatio-temporal features on the video-based facial expression recognition task and the continuous affect recognition task. Experimental results on the AFEW 6.0 dataset for facial expression recognition, and on the RECOLA and SEMAINE datasets for continuous emotion prediction, illustrate the potential of the proposed 3D spatio-temporal features for facial expression analysis and continuous affect recognition, as well as their efficiency compared to recent state-of-the-art features." @default. (see the LSTM sketch after the triple list below)
- W3087982620 created "2020-10-01" @default.
- W3087982620 creator A5014851312 @default.
- W3087982620 creator A5049930476 @default.
- W3087982620 creator A5069268336 @default.
- W3087982620 creator A5070632722 @default.
- W3087982620 creator A5073820422 @default.
- W3087982620 date "2021-01-01" @default.
- W3087982620 modified "2023-09-29" @default.
- W3087982620 title "Monocular 3D Facial Expression Features for Continuous Affect Recognition" @default.
- W3087982620 cites W1522734439 @default.
- W3087982620 cites W1545641654 @default.
- W3087982620 cites W1554156806 @default.
- W3087982620 cites W1755205674 @default.
- W3087982620 cites W1871419576 @default.
- W3087982620 cites W1964757081 @default.
- W3087982620 cites W1965947362 @default.
- W3087982620 cites W1977148657 @default.
- W3087982620 cites W1981918162 @default.
- W3087982620 cites W1983364832 @default.
- W3087982620 cites W1983703866 @default.
- W3087982620 cites W1990213678 @default.
- W3087982620 cites W1991060033 @default.
- W3087982620 cites W1999042468 @default.
- W3087982620 cites W2005418748 @default.
- W3087982620 cites W2017107803 @default.
- W3087982620 cites W2021913835 @default.
- W3087982620 cites W2023734821 @default.
- W3087982620 cites W2026243162 @default.
- W3087982620 cites W2039051707 @default.
- W3087982620 cites W2045528981 @default.
- W3087982620 cites W2072803611 @default.
- W3087982620 cites W2084482378 @default.
- W3087982620 cites W2092206588 @default.
- W3087982620 cites W2096171208 @default.
- W3087982620 cites W2107037917 @default.
- W3087982620 cites W2120724312 @default.
- W3087982620 cites W2129671742 @default.
- W3087982620 cites W2134605619 @default.
- W3087982620 cites W2135776491 @default.
- W3087982620 cites W2136655611 @default.
- W3087982620 cites W2139916508 @default.
- W3087982620 cites W2146566773 @default.
- W3087982620 cites W2156489769 @default.
- W3087982620 cites W2156503193 @default.
- W3087982620 cites W2160126058 @default.
- W3087982620 cites W2185452299 @default.
- W3087982620 cites W2188722963 @default.
- W3087982620 cites W2293457052 @default.
- W3087982620 cites W2297337743 @default.
- W3087982620 cites W2339620988 @default.
- W3087982620 cites W2346454595 @default.
- W3087982620 cites W2431101926 @default.
- W3087982620 cites W2490049321 @default.
- W3087982620 cites W2531466839 @default.
- W3087982620 cites W2531648894 @default.
- W3087982620 cites W2546875627 @default.
- W3087982620 cites W2547263832 @default.
- W3087982620 cites W2548128734 @default.
- W3087982620 cites W2582523095 @default.
- W3087982620 cites W2610961739 @default.
- W3087982620 cites W2617750261 @default.
- W3087982620 cites W2621864722 @default.
- W3087982620 cites W2703895418 @default.
- W3087982620 cites W2726381870 @default.
- W3087982620 cites W2765291577 @default.
- W3087982620 cites W2770998336 @default.
- W3087982620 cites W2800840848 @default.
- W3087982620 cites W2801997881 @default.
- W3087982620 cites W2807005919 @default.
- W3087982620 cites W2807126412 @default.
- W3087982620 cites W2885905848 @default.
- W3087982620 cites W2889050557 @default.
- W3087982620 cites W2891588573 @default.
- W3087982620 cites W2896134990 @default.
- W3087982620 cites W2900933500 @default.
- W3087982620 cites W2905628412 @default.
- W3087982620 cites W2912817214 @default.
- W3087982620 cites W2913453026 @default.
- W3087982620 cites W2915606245 @default.
- W3087982620 cites W2963112684 @default.
- W3087982620 cites W2963148250 @default.
- W3087982620 cites W2982949428 @default.
- W3087982620 cites W2796250229 @default.
- W3087982620 doi "https://doi.org/10.1109/tmm.2020.3026894" @default.
- W3087982620 hasPublicationYear "2021" @default.
- W3087982620 type Work @default.
- W3087982620 sameAs 3087982620 @default.
- W3087982620 citedByCount "4" @default.
- W3087982620 countsByYear W30879826202020 @default.
- W3087982620 countsByYear W30879826202023 @default.
- W3087982620 crossrefType "journal-article" @default.
- W3087982620 hasAuthorship W3087982620A5014851312 @default.
- W3087982620 hasAuthorship W3087982620A5049930476 @default.
- W3087982620 hasAuthorship W3087982620A5069268336 @default.
- W3087982620 hasAuthorship W3087982620A5070632722 @default.
- W3087982620 hasAuthorship W3087982620A5073820422 @default.
- W3087982620 hasConcept C121332964 @default.
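The abstract above describes a two-stage pipeline: an extended 3DMM factors identity out of each frame, and the resulting per-frame expression parameters are fed to an LSTM for emotion recognition. The following is a minimal illustrative sketch of that temporal stage, not the authors' implementation; the coefficient dimension, class count, and module names are assumptions, and the 3DMM fitting step is taken as given.

```python
# Sketch: LSTM over per-frame 3DMM expression coefficients (identity removed).
# Assumed: 29 expression coefficients per frame, 7 emotion classes.
import torch
import torch.nn as nn

class ExpressionLSTM(nn.Module):
    def __init__(self, expr_dim: int = 29, hidden_dim: int = 128, num_classes: int = 7):
        super().__init__()
        # Temporal model over the expression-parameter sequence.
        self.lstm = nn.LSTM(expr_dim, hidden_dim, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, expr_seq: torch.Tensor) -> torch.Tensor:
        # expr_seq: (batch, frames, expr_dim) 3DMM expression coefficients.
        out, _ = self.lstm(expr_seq)
        # Use the last time step as the clip-level summary.
        return self.head(out[:, -1, :])

if __name__ == "__main__":
    model = ExpressionLSTM()
    clip = torch.randn(4, 60, 29)   # 4 clips, 60 frames, 29 coefficients each
    logits = model(clip)            # (4, 7) class scores
    print(logits.shape)
```

For continuous affect prediction (valence/arousal on RECOLA or SEMAINE), the classification head would be replaced by a per-frame regression output rather than a single clip-level label.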