Matches in SemOpenAlex for { <https://semopenalex.org/work/W4210324062> ?p ?o ?g. }
- W4210324062 endingPage "1354" @default.
- W4210324062 startingPage "1342" @default.
- W4210324062 abstract "Abstract Concerning facial expression generation, recent advances in generative models, relying on the sheer volume of training data, allow high-quality generation of facial expressions free of the laborious facial expression annotation procedure. However, these generative processes bear limited relevance to the psychologically conceptualised dimensional plane, i.e., the two-dimensional Arousal-Valence plane, and thus generate psychologically uninterpretable facial expressions. In this research, we therefore present a novel generative model that learns psychologically compatible (low-dimensional) representations of facial expressions, permitting the generation of facial expressions along the psychologically conceptualised Arousal-Valence dimensions. To generate Arousal-Valence-compatible facial expressions, we resort to a novel form of data-driven generative model, the encapsulated variational auto-encoder (EVAE), which consists of two connected variational auto-encoders. The two variational auto-encoders in our EVAE model are concatenated through a tuneable continuous hyper-parameter, which bounds the learning of the EVAE. Since this tuneable hyper-parameter, together with the linearly sampled inputs, largely determines the facial expression generation process, we hypothesise a correspondence between continuous scales on the hyper-parameter and sampled inputs, on the one hand, and the psychologically conceptualised Arousal-Valence dimensions on the other. For empirical validation, two publicly released facial expression datasets, i.e., the Frey faces and FERG-DB datasets, were employed to evaluate the dimensional generative performance of our proposed EVAE. Across both datasets, the facial expressions generated along our two hypothesised continuous scales were observed to be consistent with the psychologically conceptualised Arousal-Valence dimensions.
Applying our proposed EVAE model to the Frey faces and FERG-DB facial expression datasets, we demonstrate the feasibility of generating facial expressions along the conceptualised Arousal-Valence dimensions. In conclusion, to generate facial expressions along the psychologically conceptualised Arousal-Valence dimensions, we propose a novel type of generative model, the encapsulated variational auto-encoder (EVAE), which allows the generation process to be disentangled into two tuneable continuous factors. Validated on two publicly available facial expression datasets, we demonstrate the association between these factors and the Arousal-Valence dimensions in facial expression generation, deriving a data-driven Arousal-Valence plane for affective computing. Despite its embryonic stage, our research may shed light on the prospect of continuous, dimensional affective computing." @default.
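The abstract describes the EVAE as two connected variational auto-encoders coupled through a tuneable continuous hyper-parameter. The record does not give the paper's actual architecture, so the following is only a minimal, hypothetical NumPy sketch of that idea: all layer sizes, the 1-D latents, and the blending rule via `alpha` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(in_dim, out_dim):
    # Random affine layer standing in for a trained weight matrix.
    return rng.normal(0, 0.1, (in_dim, out_dim)), np.zeros(out_dim)

class TinyVAE:
    """One-hidden-layer VAE with a low-dimensional latent (illustrative only)."""
    def __init__(self, x_dim, z_dim=1):
        self.enc_mu = dense(x_dim, z_dim)      # encoder mean head
        self.enc_logvar = dense(x_dim, z_dim)  # encoder log-variance head
        self.dec = dense(z_dim, x_dim)         # decoder

    def encode(self, x):
        (Wm, bm), (Wv, bv) = self.enc_mu, self.enc_logvar
        mu, logvar = x @ Wm + bm, x @ Wv + bv
        # Reparameterisation trick: sample z ~ N(mu, exp(logvar)).
        return mu + np.exp(0.5 * logvar) * rng.standard_normal(mu.shape)

    def decode(self, z):
        W, b = self.dec
        return np.tanh(z @ W + b)

# Two "encapsulated" VAEs: the outer one models images, the inner one
# re-encodes the outer latent; a continuous hyper-parameter alpha blends
# the two latents before decoding (a hypothetical coupling rule).
outer, inner = TinyVAE(x_dim=560), TinyVAE(x_dim=1)

def evae_generate(x, alpha):
    z_outer = outer.encode(x)
    z_inner = inner.encode(z_outer)
    z = alpha * z_outer + (1.0 - alpha) * z_inner  # tuneable continuous factor
    return outer.decode(z)

x = rng.standard_normal((4, 560))  # 4 fake 20x28 "Frey faces" vectors
faces = evae_generate(x, alpha=0.5)
print(faces.shape)  # (4, 560)
```

Sweeping `alpha` and the linearly sampled inputs `x` would then trace out the two continuous generation scales that the paper hypothesises correspond to the Arousal-Valence dimensions.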
- W4210324062 created "2022-02-08" @default.
- W4210324062 creator A5001494883 @default.
- W4210324062 creator A5071558676 @default.
- W4210324062 creator A5087237198 @default.
- W4210324062 date "2022-01-31" @default.
- W4210324062 modified "2023-09-30" @default.
- W4210324062 title "Data-driven Dimensional Expression Generation via Encapsulated Variational Auto-Encoders" @default.
- W4210324062 cites W164919882 @default.
- W4210324062 cites W177655890 @default.
- W4210324062 cites W1965947362 @default.
- W4210324062 cites W1967993123 @default.
- W4210324062 cites W2024218186 @default.
- W4210324062 cites W2024289965 @default.
- W4210324062 cites W2024639682 @default.
- W4210324062 cites W2092206588 @default.
- W4210324062 cites W211912913 @default.
- W4210324062 cites W2135776491 @default.
- W4210324062 cites W2144961093 @default.
- W4210324062 cites W2153822685 @default.
- W4210324062 cites W2165857685 @default.
- W4210324062 cites W2169166781 @default.
- W4210324062 cites W2217402295 @default.
- W4210324062 cites W2219827282 @default.
- W4210324062 cites W2343758848 @default.
- W4210324062 cites W2593367875 @default.
- W4210324062 cites W2745497104 @default.
- W4210324062 cites W2755009893 @default.
- W4210324062 cites W2990842302 @default.
- W4210324062 cites W3091860120 @default.
- W4210324062 doi "https://doi.org/10.1007/s12559-021-09973-z" @default.
- W4210324062 hasPublicationYear "2022" @default.
- W4210324062 type Work @default.
- W4210324062 citedByCount "0" @default.
- W4210324062 crossrefType "journal-article" @default.
- W4210324062 hasAuthorship W4210324062A5001494883 @default.
- W4210324062 hasAuthorship W4210324062A5071558676 @default.
- W4210324062 hasAuthorship W4210324062A5087237198 @default.
- W4210324062 hasBestOaLocation W42103240621 @default.
- W4210324062 hasConcept C101738243 @default.
- W4210324062 hasConcept C108583219 @default.
- W4210324062 hasConcept C111919701 @default.
- W4210324062 hasConcept C118505674 @default.
- W4210324062 hasConcept C121332964 @default.
- W4210324062 hasConcept C153180895 @default.
- W4210324062 hasConcept C154945302 @default.
- W4210324062 hasConcept C15744967 @default.
- W4210324062 hasConcept C167966045 @default.
- W4210324062 hasConcept C168900304 @default.
- W4210324062 hasConcept C180747234 @default.
- W4210324062 hasConcept C195704467 @default.
- W4210324062 hasConcept C199360897 @default.
- W4210324062 hasConcept C2777375102 @default.
- W4210324062 hasConcept C2779302386 @default.
- W4210324062 hasConcept C28490314 @default.
- W4210324062 hasConcept C36951298 @default.
- W4210324062 hasConcept C39890363 @default.
- W4210324062 hasConcept C41008148 @default.
- W4210324062 hasConcept C62520636 @default.
- W4210324062 hasConcept C77805123 @default.
- W4210324062 hasConcept C90559484 @default.
- W4210324062 hasConceptScore W4210324062C101738243 @default.
- W4210324062 hasConceptScore W4210324062C108583219 @default.
- W4210324062 hasConceptScore W4210324062C111919701 @default.
- W4210324062 hasConceptScore W4210324062C118505674 @default.
- W4210324062 hasConceptScore W4210324062C121332964 @default.
- W4210324062 hasConceptScore W4210324062C153180895 @default.
- W4210324062 hasConceptScore W4210324062C154945302 @default.
- W4210324062 hasConceptScore W4210324062C15744967 @default.
- W4210324062 hasConceptScore W4210324062C167966045 @default.
- W4210324062 hasConceptScore W4210324062C168900304 @default.
- W4210324062 hasConceptScore W4210324062C180747234 @default.
- W4210324062 hasConceptScore W4210324062C195704467 @default.
- W4210324062 hasConceptScore W4210324062C199360897 @default.
- W4210324062 hasConceptScore W4210324062C2777375102 @default.
- W4210324062 hasConceptScore W4210324062C2779302386 @default.
- W4210324062 hasConceptScore W4210324062C28490314 @default.
- W4210324062 hasConceptScore W4210324062C36951298 @default.
- W4210324062 hasConceptScore W4210324062C39890363 @default.
- W4210324062 hasConceptScore W4210324062C41008148 @default.
- W4210324062 hasConceptScore W4210324062C62520636 @default.
- W4210324062 hasConceptScore W4210324062C77805123 @default.
- W4210324062 hasConceptScore W4210324062C90559484 @default.
- W4210324062 hasFunder F4320321001 @default.
- W4210324062 hasIssue "4" @default.
- W4210324062 hasLocation W42103240621 @default.
- W4210324062 hasOpenAccess W4210324062 @default.
- W4210324062 hasPrimaryLocation W42103240621 @default.
- W4210324062 hasRelatedWork W1936111920 @default.
- W4210324062 hasRelatedWork W2080898270 @default.
- W4210324062 hasRelatedWork W2105112108 @default.
- W4210324062 hasRelatedWork W2377787444 @default.
- W4210324062 hasRelatedWork W2569748532 @default.
- W4210324062 hasRelatedWork W2903766720 @default.
- W4210324062 hasRelatedWork W3040110385 @default.
- W4210324062 hasRelatedWork W3165374368 @default.
- W4210324062 hasRelatedWork W3180556272 @default.
- W4210324062 hasRelatedWork W4210324062 @default.