Matches in SemOpenAlex for { <https://semopenalex.org/work/W2296306509> ?p ?o ?g. }
Showing items 1 to 43 of 43, with 100 items per page.
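The listing below is the result of the quad pattern shown in the header, evaluated against the SemOpenAlex dataset. As a minimal sketch of how such a result could be retrieved programmatically, the following Python snippet reconstructs an equivalent SELECT query and shows how it could be sent to the public SemOpenAlex SPARQL endpoint (the endpoint URL and result format parameter are assumptions, not confirmed by this listing):

```python
# Hedged sketch: build a SELECT query equivalent to the quad pattern
# { <work> ?p ?o ?g . } and (optionally) send it to the SemOpenAlex
# SPARQL endpoint. The endpoint URL below is an assumption.
import json
import urllib.parse
import urllib.request

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint


def build_query(work_iri: str) -> str:
    # The ?g variable in the header is the named graph; in standard
    # SPARQL that is expressed with a GRAPH clause.
    return (
        "SELECT ?p ?o ?g WHERE { GRAPH ?g { "
        f"<{work_iri}> ?p ?o . }} }}"
    )


def fetch_bindings(work_iri: str):
    # Network call; only works with internet access and a live endpoint.
    query = build_query(work_iri)
    url = ENDPOINT + "?" + urllib.parse.urlencode(
        {"query": query, "format": "application/sparql-results+json"}
    )
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["results"]["bindings"]


print(build_query("https://semopenalex.org/work/W2296306509"))
```

Each row returned by such a query corresponds to one of the predicate/object lines below; the `@default` suffix on every line indicates the default graph.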
- W2296306509 abstract "If a moving image is more expressive than words or than a still image, then an animated facial expression can explain in more depth the feelings of a virtual character. Facial animation has been used in many applications, from entertainment to research on virtual humans and tele-presence. The aim of most approaches is to achieve high degrees of realism of virtual characters and is supplemented by complex models of kinematics, muscle movement, movement of clothing, as well as cognition and behavioral models. Video avatars and image-based techniques are also used for creating virtual humans. However, the complexity of the geometric and physically simulated facial models used by the above methods makes them unsuitable for use in distributed collaborative virtual environments running on low-bandwidth networks or over the internet. Therefore, the majority of approaches for such environments use simplified models of virtual humans, with the obvious disadvantage of lower degrees of realism. The Reflective Textures method, presented in this paper, makes use of textures of images of either synthetic faces or faces captured from video, together with a simple low-polygon face/head model. It provides an interactive way of fine-tuning and adjusting the underlying model to allow a more realistic mapping for a specific facial image. Furthermore, it concentrates on creating facial expressions by manipulation of the texture. Facial expressions such as Fear, Happiness, Melancholy, or Surprise, blinking of the eyes, and movement of the pupils are automatically achieved for any mapped facial image by the system." @default.
- W2296306509 created "2016-06-24" @default.
- W2296306509 creator A5075061575 @default.
- W2296306509 date "2001-11-05" @default.
- W2296306509 modified "2023-09-25" @default.
- W2296306509 title "Expressive textures" @default.
- W2296306509 cites W2001936867 @default.
- W2296306509 cites W2024473482 @default.
- W2296306509 cites W2029916517 @default.
- W2296306509 cites W2132172443 @default.
- W2296306509 cites W2133575725 @default.
- W2296306509 cites W2236592196 @default.
- W2296306509 cites W4231537718 @default.
- W2296306509 cites W4233118564 @default.
- W2296306509 doi "https://doi.org/10.1145/513867.513897" @default.
- W2296306509 hasPublicationYear "2001" @default.
- W2296306509 type Work @default.
- W2296306509 sameAs 2296306509 @default.
- W2296306509 citedByCount "2" @default.
- W2296306509 countsByYear W22963065092014 @default.
- W2296306509 crossrefType "proceedings-article" @default.
- W2296306509 hasAuthorship W2296306509A5075061575 @default.
- W2296306509 hasConcept C154945302 @default.
- W2296306509 hasConcept C41008148 @default.
- W2296306509 hasConceptScore W2296306509C154945302 @default.
- W2296306509 hasConceptScore W2296306509C41008148 @default.
- W2296306509 hasLocation W22963065091 @default.
- W2296306509 hasOpenAccess W2296306509 @default.
- W2296306509 hasPrimaryLocation W22963065091 @default.
- W2296306509 hasRelatedWork W2093578348 @default.
- W2296306509 hasRelatedWork W2350741829 @default.
- W2296306509 hasRelatedWork W2358668433 @default.
- W2296306509 hasRelatedWork W2376932109 @default.
- W2296306509 hasRelatedWork W2382290278 @default.
- W2296306509 hasRelatedWork W2390279801 @default.
- W2296306509 hasRelatedWork W2748952813 @default.
- W2296306509 hasRelatedWork W2899084033 @default.
- W2296306509 hasRelatedWork W3004735627 @default.
- W2296306509 hasRelatedWork W3107474891 @default.
- W2296306509 isParatext "false" @default.
- W2296306509 isRetracted "false" @default.
- W2296306509 magId "2296306509" @default.
- W2296306509 workType "article" @default.