Matches in SemOpenAlex for { <https://semopenalex.org/work/W3143179197> ?p ?o ?g. }
- W3143179197 abstract "ion layers for controlling facial animation. By controlling animation we understand the way in which user communicates with the computer system about changes necessary to produce an animation. The methods for controlling animation range from explicit control to highly automated control. In case of explicit control, the user has to specify all changes in the animated objects that are needed to generate the resulting animation. It means, he has to define changes in positions, shapes and attributes (e.g. colour, texture) of the objects. Changes in the positions and shapes can be applied by defining new positions for vertices, by defining mathematical operations that have to applied on the animated object (translation, scaling, rotation etc.), or by defining key-frames of animation and desired interpolation method. Explicit control of animation is the most basic, and the most straightforward to implement in the animation system, but at the same time, the most difficult for the animator. It is often used in typical 3D modelling software (e.g. 3D Studio Max, Blender, Maya etc.) Each higher level of animation control is supported by some form of knowledge base integration. It takes description provided by animator on a particular level of abstraction and translates it into explicit control parameters. Depending on the degree of separation from the explicit control parameters, the different levels of abstraction can be defined. An example of the highest possible level of facial animation control could be typing a text to be spoken by a virtual human. A system with high abstraction level of controlling animation should in turn produce a virtual character speaking the typed text and showing behaviourally appropriate facial expressions without any further input from the animator. Currently most of the systems for facial animation provide some higher level of animation control [86, 77, 123, 145]. Further in this section we describe various abstraction levels that are most often used in facial animation systems. 3.2.1 High Level Control of Animation The first step of separating the user from explicit modification of 3D geometries is achieved through introduction of model parameters. The parameters group some vertices of the facial mesh together, and allow for simultaneous changes in their positions. of implementing full OpenGL capabilities! 34 CHAPTER 3. COMPUTATIONAL TECHNIQUES Typically, the parameters are given some intuitive labels, which suggest the type of action performed (such as e.g. “mouth opening”, or “head rotation”). Accompanied with appropriate GUI element (most often a “slider”), they allow for straightforward modifications of the facial appearance. It is a big step away from explicit control of facial geometry, and allows for much more intuitive interaction with the modelling software. Therefore, to some extent it is readily facilitated in many of modern 3D modelling packages. However, as previously described in section 2.3.1, controlling face on this level of abstraction is still tedious, and can result in unrealistic facial expressions (on physiological, or behavioural level). In order to reduce the burden of animation, and to improve the results, many of the facial animation systems, move the user control to higher levels of abstraction. In such systems, the animator is presented with sets of predefined facial expressions related to emotions, conversational emblems, visemes etc. [37]. 
In a manner mirroring the previously described abstraction step, these predefined expressions are formed by grouping model parameters together. Often, new parameters are introduced which represent the intensity of a given expression. To some extent, this level of animation control is independent of the underlying facial model: the user operates only in terms of abstract facial expressions, which in turn are translated into changes of model parameters. The process of translation is model-dependent, but the description itself is not. In order to further simplify the animator’s task, timing dependencies between different expressions can be introduced. For example, a written word can be translated into a sequence of viseme-related facial expressions with appropriate onsets and offsets. The process may be further improved by using speech recognition to synchronise the movements to the recorded utterance [1]. Another typical example of automation at this level is the introduction of blinking at specified, slightly randomised time intervals. This is a physiologically motivated occurrence which can therefore be introduced into the animation flow without explicit user intervention [120]. 3.2.2 Scripting Languages Typically, in computer animation, the workflow is organised along the time-line, allowing the animator to place different occurrences at specified points in time. This mode of animation design is certainly the preferred one for people with a substantial amount of experience. However, most people think about the passage of time in terms of discourse elements rather than milliseconds or frames. We think of a smile appearing on someone’s face when he hears good news, or of nodding in response to a question. In this manner, the textual content of the conversation defines the time-scale and synchronises our facial activity. It is therefore desirable to allow facial animation to be anchored to the concepts of utterances, sentences, and dialogue actions. One possible approach to formalising the speech dependence of facial animation is to use a markup language to supplement the textual content. With such an approach, the facial animation is scripted rather than visually designed. One example of such an approach to facial animation (or rather, character animation) is the Virtual Human Markup Language (VHML) [101]. VHML defines a set of tags with their attributes, which can be placed at appropriate points in the text for further interpretation by the animation system. VHML is completely model agnostic, that is, it does not require any particular capabilities of the animation system. Depending on the system implementation, some parts of the markup are used for controlling animation, while others are ignored. VHML comprises the following sub-languages: • Facial Animation Markup Language (FAML) • Body Animation Markup Language (BAML) • Speech Markup Language (SML) • Dialogue Manager Markup Language (DMML) • Emotion Markup Language (EML) • HyperText Markup Language (HTML) In our case, the most interesting part of VHML is contained within FAML, which allows for intuitive control of the facial animation. Independently of the details of the scripting language specification, it must contain a set of predefined primitives which are used to generate animation. In FAML, these primitives are tags with a specific meaning and with associated facial expressions (or sequences thereof).
The collection of such primitives is very similar to what we would normally consider a dictionary: for each primitive, a description of its morphology, its syntax, and its semantics is provided. Such a nonverbal dictionary can be constructed either in a closed manner (as in the case of FAML), without the possibility of extending it, or in an open manner, as a system for gathering the facial expressions that are useful in a given application. In the case of an open nonverbal dictionary, an efficient way of dictionary look-up must be provided, both from the textual and the visual point of view [42]. 3.3 Knowledge Engineering Presentation of various mathematical methods used for processing and handling data. The models and algorithms described in the previous sections are directly related to the visual part of a facial animation system. They determine how to model and control facial geometry and how to represent additional attributes such as textures, surface colour, or lighting conditions. In this section we present the computational techniques that were used throughout our processing pipeline. We start with a description of the optimisation method that was used to model basic facial movements. Then we explain the concept of fuzzy logic, a problem-solving control methodology applied in our system to keep the facial model parameters within the allowed facial movement subspace. The last part of this section contains a description of two unsupervised methods: Principal Component Analysis (PCA) and Self-Organising Maps (SOM). Their aim was to prepare and process data collected from the recordings in order to extract blocks of frames with relevant facial expressions." @default.
- W3143179197 created "2021-04-13" @default.
- W3143179197 creator A5087061762 @default.
- W3143179197 date "2005-01-01" @default.
- W3143179197 modified "2023-09-26" @default.
- W3143179197 title "KNOWLEDGE DRIVEN FACIAL MODELLING" @default.
- W3143179197 cites W123208673 @default.
- W3143179197 cites W1490482062 @default.
- W3143179197 cites W1491778306 @default.
- W3143179197 cites W1496403746 @default.
- W3143179197 cites W1499137266 @default.
- W3143179197 cites W1503205402 @default.
- W3143179197 cites W1509796108 @default.
- W3143179197 cites W1510007267 @default.
- W3143179197 cites W1513692383 @default.
- W3143179197 cites W1521793179 @default.
- W3143179197 cites W1535804349 @default.
- W3143179197 cites W1539798903 @default.
- W3143179197 cites W1546206865 @default.
- W3143179197 cites W1554803342 @default.
- W3143179197 cites W1563178652 @default.
- W3143179197 cites W1568712909 @default.
- W3143179197 cites W1569375693 @default.
- W3143179197 cites W1580810326 @default.
- W3143179197 cites W1582580526 @default.
- W3143179197 cites W1588539311 @default.
- W3143179197 cites W1607796654 @default.
- W3143179197 cites W1609907549 @default.
- W3143179197 cites W1679913846 @default.
- W3143179197 cites W181171262 @default.
- W3143179197 cites W1841887418 @default.
- W3143179197 cites W1842100376 @default.
- W3143179197 cites W186053569 @default.
- W3143179197 cites W1864979434 @default.
- W3143179197 cites W1876090506 @default.
- W3143179197 cites W1964725106 @default.
- W3143179197 cites W1968773332 @default.
- W3143179197 cites W1970372638 @default.
- W3143179197 cites W1981710139 @default.
- W3143179197 cites W1985479696 @default.
- W3143179197 cites W1997396624 @default.
- W3143179197 cites W2000366549 @default.
- W3143179197 cites W2004195401 @default.
- W3143179197 cites W2007431552 @default.
- W3143179197 cites W2008208299 @default.
- W3143179197 cites W2014621385 @default.
- W3143179197 cites W2016859189 @default.
- W3143179197 cites W2022012885 @default.
- W3143179197 cites W2027278445 @default.
- W3143179197 cites W2030668560 @default.
- W3143179197 cites W2033353542 @default.
- W3143179197 cites W2045863238 @default.
- W3143179197 cites W2046999604 @default.
- W3143179197 cites W2052459168 @default.
- W3143179197 cites W2055864799 @default.
- W3143179197 cites W2063291721 @default.
- W3143179197 cites W2066142344 @default.
- W3143179197 cites W2066813690 @default.
- W3143179197 cites W2072578292 @default.
- W3143179197 cites W2078689334 @default.
- W3143179197 cites W2082065596 @default.
- W3143179197 cites W2086875012 @default.
- W3143179197 cites W2088875936 @default.
- W3143179197 cites W2096076356 @default.
- W3143179197 cites W2097127206 @default.
- W3143179197 cites W2097755513 @default.
- W3143179197 cites W2102177660 @default.
- W3143179197 cites W2102322178 @default.
- W3143179197 cites W2102416463 @default.
- W3143179197 cites W2104384690 @default.
- W3143179197 cites W2104407509 @default.
- W3143179197 cites W2104737001 @default.
- W3143179197 cites W2106390385 @default.
- W3143179197 cites W2106590237 @default.
- W3143179197 cites W2109888564 @default.
- W3143179197 cites W2109909694 @default.
- W3143179197 cites W2112060768 @default.
- W3143179197 cites W2115162244 @default.
- W3143179197 cites W2116883077 @default.
- W3143179197 cites W2117121484 @default.
- W3143179197 cites W2120654454 @default.
- W3143179197 cites W2122251926 @default.
- W3143179197 cites W2125478079 @default.
- W3143179197 cites W2125848778 @default.
- W3143179197 cites W2128561321 @default.
- W3143179197 cites W2128597598 @default.
- W3143179197 cites W2130702999 @default.
- W3143179197 cites W2132172443 @default.
- W3143179197 cites W2133180260 @default.
- W3143179197 cites W2133775535 @default.
- W3143179197 cites W2137080578 @default.
- W3143179197 cites W2138451337 @default.
- W3143179197 cites W2139941609 @default.
- W3143179197 cites W2140351517 @default.
- W3143179197 cites W2141666139 @default.
- W3143179197 cites W2143082784 @default.
- W3143179197 cites W2143875529 @default.
- W3143179197 cites W2148694408 @default.
- W3143179197 cites W2156428627 @default.
- W3143179197 cites W2156671261 @default.
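
The abstract above describes parameter-based facial control: a named parameter groups some mesh vertices so they can be displaced together, and a predefined expression is a weighted combination of such parameters scaled by one overall intensity. The following Python sketch illustrates that idea under assumed names; the class layout, parameter names, and expression table are illustrative assumptions, not the actual model from the thesis.

```python
import numpy as np

# Minimal sketch, assuming a vertex-offset model: a parameter displaces a
# group of vertices together, and an expression is a weighted set of
# parameter intensities. All names below are hypothetical.

class ParametricFace:
    def __init__(self, neutral_vertices):
        self.neutral = np.asarray(neutral_vertices, dtype=float)  # shape (N, 3)
        self.parameters = {}                                       # name -> (vertex ids, offsets)

    def add_parameter(self, name, vertex_ids, offsets):
        """Register a parameter that displaces a group of vertices together."""
        self.parameters[name] = (np.asarray(vertex_ids), np.asarray(offsets, dtype=float))

    def apply(self, parameter_values):
        """Deform the neutral mesh for a dict of parameter intensities in [0, 1]."""
        vertices = self.neutral.copy()
        for name, value in parameter_values.items():
            ids, offsets = self.parameters[name]
            vertices[ids] += value * offsets
        return vertices


# A predefined expression is a named set of parameter intensities; the extra
# "intensity" argument plays the role of the per-expression intensity
# parameter mentioned in the abstract.
EXPRESSIONS = {
    "smile":    {"mouth_corner_up": 1.0, "cheek_raise": 0.6},
    "surprise": {"brow_raise": 1.0, "mouth_opening": 0.7},
}

def expression_to_parameters(name, intensity=1.0):
    return {p: intensity * w for p, w in EXPRESSIONS[name].items()}


if __name__ == "__main__":
    face = ParametricFace(np.zeros((4, 3)))
    face.add_parameter("mouth_corner_up", [0, 1], [[0, 0.01, 0], [0, 0.01, 0]])
    face.add_parameter("cheek_raise",     [2, 3], [[0, 0.005, 0], [0, 0.005, 0]])
    print(face.apply(expression_to_parameters("smile", intensity=0.5)))
```

The translation step from abstract expressions to model parameters is the only model-dependent part here, which mirrors the abstract's point that the expression-level description itself is model-independent.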
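
The abstract also outlines markup-driven scripting in the style of VHML/FAML, where tags placed in the spoken text are interpreted by the animation system as timed facial expressions. The sketch below shows one plausible interpretation step; the tag names, attributes, and the words-per-second timing heuristic are hypothetical and are not the actual FAML vocabulary.

```python
import xml.etree.ElementTree as ET

# Minimal sketch, assuming hypothetical expression tags embedded in the text.
# Plain words advance a simple speech clock; marked-up spans additionally
# emit an expression event with onset and offset times.

MARKUP = """
<utterance>
  I have <smile intensity="0.8">really good news</smile>
  about the <surprised>results</surprised>.
</utterance>
"""

WORDS_PER_SECOND = 2.5  # crude speech-rate assumption used only for this sketch

def schedule_expressions(markup):
    root = ET.fromstring(markup)
    events, t = [], 0.0

    def advance(text):
        # Advance the clock by the estimated duration of the given words.
        nonlocal t
        t += len(text.split()) / WORDS_PER_SECOND if text else 0.0

    advance(root.text)
    for element in root:
        start = t
        advance(element.text)
        events.append({
            "expression": element.tag,
            "intensity": float(element.get("intensity", "1.0")),
            "onset": round(start, 2),
            "offset": round(t, 2),
        })
        advance(element.tail)
    return events

print(schedule_expressions(MARKUP))
```

A real system would replace the word-count heuristic with timings from speech synthesis or speech recognition, as the abstract notes for viseme synchronisation.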
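
Finally, the abstract mentions PCA (together with Self-Organising Maps) as an unsupervised preprocessing step for extracting blocks of frames with relevant facial expressions from recordings. The sketch below shows one way PCA could be used to flag such blocks; the feature representation, threshold, and grouping rule are assumptions made for illustration, and the SOM stage is omitted.

```python
import numpy as np
from sklearn.decomposition import PCA

# Minimal sketch, assuming each frame is a flattened feature vector
# (e.g. facial landmark positions). Frames far from the mean pose in the
# PCA-reduced space are flagged as active and grouped into blocks.

def active_frame_blocks(frames, n_components=5, threshold=2.0):
    """frames: (n_frames, n_features) array; returns a list of (start, end) blocks."""
    reduced = PCA(n_components=n_components).fit_transform(frames)
    distance = np.linalg.norm(reduced, axis=1)      # distance from the mean pose
    active = distance > threshold * distance.std()  # crude activity threshold
    blocks, start = [], None
    for i, flag in enumerate(active):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            blocks.append((start, i - 1))
            start = None
    if start is not None:
        blocks.append((start, len(active) - 1))
    return blocks

# Synthetic example: mostly neutral frames with a burst of activity.
rng = np.random.default_rng(0)
frames = rng.normal(0.0, 0.01, size=(300, 60))
frames[120:150] += 0.5  # simulated expression between frames 120 and 149
print(active_frame_blocks(frames))
```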