Matches in SemOpenAlex for { <https://semopenalex.org/work/W2549029155> ?p ?o ?g. }
- W2549029155 endingPage "104" @default.
- W2549029155 startingPage "80" @default.
- W2549029155 abstract "As a result of the convergence of different services delivered over the internet protocol, internet protocol television (IPTV) may be regarded as one of the most widespread user interfaces accepted by a highly diverse user domain. Every generation, from children to the elderly, can use IPTV for recreation, as well as for gaining social contact and stimulating the mind. However, technological advances in digital platforms go hand in hand with the complexity of their user interfaces, and thus induce technological disinterest and technological exclusion. Therefore, interactivity and affective content presentation are, from the perspective of advanced user interfaces, two key factors in any application incorporating human-computer interaction (HCI). Furthermore, the perception and understanding of the information (meaning) conveyed are closely interlinked with the visual cues and non-verbal elements that speakers generate throughout human-human dialogues. In this regard, co-verbal behavior contributes information to the communicative act. It supports the speaker's communicative goal and allows a variety of other information to be added to his/her messages, including (but not limited to) psychological states, attitudes, and personality. In the present paper, we address complexity and technological disinterest through the integration of natural, human-like multimodal output that incorporates a novel combined data- and rule-driven co-verbal behavior generator able to extract features from unannotated, general text. The core of the paper discusses the processes that model and synchronize non-verbal features with verbal features, even when dealing with unknown context and/or limited contextual information. In addition, the proposed algorithm incorporates data-driven (speech prosody, repository of motor skills) and rule-based concepts (grammar, gesticon).
The algorithm first classifies the communicative intent, then plans the co-verbal cues and their form within the gesture unit, generates temporally synchronized co-verbal cues, and finally realizes them in the form of human-like co-verbal movements. In this way, the information can be represented as co-verbal cues that are both meaningfully and temporally synchronized with the accompanying synthesized speech, using the communication channels to which people are most accustomed. Highlights: Automatic planning, designing, and recreation of co-verbal behavior for the Smart IPTV system UMB-SmartTV. Procedures and algorithms for modeling the conversational dialog. A TTS- and data-driven expressive model for generating co-verbal behavior. Semiotic classification of intent incorporating linguistic and prosodic cues. Visual prosody reflecting features of the speech signal and the context of the input text." @default.
- W2549029155 created "2016-11-11" @default.
- W2549029155 creator A5014206455 @default.
- W2549029155 creator A5072645144 @default.
- W2549029155 creator A5074475740 @default.
- W2549029155 date "2017-01-01" @default.
- W2549029155 modified "2023-09-24" @default.
- W2549029155 title "The TTS-driven affective embodied conversational agent EVA, based on a novel conversational-behavior generation algorithm" @default.
- W2549029155 cites W102610402 @default.
- W2549029155 cites W114912394 @default.
- W2549029155 cites W1157478223 @default.
- W2549029155 cites W1494506880 @default.
- W2549029155 cites W1501420047 @default.
- W2549029155 cites W1547861087 @default.
- W2549029155 cites W1552503841 @default.
- W2549029155 cites W1570895899 @default.
- W2549029155 cites W1598056941 @default.
- W2549029155 cites W1650169135 @default.
- W2549029155 cites W1762175323 @default.
- W2549029155 cites W184529054 @default.
- W2549029155 cites W1863110574 @default.
- W2549029155 cites W1874400086 @default.
- W2549029155 cites W1918373437 @default.
- W2549029155 cites W1938008383 @default.
- W2549029155 cites W1970354675 @default.
- W2549029155 cites W1971901154 @default.
- W2549029155 cites W1973183478 @default.
- W2549029155 cites W1974597084 @default.
- W2549029155 cites W1980763452 @default.
- W2549029155 cites W1987161267 @default.
- W2549029155 cites W1994616049 @default.
- W2549029155 cites W1996923097 @default.
- W2549029155 cites W1997535568 @default.
- W2549029155 cites W2000749609 @default.
- W2549029155 cites W2001221022 @default.
- W2549029155 cites W2007099190 @default.
- W2549029155 cites W2009955203 @default.
- W2549029155 cites W2017354274 @default.
- W2549029155 cites W2019969759 @default.
- W2549029155 cites W2026096544 @default.
- W2549029155 cites W2027454453 @default.
- W2549029155 cites W2027834570 @default.
- W2549029155 cites W2032233641 @default.
- W2549029155 cites W2033068372 @default.
- W2549029155 cites W2036538621 @default.
- W2549029155 cites W2039428987 @default.
- W2549029155 cites W2046834291 @default.
- W2549029155 cites W2048557880 @default.
- W2549029155 cites W2053694096 @default.
- W2549029155 cites W2059216172 @default.
- W2549029155 cites W2072291547 @default.
- W2549029155 cites W2080702557 @default.
- W2549029155 cites W2087003892 @default.
- W2549029155 cites W2112140734 @default.
- W2549029155 cites W2119048633 @default.
- W2549029155 cites W2119155541 @default.
- W2549029155 cites W2119817702 @default.
- W2549029155 cites W2126822690 @default.
- W2549029155 cites W2127597821 @default.
- W2549029155 cites W2137986763 @default.
- W2549029155 cites W2142508276 @default.
- W2549029155 cites W2145920368 @default.
- W2549029155 cites W2150789257 @default.
- W2549029155 cites W2160524967 @default.
- W2549029155 cites W2161676006 @default.
- W2549029155 cites W2165271834 @default.
- W2549029155 cites W2167171913 @default.
- W2549029155 cites W2252093555 @default.
- W2549029155 cites W2277088878 @default.
- W2549029155 cites W2283705147 @default.
- W2549029155 cites W2345627070 @default.
- W2549029155 cites W237246548 @default.
- W2549029155 cites W256480828 @default.
- W2549029155 cites W302528027 @default.
- W2549029155 cites W409253741 @default.
- W2549029155 cites W4239664927 @default.
- W2549029155 cites W64064869 @default.
- W2549029155 cites W932206313 @default.
- W2549029155 cites W992055789 @default.
- W2549029155 doi "https://doi.org/10.1016/j.engappai.2016.10.006" @default.
- W2549029155 hasPublicationYear "2017" @default.
- W2549029155 type Work @default.
- W2549029155 sameAs 2549029155 @default.
- W2549029155 citedByCount "23" @default.
- W2549029155 countsByYear W25490291552018 @default.
- W2549029155 countsByYear W25490291552019 @default.
- W2549029155 countsByYear W25490291552020 @default.
- W2549029155 countsByYear W25490291552021 @default.
- W2549029155 countsByYear W25490291552022 @default.
- W2549029155 countsByYear W25490291552023 @default.
- W2549029155 crossrefType "journal-article" @default.
- W2549029155 hasAuthorship W2549029155A5014206455 @default.
- W2549029155 hasAuthorship W2549029155A5072645144 @default.
- W2549029155 hasAuthorship W2549029155A5074475740 @default.
- W2549029155 hasConcept C100609095 @default.
- W2549029155 hasConcept C107457646 @default.
- W2549029155 hasConcept C11413529 @default.
- W2549029155 hasConcept C136764020 @default.
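The abstract above outlines a four-stage behavior-generation pipeline: classify the communicative intent, plan the co-verbal cues and their form, synchronize them temporally with speech, and realize them as movement. A minimal, purely illustrative Python sketch of that staging follows; every function name and data shape here is an assumption for illustration, not the paper's actual API, and the toy heuristics stand in for the paper's data-driven and rule-based models:

```python
from dataclasses import dataclass
from typing import List

# Hypothetical data shape -- the paper's real representations
# (gesture units, gesticon entries, prosodic features) are far richer.
@dataclass
class Cue:
    word: str
    intent: str          # semiotic class of the communicative intent
    start_ms: int = 0    # onset aligned to the synthesized speech
    realized: bool = False

def classify_intent(text: str) -> List[Cue]:
    """Stage 1: assign each token a (toy) semiotic intent class."""
    deictic = {"this", "that", "here", "there"}
    return [Cue(w, "deictic" if w.lower() in deictic else "discourse")
            for w in text.split()]

def plan_cues(cues: List[Cue]) -> List[Cue]:
    """Stage 2: keep only tokens that warrant a co-verbal gesture
    (stand-in rule for the paper's grammar/gesticon planning)."""
    return [c for c in cues if c.intent == "deictic" or len(c.word) > 6]

def synchronize(cues: List[Cue], ms_per_word: int = 300) -> List[Cue]:
    """Stage 3: align gesture onsets to a stub speech-prosody timeline."""
    for i, c in enumerate(cues):
        c.start_ms = i * ms_per_word
    return cues

def realize(cues: List[Cue]) -> List[Cue]:
    """Stage 4: mark cues as rendered human-like co-verbal movements."""
    for c in cues:
        c.realized = True
    return cues

def generate_behavior(text: str) -> List[Cue]:
    """Run the four stages in the order the abstract describes."""
    return realize(synchronize(plan_cues(classify_intent(text))))
```

For example, `generate_behavior("look at that interesting demonstration")` retains the deictic "that" plus the two longer content words, each with a staggered onset and a realized flag set, mirroring the classify-plan-synchronize-realize ordering.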