Matches in SemOpenAlex for { <https://semopenalex.org/work/W2286118897> ?p ?o ?g. }
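Such a match listing can be reproduced programmatically. Below is a minimal sketch using the SPARQLWrapper package; the endpoint URL (https://semopenalex.org/sparql) and the quad-pattern formulation with GRAPH are assumptions about how SemOpenAlex exposes the `?p ?o ?g` pattern above, not details taken from this listing.

```python
# Minimal sketch: fetch all (predicate, object, graph) matches for the work.
# The endpoint URL is an assumption; adjust it if SemOpenAlex publishes a
# different SPARQL address.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint
WORK = "https://semopenalex.org/work/W2286118897"

sparql = SPARQLWrapper(ENDPOINT)
sparql.setReturnFormat(JSON)
sparql.setQuery(f"""
    SELECT ?p ?o ?g WHERE {{
        GRAPH ?g {{ <{WORK}> ?p ?o . }}
    }}
""")

# Print one line per match, mirroring the listing below.
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"], row["g"]["value"])
```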
- W2286118897 startingPage "856" @default.
- W2286118897 abstract "An Operational Model of Joint Attention - Timing of Gaze Patterns in Interactions between Humans and a Virtual Human Nadine Pfeiffer-Lessmann (nlessman@techfak.uni-bielefeld.de) Thies Pfeiffer (tpfeiffe@techfak.uni-bielefeld.de) Ipke Wachsmuth (ipke@techfak.uni-bielefeld.de) Artificial Intelligence Group, Faculty of Technology, Bielefeld University, Bielefeld, Germany Abstract We constructed an operational model of joint atten- tion (Pfeiffer-Lessmann & Wachsmuth, 2009) for our vir- tual human Max (Lessmann, Kopp, & Wachsmuth, 2006) to create a more natural and effective interaction partner. The model covers four phases: the initiate-act (1), the respond-act (2), the feedback phase (3), and the focus-state (4). However, for Max to apppear believable and to use the same behavior patterns in the phases as humans do, investigations on time- frames, human expectations and insights on how humans ac- tually perceive his behavior are indispensable. The topic of concrete reaction and duration times of feedback behaviors during the joint attention process has to our knowledge not been discussed in the area of human-computer interaction yet. The time-frames and expectations of humans for natural in- teractions are central subject of this paper. In the section to follow, we provide an overview on related work covering research on joint attention in human-human interaction and in the area of technical systems. In the sub- sequent ”Model” section, a brief summary of our own defini- tion of joint attention is provided. Next, we present a study in immersive virtual reality concerning the exact timing of the first phase, the initiate-act, of our joint attention model. Thereafter, results are discussed and the paper ends with our conclusions and future work. Joint attention has been identified as a foundational skill in human-human interaction. If virtual humans are to engage in joint attention, they have to meet the expectations of their human interaction partner and provide interactional signals in a natural way. This requires operational models of joint at- tention with precise information on natural gaze timing. We substantiate our model of the joint attention process by study- ing human-agent interactions in immersive virtual reality and present results on the timing of referential gaze during the ini- tiation of joint attention. Keywords: joint attention; virtual humans; social interaction Introduction Attention has been characterized as an increased aware- ness (Brinck, 2003) and intentionally directed perception (Tomasello, Carpenter, Call, Behne, & Moll, 2005) and is judged to be crucial for goal-directed behavior. Joint atten- tion builds on attentional processes and has been identified to be a foundational skill in communication and interaction. The term joint attention is often used confusably with shared at- tention. We follow Kaplan and Hafner (2006) and Tomasello et al. (2005) in using the term joint attention for the phe- nomenon which presupposes a higher level of interactivity re- quiring intentional behavior and an awareness of the interac- tion partner. Joint attention can be defined as simultaneously allocating attention to a target as a consequence of attending to each other’s attentional states (Deak, Fasel, & Movellan, 2001). 
In contrast, we see shared attention (as well as shared gaze) as the state in which interactants are just perceiving the same object simultaneously without further constraints con- cerning their mental states or their interaction history. Mundy and Newell (2007) differentiate joint attention be- haviors into two categories: responses to the bids of others and spontaneous initiations. Responding to joint attention refers to the ability to follow the direction of gaze and ges- tures of others in order to share a reference. On the other hand, to initiate joint attention humans use gestures and eye contact to direct the attention of others to objects, events, and to themselves. For joint attention, interlocutors have to deliberatively fo- cus on the same target while being mutually aware of shar- ing their focus of attention (Tomasello et al., 2005; Hobson, 2005). To this end, respond and feedback behaviors are nec- essary. Tasker and Schmidt (2008) argue that to establish joint attention a sequence of behaviors is required which has to meet certain time constraints. Related Work Staudte and Crocker (2011) raise the question whether joint- attention-like behavior is unique to human-human interaction or whether such behaviors can play a similar role in human- robot interaction. They conclude that their own findings sug- gest that humans treat artificial interaction partners similar to humans and that it is therefore valid to investigate joint atten- tion in settings with artificial agents. These artificial agents can consist, on the one hand, of robots (Deak et al., 2001; Imai, Ono, & Ishiguro, 2003; Breazeal et al., 2004; Doniec, Sun, & Scassellati, 2006; Na- gai, Asada, & Hosoda, 2006; Yu, Schermerhorn, & Scheutz, 2012; Huang & Thomaz, 2011; Staudte & Crocker, 2011) and, on the other hand, of virtual humans (Peters, Asteri- adis, & Karpouzis, 2009; Zhang, Fricker, & Yu, 2010; Bailly, Raidt, & Elisei, 2010). Kaplan and Hafner (2006) point out that research in robotics concentrates only on partial and isolated elements of joint attention (e.g. gaze following, simultaneous looking or simple coordinated behavior) covering solely the surface of the process but not addressing the deeper, more cognitive" @default.
- W2286118897 created "2016-06-24" @default.
- W2286118897 creator A5010063991 @default.
- W2286118897 creator A5044005817 @default.
- W2286118897 creator A5084560775 @default.
- W2286118897 date "2012-01-01" @default.
- W2286118897 modified "2023-09-27" @default.
- W2286118897 title "An Operational Model of Joint Attention - Timing of Gaze Patterns in Interactions between Humans and a Virtual Human" @default.
- W2286118897 cites W1500777653 @default.
- W2286118897 cites W1550809802 @default.
- W2286118897 cites W1625049564 @default.
- W2286118897 cites W1708434726 @default.
- W2286118897 cites W2011653618 @default.
- W2286118897 cites W2025954572 @default.
- W2286118897 cites W2026429632 @default.
- W2286118897 cites W2039954205 @default.
- W2286118897 cites W2043004060 @default.
- W2286118897 cites W2078845431 @default.
- W2286118897 cites W2106980598 @default.
- W2286118897 cites W2113170322 @default.
- W2286118897 cites W2125803770 @default.
- W2286118897 cites W2130813755 @default.
- W2286118897 cites W2144576651 @default.
- W2286118897 cites W2160145433 @default.
- W2286118897 cites W2160324900 @default.
- W2286118897 cites W2276069831 @default.
- W2286118897 cites W2540886653 @default.
- W2286118897 cites W85923177 @default.
- W2286118897 cites W187124194 @default.
- W2286118897 hasPublicationYear "2012" @default.
- W2286118897 type Work @default.
- W2286118897 sameAs 2286118897 @default.
- W2286118897 citedByCount "5" @default.
- W2286118897 countsByYear W22861188972013 @default.
- W2286118897 countsByYear W22861188972014 @default.
- W2286118897 countsByYear W22861188972017 @default.
- W2286118897 countsByYear W22861188972018 @default.
- W2286118897 countsByYear W22861188972021 @default.
- W2286118897 crossrefType "proceedings-article" @default.
- W2286118897 hasAuthorship W2286118897A5010063991 @default.
- W2286118897 hasAuthorship W2286118897A5044005817 @default.
- W2286118897 hasAuthorship W2286118897A5084560775 @default.
- W2286118897 hasConcept C107457646 @default.
- W2286118897 hasConcept C111919701 @default.
- W2286118897 hasConcept C127413603 @default.
- W2286118897 hasConcept C137878579 @default.
- W2286118897 hasConcept C138496976 @default.
- W2286118897 hasConcept C150303390 @default.
- W2286118897 hasConcept C154945302 @default.
- W2286118897 hasConcept C15744967 @default.
- W2286118897 hasConcept C166957645 @default.
- W2286118897 hasConcept C170154142 @default.
- W2286118897 hasConcept C18555067 @default.
- W2286118897 hasConcept C194969405 @default.
- W2286118897 hasConcept C205778803 @default.
- W2286118897 hasConcept C2776608160 @default.
- W2286118897 hasConcept C2779916870 @default.
- W2286118897 hasConcept C41008148 @default.
- W2286118897 hasConcept C95457728 @default.
- W2286118897 hasConcept C98045186 @default.
- W2286118897 hasConceptScore W2286118897C107457646 @default.
- W2286118897 hasConceptScore W2286118897C111919701 @default.
- W2286118897 hasConceptScore W2286118897C127413603 @default.
- W2286118897 hasConceptScore W2286118897C137878579 @default.
- W2286118897 hasConceptScore W2286118897C138496976 @default.
- W2286118897 hasConceptScore W2286118897C150303390 @default.
- W2286118897 hasConceptScore W2286118897C154945302 @default.
- W2286118897 hasConceptScore W2286118897C15744967 @default.
- W2286118897 hasConceptScore W2286118897C166957645 @default.
- W2286118897 hasConceptScore W2286118897C170154142 @default.
- W2286118897 hasConceptScore W2286118897C18555067 @default.
- W2286118897 hasConceptScore W2286118897C194969405 @default.
- W2286118897 hasConceptScore W2286118897C205778803 @default.
- W2286118897 hasConceptScore W2286118897C2776608160 @default.
- W2286118897 hasConceptScore W2286118897C2779916870 @default.
- W2286118897 hasConceptScore W2286118897C41008148 @default.
- W2286118897 hasConceptScore W2286118897C95457728 @default.
- W2286118897 hasConceptScore W2286118897C98045186 @default.
- W2286118897 hasIssue "34" @default.
- W2286118897 hasLocation W22861188971 @default.
- W2286118897 hasOpenAccess W2286118897 @default.
- W2286118897 hasPrimaryLocation W22861188971 @default.
- W2286118897 hasRelatedWork W2036130142 @default.
- W2286118897 hasRelatedWork W2039954205 @default.
- W2286118897 hasRelatedWork W205311872 @default.
- W2286118897 hasRelatedWork W2115918068 @default.
- W2286118897 hasRelatedWork W2150839178 @default.
- W2286118897 hasRelatedWork W2167950024 @default.
- W2286118897 hasRelatedWork W2440172120 @default.
- W2286118897 hasRelatedWork W2495595815 @default.
- W2286118897 hasRelatedWork W2543757304 @default.
- W2286118897 hasRelatedWork W2555767354 @default.
- W2286118897 hasRelatedWork W2758009964 @default.
- W2286118897 hasRelatedWork W2899686780 @default.
- W2286118897 hasRelatedWork W2945923602 @default.
- W2286118897 hasRelatedWork W3022971079 @default.
- W2286118897 hasRelatedWork W3183939435 @default.
- W2286118897 hasRelatedWork W3192452456 @default.
- W2286118897 hasRelatedWork W2168314650 @default.
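The abstract above describes a joint attention process with four phases: initiate-act, respond-act, feedback, and focus-state, driven by timing constraints. The following is a hypothetical sketch of how such a phase sequence might be modelled as a small state machine; the phase names follow the paper, but the transition events, the timeout value, and all function names are illustrative assumptions, not details reported by the authors.

```python
# Hypothetical sketch of the four-phase joint attention process named in the
# abstract: initiate-act -> respond-act -> feedback -> focus-state.
# Transition events and the timeout are illustrative assumptions.
from enum import Enum, auto

class Phase(Enum):
    INITIATE_ACT = auto()   # agent directs gaze/gesture at the target
    RESPOND_ACT = auto()    # partner follows the referential gaze
    FEEDBACK = auto()       # mutual gaze confirms the shared reference
    FOCUS_STATE = auto()    # both interactants attend to the target
    FAILED = auto()         # partner did not react within the time constraint

# Allowed transitions, keyed by (current phase, observed event).
TRANSITIONS = {
    (Phase.INITIATE_ACT, "partner_followed_gaze"): Phase.RESPOND_ACT,
    (Phase.RESPOND_ACT, "feedback_given"): Phase.FEEDBACK,
    (Phase.FEEDBACK, "mutual_awareness"): Phase.FOCUS_STATE,
}

def step(phase: Phase, event: str, elapsed_s: float,
         timeout_s: float = 2.0) -> Phase:
    """Advance the phase machine; abort if the partner reacts too slowly.

    timeout_s is a placeholder; the paper's point is precisely that such
    time-frames must be measured empirically.
    """
    if elapsed_s > timeout_s:
        return Phase.FAILED
    return TRANSITIONS.get((phase, event), phase)

# Example run through a successful episode:
p = Phase.INITIATE_ACT
for ev, t in [("partner_followed_gaze", 0.8),
              ("feedback_given", 0.4),
              ("mutual_awareness", 0.3)]:
    p = step(p, ev, t)
print(p)  # Phase.FOCUS_STATE
```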