Matches in SemOpenAlex for { <https://semopenalex.org/work/W32085468> ?p ?o ?g. }
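The quad pattern above can be reproduced as a standard SPARQL query and sent to the public SemOpenAlex endpoint. A minimal sketch in Python, assuming the endpoint lives at https://semopenalex.org/sparql and speaks the standard SPARQL protocol over HTTP GET (the endpoint URL is an assumption, not stated in this listing, and should be verified against the SemOpenAlex documentation):

```python
import requests

# Assumed endpoint URL -- verify against the SemOpenAlex documentation.
ENDPOINT = "https://semopenalex.org/sparql"

# Standard-SPARQL form of the quad pattern { <work> ?p ?o ?g. }:
# every predicate/object pair recorded for W32085468, plus the named
# graph each triple comes from.
QUERY = """
SELECT ?p ?o ?g WHERE {
  GRAPH ?g { <https://semopenalex.org/work/W32085468> ?p ?o . }
}
"""

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()

# SPARQL JSON results: one binding dict per matched row.
for row in resp.json()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"], row["g"]["value"])
```

The matches returned for this pattern are listed below.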
- W32085468 abstract "This thesis addresses the problem of tracking the focus of attention of people. In particular, a system to track the focus of attention of participants in meetings is developed. Obtaining knowledge about a person’s focus of attention is an important step towards a better understanding of what people do, how and with what or whom they interact or to what they refer. In meetings, focus of attention can be used to disambiguate the addressees of speech acts, to analyze interaction and for indexing of meeting transcripts. Tracking a user’s focus of attention also greatly contributes to the improvement of human-computer interfaces since it can be used to build interfaces and environments that become aware of what the user is paying attention to or with what or whom he is interacting. The direction in which people look; i.e., their gaze, is closely related to their focus of attention. In this thesis, we estimate a subject’s focus of attention based on his or her head orientation. While the direction in which someone looks is determined by head orientation and eye gaze, relevant literature suggests that head orientation alone is a sufficient cue for the detection of someone’s direction of attention during social interaction. We present experimental results from a user study and from several recorded meetings that support this hypothesis. We have developed a Bayesian approach to model at whom or what someone is looking based on his or her head orientation. To estimate head orientations in meetings, the participants’ faces are automatically tracked in the view of a panoramic camera and neural networks are used to estimate their head orientations from pre-processed images of their faces. Using this approach, the focus of attention target of subjects could be correctly identified during 73% of the time in a number of evaluation meetings with four participants. In addition, we have investigated whether a person’s focus of attention can be predicted from other cues. Our results show that focus of attention is correlated to who is speaking in a meeting and that it is possible to predict a person’s focus of attention based on the information of who is talking or was talking before a given moment. We have trained neural networks to predict at whom a person is looking, based on information about who was speaking. Using this approach we were able to predict who is looking at whom with 63% accuracy on the evaluation meetings using only information about who was speaking. We show that by using both head orientation and speaker information to estimate a person’s focus, the accuracy of focus detection can be improved compared to just using one of the modalities for focus estimation. To demonstrate the generality of our approach, we have built a prototype system to demonstrate focus-aware interaction with a household robot and other smart appliances in a room using the developed components for focus of attention tracking. In the demonstration environment, a subject could interact with a simulated household robot, a speech-enabled VCR or with other people in the room, and the recipient of the subject’s speech was disambiguated based on the user’s direction of attention." @default.
- W32085468 created "2016-06-24" @default.
- W32085468 creator A5087051920 @default.
- W32085468 date "2002-01-01" @default.
- W32085468 modified "2023-09-27" @default.
- W32085468 title "Tracking and modeling focus of attention in meetings." @default.
- W32085468 cites W103561777 @default.
- W32085468 cites W1497599070 @default.
- W32085468 cites W1507007223 @default.
- W32085468 cites W1510226565 @default.
- W32085468 cites W1520957522 @default.
- W32085468 cites W1531905487 @default.
- W32085468 cites W1535782774 @default.
- W32085468 cites W1553305517 @default.
- W32085468 cites W1554663460 @default.
- W32085468 cites W1557036733 @default.
- W32085468 cites W1563178652 @default.
- W32085468 cites W1565415357 @default.
- W32085468 cites W1571461735 @default.
- W32085468 cites W1595229305 @default.
- W32085468 cites W162958911 @default.
- W32085468 cites W1722608041 @default.
- W32085468 cites W1773016195 @default.
- W32085468 cites W179500209 @default.
- W32085468 cites W1795528953 @default.
- W32085468 cites W1829958643 @default.
- W32085468 cites W1932558918 @default.
- W32085468 cites W1948162158 @default.
- W32085468 cites W1968080309 @default.
- W32085468 cites W1970320256 @default.
- W32085468 cites W1972212822 @default.
- W32085468 cites W1978337346 @default.
- W32085468 cites W1980501707 @default.
- W32085468 cites W1983313055 @default.
- W32085468 cites W1995302125 @default.
- W32085468 cites W2004496150 @default.
- W32085468 cites W2008779793 @default.
- W32085468 cites W2011126660 @default.
- W32085468 cites W2012992404 @default.
- W32085468 cites W201959675 @default.
- W32085468 cites W2023205241 @default.
- W32085468 cites W2024436171 @default.
- W32085468 cites W2045370322 @default.
- W32085468 cites W2054802006 @default.
- W32085468 cites W2067447093 @default.
- W32085468 cites W2076332785 @default.
- W32085468 cites W2077324971 @default.
- W32085468 cites W2078298881 @default.
- W32085468 cites W2078385108 @default.
- W32085468 cites W2080410062 @default.
- W32085468 cites W2080896322 @default.
- W32085468 cites W2081299070 @default.
- W32085468 cites W2093353037 @default.
- W32085468 cites W2098154993 @default.
- W32085468 cites W2098693229 @default.
- W32085468 cites W2098947662 @default.
- W32085468 cites W2101393962 @default.
- W32085468 cites W2103289133 @default.
- W32085468 cites W2103508613 @default.
- W32085468 cites W2107738794 @default.
- W32085468 cites W2110129072 @default.
- W32085468 cites W2110179992 @default.
- W32085468 cites W2113850638 @default.
- W32085468 cites W2114528521 @default.
- W32085468 cites W2118924889 @default.
- W32085468 cites W2121449541 @default.
- W32085468 cites W2121478585 @default.
- W32085468 cites W2124229187 @default.
- W32085468 cites W2124817418 @default.
- W32085468 cites W2125713050 @default.
- W32085468 cites W2128272608 @default.
- W32085468 cites W2130763609 @default.
- W32085468 cites W2132799961 @default.
- W32085468 cites W2133671888 @default.
- W32085468 cites W2135212469 @default.
- W32085468 cites W2135293965 @default.
- W32085468 cites W2136195777 @default.
- W32085468 cites W2136451251 @default.
- W32085468 cites W2137785778 @default.
- W32085468 cites W2139481310 @default.
- W32085468 cites W2139880049 @default.
- W32085468 cites W2141047752 @default.
- W32085468 cites W2143938301 @default.
- W32085468 cites W2145167175 @default.
- W32085468 cites W2154422144 @default.
- W32085468 cites W2155511848 @default.
- W32085468 cites W2157023096 @default.
- W32085468 cites W2158275940 @default.
- W32085468 cites W2158683916 @default.
- W32085468 cites W2159686933 @default.
- W32085468 cites W2163945819 @default.
- W32085468 cites W2164007032 @default.
- W32085468 cites W2164598857 @default.
- W32085468 cites W2168190988 @default.
- W32085468 cites W2217896605 @default.
- W32085468 cites W2304299084 @default.
- W32085468 cites W2402700212 @default.
- W32085468 cites W2740373864 @default.
- W32085468 cites W2751379301 @default.
- W32085468 cites W3038101704 @default.
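The abstract above describes a Bayesian model that maps an observed head orientation to a focus-of-attention target, optionally combined with information about who is speaking. A minimal sketch of that idea, assuming one Gaussian likelihood per target over the head pan angle; the target angles, spreads, priors, and the speaker boost factor below are invented for illustration and are not taken from the thesis:

```python
import math

def gaussian(x, mu, sigma):
    """Gaussian density, used as the likelihood p(pan angle | target)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Hypothetical four-person meeting, seen from one subject: each focus
# target is modelled by the head pan angle (degrees) at which the
# subject would face it. All numbers here are illustrative only.
TARGETS = {
    "person_left":  {"mu": -60.0, "sigma": 15.0, "prior": 0.3},
    "person_front": {"mu":   0.0, "sigma": 15.0, "prior": 0.4},
    "person_right": {"mu":  60.0, "sigma": 15.0, "prior": 0.3},
}

def focus_posterior(pan_angle, speaking=None):
    """Posterior p(target | pan angle) via Bayes' rule.

    If `speaking` names a target, its prior is boosted, mimicking the
    abstract's second cue: people tend to look at whoever is talking.
    """
    unnorm = {}
    for name, t in TARGETS.items():
        prior = t["prior"] * (3.0 if name == speaking else 1.0)
        unnorm[name] = prior * gaussian(pan_angle, t["mu"], t["sigma"])
    z = sum(unnorm.values())
    return {name: v / z for name, v in unnorm.items()}

# Head turned halfway between front and left while the left person talks:
posterior = focus_posterior(-30.0, speaking="person_left")
print(max(posterior, key=posterior.get))
print({k: round(v, 3) for k, v in posterior.items()})
```

Fusing the two cues this way (speaker identity reshaping the prior, head orientation supplying the likelihood) reflects the abstract's finding that combining both modalities beats either one alone, though the thesis trained neural networks for the speaker-based prediction rather than using a fixed boost.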