Matches in SemOpenAlex for { <https://semopenalex.org/work/W2169781073> ?p ?o ?g. }
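The listing below can be reproduced programmatically. The following is a minimal sketch, assuming the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql (speaking standard SPARQL 1.1 over HTTP); the quad pattern { <work> ?p ?o ?g } maps to a GRAPH query:

```python
# Minimal sketch: fetch all { ?p ?o ?g } matches for the work from the
# (assumed) SemOpenAlex SPARQL endpoint and print them as triples per graph.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint
QUERY = """
SELECT ?p ?o ?g
WHERE {
  GRAPH ?g { <https://semopenalex.org/work/W2169781073> ?p ?o . }
}
"""

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"], row["g"]["value"])
```

A second sketch, placed after the listing, illustrates the pixel-level mixture-of-Gaussians stage described in the abstract below.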
- W2169781073 abstract "Understanding human behavior in video data is essential in numerous applications including smart surveillance, video annotation/retrieval, and human-computer interaction. Recognizing human interactions is a challenging task due to ambiguity in body articulation, mutual occlusion, and shadows. Past research has focused on coarse-level recognition of human interactions or on the recognition of a specific gesture of a single body part. Our objective is to develop methods that recognize human actions and interactions at a detailed level. The focus of this research is a framework for recognizing human actions and interactions in color video. This dissertation presents a hierarchical graphical model that unifies multiple levels of processing in video computing. The video (a color image sequence) is processed at four levels: pixel level, blob level, object level, and event level. A mixture-of-Gaussians (MOG) model is used at the pixel level to train on and classify individual pixel colors. Relaxation labeling with an attributed relational graph (ARG) is used at the blob level to merge pixels into coherent blobs and to register inter-blob relations. At the object level, the poses of individual body parts, including the head, torso, arms, and legs, are recognized using individual Bayesian networks (BNs), which are then integrated to obtain an overall body pose. At the event level, the actions of a single person are modeled using a dynamic Bayesian network (DBN) with temporal links between identical nodes of the Bayesian network at times t and t + 1. At this event level, the object-level descriptions for each person are juxtaposed along a common timeline to identify an interaction between two persons. The linguistic ‘verb argument structure’ is used to represent human action in terms of 〈agent-motion-target〉 triplets. Spatial and temporal constraints are used in a decision tree to recognize specific interactions. A meaningful semantic description in terms of 〈subject-verb-object〉 is obtained. Our method provides a user-friendly natural-language description of various human actions and interactions using event semantics. Our system correctly recognizes various human actions involving motions of the torso, arms, and/or legs, and it produces semantic descriptions of positive, neutral, and negative interactions between two persons: hand-shaking, standing hand-in-hand, and hugging (positive); approaching, departing, and pointing (neutral); and pushing, punching, and kicking (negative)." @default.
- W2169781073 created "2016-06-24" @default.
- W2169781073 creator A5005576710 @default.
- W2169781073 creator A5040124787 @default.
- W2169781073 date "2004-01-01" @default.
- W2169781073 modified "2023-09-23" @default.
- W2169781073 title "A hierarchical graphical model for recognizing human actions and interactions in video" @default.
- W2169781073 cites W149568282 @default.
- W2169781073 cites W1499877760 @default.
- W2169781073 cites W1501400124 @default.
- W2169781073 cites W1501551412 @default.
- W2169781073 cites W1512514296 @default.
- W2169781073 cites W1513861746 @default.
- W2169781073 cites W1515084886 @default.
- W2169781073 cites W1563968658 @default.
- W2169781073 cites W1572849274 @default.
- W2169781073 cites W1576838869 @default.
- W2169781073 cites W1584827144 @default.
- W2169781073 cites W1601567445 @default.
- W2169781073 cites W1651266332 @default.
- W2169781073 cites W1718093723 @default.
- W2169781073 cites W1729060384 @default.
- W2169781073 cites W1820019698 @default.
- W2169781073 cites W1821570920 @default.
- W2169781073 cites W185674804 @default.
- W2169781073 cites W1912392457 @default.
- W2169781073 cites W1972885239 @default.
- W2169781073 cites W1977545325 @default.
- W2169781073 cites W1979923382 @default.
- W2169781073 cites W1983418177 @default.
- W2169781073 cites W1986517740 @default.
- W2169781073 cites W1986912999 @default.
- W2169781073 cites W1987485793 @default.
- W2169781073 cites W1991133427 @default.
- W2169781073 cites W1995757792 @default.
- W2169781073 cites W2014051414 @default.
- W2169781073 cites W2016418141 @default.
- W2169781073 cites W2021057537 @default.
- W2169781073 cites W2025299767 @default.
- W2169781073 cites W2026720449 @default.
- W2169781073 cites W2030989822 @default.
- W2169781073 cites W2034829187 @default.
- W2169781073 cites W2039306215 @default.
- W2169781073 cites W2043053642 @default.
- W2169781073 cites W2051066766 @default.
- W2169781073 cites W2052670720 @default.
- W2169781073 cites W2057245760 @default.
- W2169781073 cites W2083804944 @default.
- W2169781073 cites W2090110089 @default.
- W2169781073 cites W2100115174 @default.
- W2169781073 cites W2100214212 @default.
- W2169781073 cites W2100804244 @default.
- W2169781073 cites W2102188949 @default.
- W2169781073 cites W2105238356 @default.
- W2169781073 cites W2105934661 @default.
- W2169781073 cites W2108337280 @default.
- W2169781073 cites W2110575115 @default.
- W2169781073 cites W2110874157 @default.
- W2169781073 cites W2113122905 @default.
- W2169781073 cites W2113856781 @default.
- W2169781073 cites W2114701396 @default.
- W2169781073 cites W2115213191 @default.
- W2169781073 cites W2118440878 @default.
- W2169781073 cites W2119061688 @default.
- W2169781073 cites W2121899951 @default.
- W2169781073 cites W2124217265 @default.
- W2169781073 cites W2125838338 @default.
- W2169781073 cites W2126030799 @default.
- W2169781073 cites W2126412899 @default.
- W2169781073 cites W2133677025 @default.
- W2169781073 cites W2135192052 @default.
- W2169781073 cites W2140487300 @default.
- W2169781073 cites W2144126588 @default.
- W2169781073 cites W2153968793 @default.
- W2169781073 cites W2159080219 @default.
- W2169781073 cites W2159082724 @default.
- W2169781073 cites W2160517719 @default.
- W2169781073 cites W2161315665 @default.
- W2169781073 cites W2161406034 @default.
- W2169781073 cites W2163661041 @default.
- W2169781073 cites W2164295580 @default.
- W2169781073 cites W2165874398 @default.
- W2169781073 cites W2169805580 @default.
- W2169781073 cites W2171060431 @default.
- W2169781073 cites W2293560592 @default.
- W2169781073 cites W2308663 @default.
- W2169781073 cites W2338905098 @default.
- W2169781073 cites W2471625670 @default.
- W2169781073 cites W2537262416 @default.
- W2169781073 cites W25917992 @default.
- W2169781073 cites W2751023760 @default.
- W2169781073 cites W3173880854 @default.
- W2169781073 cites W164048988 @default.
- W2169781073 cites W2110411756 @default.
- W2169781073 cites W2156394753 @default.
- W2169781073 cites W30289623 @default.
- W2169781073 hasPublicationYear "2004" @default.
- W2169781073 type Work @default.
- W2169781073 sameAs 2169781073 @default.
- W2169781073 citedByCount "3" @default.
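The abstract above outlines a four-level pipeline; its most self-contained step is the pixel-level mixture-of-Gaussians color classification. The sketch below is purely illustrative and is not the dissertation's implementation: it assumes scikit-learn's GaussianMixture, and the function names and synthetic training samples are hypothetical.

```python
# Illustrative sketch of MOG pixel-color classification (not the
# dissertation's code): fit one Gaussian mixture per class on RGB
# samples, then label new pixels by maximum class log-likelihood.
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_pixel_model(samples: np.ndarray, n_components: int = 3) -> GaussianMixture:
    """Fit a mixture of Gaussians to (N, 3) RGB training pixels."""
    gmm = GaussianMixture(n_components=n_components, covariance_type="full")
    gmm.fit(samples)
    return gmm

# Hypothetical training data: skin-colored vs. background pixels.
rng = np.random.default_rng(0)
skin = rng.normal([200, 150, 130], 15, size=(500, 3))
background = rng.normal([80, 90, 100], 25, size=(500, 3))

models = {"skin": fit_pixel_model(skin), "background": fit_pixel_model(background)}

def classify_pixels(pixels: np.ndarray) -> list[str]:
    """Assign each (N, 3) pixel the class whose MOG scores it highest."""
    scores = np.stack([m.score_samples(pixels) for m in models.values()])
    labels = list(models.keys())
    return [labels[i] for i in scores.argmax(axis=0)]

print(classify_pixels(np.array([[205.0, 145.0, 125.0], [75.0, 95.0, 105.0]])))
```

Fitting one mixture per class and comparing log-likelihoods is one common way to realize per-pixel color classification; the dissertation's actual features, component counts, and training scheme may differ.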