Matches in SemOpenAlex for { <https://semopenalex.org/work/W2950554015> ?p ?o ?g. }
Showing items 1 to 69 of 69, with 100 items per page.
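The pattern in the heading above is an ordinary SPARQL quad pattern, so this listing can be reproduced programmatically. Below is a minimal sketch in Python with SPARQLWrapper, assuming the public SemOpenAlex SPARQL endpoint lives at https://semopenalex.org/sparql (the endpoint URL is an assumption worth verifying against the SemOpenAlex documentation):

```python
# Reproduce the { <work> ?p ?o ?g . } listing below via SPARQL.
from SPARQLWrapper import SPARQLWrapper, JSON

# Assumed endpoint URL; adjust if SemOpenAlex publishes a different one.
sparql = SPARQLWrapper("https://semopenalex.org/sparql")
sparql.setQuery("""
    SELECT ?p ?o ?g WHERE {
      GRAPH ?g { <https://semopenalex.org/work/W2950554015> ?p ?o . }
    }
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    # Each result row corresponds to one bullet item in the listing below.
    print(row["p"]["value"], row["o"]["value"], row["g"]["value"])
```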
- W2950554015 abstract "A dyadic interaction is a behavioral exchange between two people. In this thesis, a computer framework is presented that can localize and classify fine-grained dyadic interactions, such as a handshake, a hug, or passing an object from one person to another. In artificial intelligence, tasks like these are commonly referred to as human interaction recognition. Our method can be trained on videos, with accompanying metadata describing the poses of the individuals involved, to automatically recognize dyadic interactions. Instead of focusing on interactions that are visually very different, such as kicking and punching, we look at visually similar interactions, such as shaking hands and passing an object, and give special attention to the fine-grained differences between these types of interactions. The interactions we consider for this thesis usually involve physical contact, but our method is not limited to such interactions. Localizing and classifying fine-grained dyadic interactions is a challenging task. Solving this problem is important because of the many applications it opens up: human interaction recognition plays an important role in surveillance, video search, and automated video captioning. Aside from these applications, successful interaction recognition models can also play an important role in understanding human behavior and studying social development. For many years, automatically finding and labeling interactions has been beyond the capabilities of computer systems and artificial intelligence. In this thesis we introduce two data sets specifically designed for the task of dyadic human interaction detection, and we describe a spatio-temporal model that combines pose and motion features in a graph of deformable body parts. We set up our model by first finding the moment during an interaction that is most representative of its particular class; we call this frame its epitome. Our model is built from the epitome to describe the temporal build-up towards it and the run-out of the interaction afterwards. Over this span of time, we describe the dyadic interaction poses for different limbs using Histograms of Oriented Gradients (HOG) and the accompanying motions using Histograms of Optical Flow (HOF) and Motion Boundary Histograms (MBH). We show that we can train our model from relatively few examples, test its robustness when the amount of available training data is extremely limited, and look at the use of auxiliary images to leverage training in these cases. When tested on unsegmented videos, our framework returns labeled spatio-temporal tubes that precisely cover an interaction. Our experiments show that our models generalize well to different environments. Besides its performance, our formulation is flexible enough to incorporate different features and part configurations, so other interaction classes can easily be trained. Our research shows that there is still room for improvement: most importantly, the temporal extent of the interaction is difficult to estimate precisely with our method, because we train models on the epitome of the interaction, which covers only a small part of it." @default.
- W2950554015 created "2019-06-27" @default.
- W2950554015 creator A5001113391 @default.
- W2950554015 creator A5048594699 @default.
- W2950554015 date "2019-06-05" @default.
- W2950554015 modified "2023-09-27" @default.
- W2950554015 title "Modeling Dyadic Human Interactions : A study of methods for training pose+motion models of fine-grained face-to-face interactions in unsegmented videos" @default.
- W2950554015 hasPublicationYear "2019" @default.
- W2950554015 type Work @default.
- W2950554015 sameAs 2950554015 @default.
- W2950554015 citedByCount "0" @default.
- W2950554015 crossrefType "dissertation" @default.
- W2950554015 hasAuthorship W2950554015A5001113391 @default.
- W2950554015 hasAuthorship W2950554015A5048594699 @default.
- W2950554015 hasConcept C104114177 @default.
- W2950554015 hasConcept C107457646 @default.
- W2950554015 hasConcept C127413603 @default.
- W2950554015 hasConcept C144024400 @default.
- W2950554015 hasConcept C151319957 @default.
- W2950554015 hasConcept C154945302 @default.
- W2950554015 hasConcept C201995342 @default.
- W2950554015 hasConcept C2778000800 @default.
- W2950554015 hasConcept C2779304628 @default.
- W2950554015 hasConcept C2780451532 @default.
- W2950554015 hasConcept C2781238097 @default.
- W2950554015 hasConcept C31258907 @default.
- W2950554015 hasConcept C36289849 @default.
- W2950554015 hasConcept C41008148 @default.
- W2950554015 hasConceptScore W2950554015C104114177 @default.
- W2950554015 hasConceptScore W2950554015C107457646 @default.
- W2950554015 hasConceptScore W2950554015C127413603 @default.
- W2950554015 hasConceptScore W2950554015C144024400 @default.
- W2950554015 hasConceptScore W2950554015C151319957 @default.
- W2950554015 hasConceptScore W2950554015C154945302 @default.
- W2950554015 hasConceptScore W2950554015C201995342 @default.
- W2950554015 hasConceptScore W2950554015C2778000800 @default.
- W2950554015 hasConceptScore W2950554015C2779304628 @default.
- W2950554015 hasConceptScore W2950554015C2780451532 @default.
- W2950554015 hasConceptScore W2950554015C2781238097 @default.
- W2950554015 hasConceptScore W2950554015C31258907 @default.
- W2950554015 hasConceptScore W2950554015C36289849 @default.
- W2950554015 hasConceptScore W2950554015C41008148 @default.
- W2950554015 hasLocation W29505540151 @default.
- W2950554015 hasOpenAccess W2950554015 @default.
- W2950554015 hasPrimaryLocation W29505540151 @default.
- W2950554015 hasRelatedWork W1517303303 @default.
- W2950554015 hasRelatedWork W2205118440 @default.
- W2950554015 hasRelatedWork W2225735676 @default.
- W2950554015 hasRelatedWork W2261636736 @default.
- W2950554015 hasRelatedWork W2295605347 @default.
- W2950554015 hasRelatedWork W2611857641 @default.
- W2950554015 hasRelatedWork W2770604561 @default.
- W2950554015 hasRelatedWork W2890477007 @default.
- W2950554015 hasRelatedWork W2928892019 @default.
- W2950554015 hasRelatedWork W2937322566 @default.
- W2950554015 hasRelatedWork W2944059049 @default.
- W2950554015 hasRelatedWork W2949233999 @default.
- W2950554015 hasRelatedWork W2955233641 @default.
- W2950554015 hasRelatedWork W2973728204 @default.
- W2950554015 hasRelatedWork W3018812558 @default.
- W2950554015 hasRelatedWork W3022491485 @default.
- W2950554015 hasRelatedWork W3046844194 @default.
- W2950554015 hasRelatedWork W3111042471 @default.
- W2950554015 hasRelatedWork W3199628622 @default.
- W2950554015 hasRelatedWork W856339213 @default.
- W2950554015 isParatext "false" @default.
- W2950554015 isRetracted "false" @default.
- W2950554015 magId "2950554015" @default.
- W2950554015 workType "dissertation" @default.
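The abstract quoted in the first item of this listing describes appearance and motion descriptors, namely Histograms of Oriented Gradients (HOG) and Histograms of Optical Flow (HOF), computed over body-part regions. The snippet below is a minimal, generic sketch of such descriptors using OpenCV and scikit-image; it is not the thesis's actual implementation, and every parameter (crop layout, resize size, bin counts) is illustrative only.

```python
# Generic sketch of per-body-part appearance (HOG) and motion (HOF) descriptors.
# MBH, also mentioned in the abstract, would apply HOG to the flow components.
import cv2
import numpy as np
from skimage.feature import hog

def hog_descriptor(frame_gray, box):
    """HOG appearance descriptor for one body-part crop (x, y, w, h)."""
    x, y, w, h = box
    crop = cv2.resize(frame_gray[y:y + h, x:x + w], (64, 64))
    return hog(crop, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

def hof_descriptor(prev_gray, next_gray, box, bins=9):
    """Histogram of optical-flow orientations, weighted by magnitude, for one crop."""
    x, y, w, h = box
    flow = cv2.calcOpticalFlowFarneback(prev_gray[y:y + h, x:x + w],
                                        next_gray[y:y + h, x:x + w],
                                        None, 0.5, 3, 15, 3, 5, 1.2, 0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    hist, _ = np.histogram(ang, bins=bins, range=(0, 2 * np.pi), weights=mag)
    return hist / (hist.sum() + 1e-8)
```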