Matches in SemOpenAlex for { <https://semopenalex.org/work/W2601350280> ?p ?o ?g. }
Showing items 1 to 55 of 55, with 100 items per page.
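For reference, the listing below could be retrieved programmatically. The following is a minimal sketch, assuming SemOpenAlex exposes a public SPARQL endpoint at https://semopenalex.org/sparql (the endpoint URL and result format are assumptions to verify); it runs the same quad pattern as in the header above and prints each predicate/object pair.

```python
import requests

# Assumed endpoint URL; verify against the SemOpenAlex documentation.
ENDPOINT = "https://semopenalex.org/sparql"

# Same pattern as in the header above, with the graph variable expressed
# via a standard SPARQL GRAPH clause.
QUERY = """
SELECT ?p ?o ?g WHERE {
  GRAPH ?g { <https://semopenalex.org/work/W2601350280> ?p ?o . }
}
LIMIT 100
"""

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"])
```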
- W2601350280 abstract "Early human action detection is an important computer vision task with a wide spectrum of potential applications. Most existing methods deal with the detection of an action after its completion. Contrarily, for early detection it is essential to detect an action as early as possible. Therefore, this thesis develops a solution to detect an ongoing human action as soon as it begins, but before it finishes. In order to perform early human action detection, the conventional classification problem is modified into frame-by-frame level classification. There exist well-known classifiers such as Support Vector Machines (SVM), K-nearest Neighbour (KNN), etc. to perform action classification. However, the employability of these algorithms depends on the desired application and its requirements. Therefore, selection of the classifier to employ for the classification task is an important issue to be taken into account. The first part of the thesis studies this problem, and the fuzzy Bandler-Kohout (BK) sub-triangle product (subproduct) is employed as a classifier. The performance is tested for human action recognition and scene classification. This is a crucial step as it is the first attempt of using the fuzzy BK subproduct for classification. The second part of this thesis studies the problem of early human action detection. The method proposed is based on the fuzzy BK subproduct inference mechanism and utilizes the fuzzy capabilities in handling the uncertainties that exist in the real world for reliable decision making. The fuzzy membership function generated frame-by-frame from the fuzzy BK subproduct provides the basis to detect an action before it is completed, when a certain threshold is attained in a suitable way. In order to test the effectiveness of the proposed framework, a set of experiments is performed for a few action sequences, where the detector is able to recognize an action upon seeing approximately 32% of the frames. Finally, the proposed method is analyzed from a broader perspective and a hybrid technique for early anticipation of human action is proposed. It combines the benefits of computer vision and fuzzy set theory based on the fuzzy BK subproduct. The novelty lies in the construction of a frame-by-frame membership function for each kind of possible movement, taking into account several human actions from a publicly available dataset. Furthermore, the impact of various fuzzy implication operators and inference structures in retrieving the relationship between the human subject and the actions performed is discussed. The existing fuzzy implication operators are capable of handling only two-dimensional data. A third dimension, time, plays a crucial role in human action recognition to model how human movement changes over time. Therefore, a new space-time fuzzy implication operator is introduced, by modifying the existing implication operators to accommodate time as an added dimension. Empirically, the proposed hybrid technique is efficiently able to detect an action before completion and outperforms the conventional solutions with a good detection rate. The detector is able to identify an action upon viewing approximately 23% of the frames on average." @default. (A minimal illustrative sketch of the BK subproduct and frame-level thresholding follows the listing below.)
- W2601350280 created "2017-04-07" @default.
- W2601350280 creator A5002543881 @default.
- W2601350280 date "2016-01-01" @default.
- W2601350280 modified "2023-09-24" @default.
- W2601350280 title "A fuzzy approach for early human action detection / Ekta Vats" @default.
- W2601350280 hasPublicationYear "2016" @default.
- W2601350280 type Work @default.
- W2601350280 sameAs 2601350280 @default.
- W2601350280 citedByCount "0" @default.
- W2601350280 crossrefType "dissertation" @default.
- W2601350280 hasAuthorship W2601350280A5002543881 @default.
- W2601350280 hasConcept C119857082 @default.
- W2601350280 hasConcept C12267149 @default.
- W2601350280 hasConcept C124101348 @default.
- W2601350280 hasConcept C153180895 @default.
- W2601350280 hasConcept C154945302 @default.
- W2601350280 hasConcept C41008148 @default.
- W2601350280 hasConcept C58166 @default.
- W2601350280 hasConcept C95623464 @default.
- W2601350280 hasConceptScore W2601350280C119857082 @default.
- W2601350280 hasConceptScore W2601350280C12267149 @default.
- W2601350280 hasConceptScore W2601350280C124101348 @default.
- W2601350280 hasConceptScore W2601350280C153180895 @default.
- W2601350280 hasConceptScore W2601350280C154945302 @default.
- W2601350280 hasConceptScore W2601350280C41008148 @default.
- W2601350280 hasConceptScore W2601350280C58166 @default.
- W2601350280 hasConceptScore W2601350280C95623464 @default.
- W2601350280 hasLocation W26013502801 @default.
- W2601350280 hasOpenAccess W2601350280 @default.
- W2601350280 hasPrimaryLocation W26013502801 @default.
- W2601350280 hasRelatedWork W1095282497 @default.
- W2601350280 hasRelatedWork W142631372 @default.
- W2601350280 hasRelatedWork W1485754829 @default.
- W2601350280 hasRelatedWork W1494162013 @default.
- W2601350280 hasRelatedWork W1505678310 @default.
- W2601350280 hasRelatedWork W1538992051 @default.
- W2601350280 hasRelatedWork W1559106582 @default.
- W2601350280 hasRelatedWork W2000921164 @default.
- W2601350280 hasRelatedWork W2049822318 @default.
- W2601350280 hasRelatedWork W2071935478 @default.
- W2601350280 hasRelatedWork W2073285221 @default.
- W2601350280 hasRelatedWork W2096636765 @default.
- W2601350280 hasRelatedWork W2097921769 @default.
- W2601350280 hasRelatedWork W2110372549 @default.
- W2601350280 hasRelatedWork W2110630940 @default.
- W2601350280 hasRelatedWork W2111736919 @default.
- W2601350280 hasRelatedWork W2126848093 @default.
- W2601350280 hasRelatedWork W2180314839 @default.
- W2601350280 hasRelatedWork W3179836678 @default.
- W2601350280 hasRelatedWork W66802997 @default.
- W2601350280 isParatext "false" @default.
- W2601350280 isRetracted "false" @default.
- W2601350280 magId "2601350280" @default.
- W2601350280 workType "dissertation" @default.
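The abstract above outlines the core mechanism: a fuzzy membership score is produced frame by frame via the Bandler-Kohout subproduct, and an action is declared as soon as the score crosses a threshold. The sketch below illustrates that general idea only; it is not the thesis implementation, and the Łukasiewicz implication and the 0.8 threshold are illustrative assumptions.

```python
import numpy as np

def lukasiewicz_implication(a, b):
    # Lukasiewicz fuzzy implication: I(a, b) = min(1, 1 - a + b).
    return np.minimum(1.0, 1.0 - a + b)

def bk_subproduct(R, S, implication=lukasiewicz_implication):
    # Bandler-Kohout subproduct: (R <| S)(x, z) = inf_y I(R(x, y), S(y, z)).
    # R has shape (n_x, n_y), S has shape (n_y, n_z); result has shape (n_x, n_z).
    n_x, n_y = R.shape
    n_y2, n_z = S.shape
    assert n_y == n_y2, "inner dimensions of R and S must match"
    out = np.empty((n_x, n_z))
    for i in range(n_x):
        for k in range(n_z):
            out[i, k] = implication(R[i, :], S[:, k]).min()
    return out

def detect_early(frame_memberships, threshold=0.8):
    # frame_memberships: array of shape (n_frames, n_actions) holding the
    # per-frame membership score of each candidate action. Returns the
    # (frame index, action index) at the first frame where any action's
    # membership reaches the threshold, or (None, None) if none does.
    for t, scores in enumerate(frame_memberships):
        if scores.max() >= threshold:
            return t, int(scores.argmax())
    return None, None
```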