Matches in SemOpenAlex for { <https://semopenalex.org/work/W2605983441> ?p ?o ?g. }
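The quads listed below correspond to the triple pattern shown in the braces above, with ?g bound to the graph name. A minimal sketch of a query that would return this listing is given here, assuming the public SemOpenAlex SPARQL endpoint (commonly https://semopenalex.org/sparql); the endpoint URL and the ORDER BY clause are assumptions, not part of the listing itself.

    # Minimal sketch, assuming the public SemOpenAlex SPARQL endpoint
    # (e.g., https://semopenalex.org/sparql) accepts this query form.
    # Binds each predicate ?p, object ?o, and named graph ?g for the work.
    SELECT ?p ?o ?g
    WHERE {
      GRAPH ?g {
        <https://semopenalex.org/work/W2605983441> ?p ?o .
      }
    }
    ORDER BY ?p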
- W2605983441 endingPage "982" @default.
- W2605983441 startingPage "970" @default.
- W2605983441 abstract "Video-based face, expression, and scene recognition are fundamental problems in human-machine interaction, especially when only a short video clip is available. In this paper, we present a new derivative sparse representation approach for face and texture recognition from short videos. The approach first builds local linear subspaces of dynamic texture segments by computing spatiotemporal directional derivatives in a cylindrical neighborhood within the dynamic textures. Unlike traditional methods, a nonbinary texture coding technique is proposed to extract high-order derivatives over continuous circular and cylindrical regions, avoiding aliasing effects. These local linear subspaces of texture segments are then mapped onto a Grassmann manifold via sparse representation. A new joint sparse representation algorithm is developed to establish correspondences between subspace points on the manifold and thereby measure the similarity between two dynamic textures. Extensive experiments on the Honda/UCSD, CMU Motion of Body, YouTube, and DynTex datasets show that the proposed method consistently outperforms state-of-the-art methods in dynamic texture recognition and achieves the highest accuracy reported to date on the challenging YouTube face dataset. These results demonstrate the effectiveness of the proposed method for video-based face recognition in human-machine system applications." @default.
- W2605983441 created "2017-04-28" @default.
- W2605983441 creator A5039640213 @default.
- W2605983441 creator A5046099149 @default.
- W2605983441 creator A5054028848 @default.
- W2605983441 creator A5071549690 @default.
- W2605983441 creator A5089986388 @default.
- W2605983441 date "2017-12-01" @default.
- W2605983441 modified "2023-10-03" @default.
- W2605983441 title "Dynamic Texture Comparison Using Derivative Sparse Representation: Application to Video-Based Face Recognition" @default.
- W2605983441 cites W1506778248 @default.
- W2605983441 cites W1607985379 @default.
- W2605983441 cites W1847227265 @default.
- W2605983441 cites W1930406764 @default.
- W2605983441 cites W1964470356 @default.
- W2605983441 cites W1974774078 @default.
- W2605983441 cites W1976566382 @default.
- W2605983441 cites W1996939238 @default.
- W2605983441 cites W1999821839 @default.
- W2605983441 cites W2000771160 @default.
- W2605983441 cites W2019464758 @default.
- W2605983441 cites W2032070205 @default.
- W2605983441 cites W2034821857 @default.
- W2605983441 cites W2045512849 @default.
- W2605983441 cites W2066986622 @default.
- W2605983441 cites W2074054045 @default.
- W2605983441 cites W2079844951 @default.
- W2605983441 cites W2084406642 @default.
- W2605983441 cites W2097826102 @default.
- W2605983441 cites W2099111195 @default.
- W2605983441 cites W2105934661 @default.
- W2605983441 cites W2109786901 @default.
- W2605983441 cites W2114858244 @default.
- W2605983441 cites W2121821935 @default.
- W2605983441 cites W2122691893 @default.
- W2605983441 cites W2125838338 @default.
- W2605983441 cites W2131273085 @default.
- W2605983441 cites W2138451337 @default.
- W2605983441 cites W2139916508 @default.
- W2605983441 cites W2142705980 @default.
- W2605983441 cites W2144002622 @default.
- W2605983441 cites W2145287260 @default.
- W2605983441 cites W2146780613 @default.
- W2605983441 cites W2150600350 @default.
- W2605983441 cites W2152826865 @default.
- W2605983441 cites W2160144769 @default.
- W2605983441 cites W2162374132 @default.
- W2605983441 cites W2163352848 @default.
- W2605983441 cites W2163808566 @default.
- W2605983441 cites W2164696938 @default.
- W2605983441 cites W2165466912 @default.
- W2605983441 cites W2167978264 @default.
- W2605983441 cites W2168745915 @default.
- W2605983441 cites W2170282430 @default.
- W2605983441 cites W2293218418 @default.
- W2605983441 cites W2912155302 @default.
- W2605983441 cites W3097096317 @default.
- W2605983441 cites W4252684946 @default.
- W2605983441 doi "https://doi.org/10.1109/thms.2017.2681425" @default.
- W2605983441 hasPublicationYear "2017" @default.
- W2605983441 type Work @default.
- W2605983441 sameAs 2605983441 @default.
- W2605983441 citedByCount "19" @default.
- W2605983441 countsByYear W26059834412018 @default.
- W2605983441 countsByYear W26059834412019 @default.
- W2605983441 countsByYear W26059834412020 @default.
- W2605983441 countsByYear W26059834412021 @default.
- W2605983441 countsByYear W26059834412023 @default.
- W2605983441 crossrefType "journal-article" @default.
- W2605983441 hasAuthorship W2605983441A5039640213 @default.
- W2605983441 hasAuthorship W2605983441A5046099149 @default.
- W2605983441 hasAuthorship W2605983441A5054028848 @default.
- W2605983441 hasAuthorship W2605983441A5071549690 @default.
- W2605983441 hasAuthorship W2605983441A5089986388 @default.
- W2605983441 hasConcept C115961682 @default.
- W2605983441 hasConcept C12362212 @default.
- W2605983441 hasConcept C124066611 @default.
- W2605983441 hasConcept C144024400 @default.
- W2605983441 hasConcept C153180895 @default.
- W2605983441 hasConcept C154945302 @default.
- W2605983441 hasConcept C17744445 @default.
- W2605983441 hasConcept C199539241 @default.
- W2605983441 hasConcept C2524010 @default.
- W2605983441 hasConcept C2776359362 @default.
- W2605983441 hasConcept C2779304628 @default.
- W2605983441 hasConcept C2781195486 @default.
- W2605983441 hasConcept C31510193 @default.
- W2605983441 hasConcept C31972630 @default.
- W2605983441 hasConcept C32834561 @default.
- W2605983441 hasConcept C33923547 @default.
- W2605983441 hasConcept C36289849 @default.
- W2605983441 hasConcept C41008148 @default.
- W2605983441 hasConcept C94625758 @default.
- W2605983441 hasConceptScore W2605983441C115961682 @default.
- W2605983441 hasConceptScore W2605983441C12362212 @default.
- W2605983441 hasConceptScore W2605983441C124066611 @default.
- W2605983441 hasConceptScore W2605983441C144024400 @default.
- W2605983441 hasConceptScore W2605983441C153180895 @default.