Matches in SemOpenAlex for { <https://semopenalex.org/work/W4323644429> ?p ?o ?g. }
- W4323644429 endingPage "2118" @default.
- W4323644429 startingPage "2104" @default.
- W4323644429 abstract "Model-based gait recognition methods usually adopt pedestrian walking postures to identify human beings. However, existing methods do not explicitly address the large intra-class variance of human pose caused by changes in camera view. In this paper, we propose a lower-upper generative adversarial network (LUGAN) to generate multi-view pose sequences for each single-view sample, reducing the cross-view variance. Based on the prior of camera imaging, we prove that the spatial coordinates of cross-view poses are related by a linear transformation with a full-rank matrix. Hence, LUGAN employs adversarial training to learn full-rank transformation matrices from the source pose and target view to obtain the target pose sequences. The generator of LUGAN is composed of graph convolutional (GCN) layers, fully connected (FC) layers, and two-branch convolutional (CNN) layers: the GCN and FC layers encode the source pose sequence and target view, the CNN layers take the encoded features as input to learn a lower triangular matrix and an upper triangular matrix, and the transformation matrix is then obtained by multiplying the lower and upper triangular matrices. For adversarial training, we develop a conditional discriminator that distinguishes whether a pose sequence is real or generated. Furthermore, to facilitate high-level correlation learning, we propose a plug-and-play module, named multi-scale hypergraph convolution (HGC), to replace the spatial graph convolutional layer in the baseline, which can simultaneously model joint-level, part-level, and body-level correlations. Extensive experiments on three large gait recognition datasets (i.e., CASIA-B, OUMVLP-Pose and NLPR) demonstrate that our method outperforms the baseline model by a large margin." @default.
- W4323644429 created "2023-03-10" @default.
- W4323644429 creator A5000641290 @default.
- W4323644429 creator A5005345630 @default.
- W4323644429 creator A5023794124 @default.
- W4323644429 creator A5031480448 @default.
- W4323644429 creator A5036464034 @default.
- W4323644429 date "2023-01-01" @default.
- W4323644429 modified "2023-10-12" @default.
- W4323644429 title "Toward Complete-View and High-Level Pose-Based Gait Recognition" @default.
- W4323644429 cites W1564236611 @default.
- W4323644429 cites W1798125731 @default.
- W4323644429 cites W1987971958 @default.
- W4323644429 cites W2051224630 @default.
- W4323644429 cites W2064675550 @default.
- W4323644429 cites W2084111992 @default.
- W4323644429 cites W2115203491 @default.
- W4323644429 cites W2149516292 @default.
- W4323644429 cites W2154171558 @default.
- W4323644429 cites W2522764072 @default.
- W4323644429 cites W2559085405 @default.
- W4323644429 cites W2739325416 @default.
- W4323644429 cites W2765328347 @default.
- W4323644429 cites W2807461033 @default.
- W4323644429 cites W2808260522 @default.
- W4323644429 cites W2892880750 @default.
- W4323644429 cites W2916695504 @default.
- W4323644429 cites W2916798096 @default.
- W4323644429 cites W2945556069 @default.
- W4323644429 cites W2960957522 @default.
- W4323644429 cites W2962793481 @default.
- W4323644429 cites W2963076818 @default.
- W4323644429 cites W2963767194 @default.
- W4323644429 cites W2964186374 @default.
- W4323644429 cites W2972662547 @default.
- W4323644429 cites W2977530922 @default.
- W4323644429 cites W2996878574 @default.
- W4323644429 cites W3014370958 @default.
- W4323644429 cites W3014854053 @default.
- W4323644429 cites W3035050855 @default.
- W4323644429 cites W3035252826 @default.
- W4323644429 cites W3042230461 @default.
- W4323644429 cites W3085990079 @default.
- W4323644429 cites W3092754310 @default.
- W4323644429 cites W3093108399 @default.
- W4323644429 cites W3104352412 @default.
- W4323644429 cites W3127230993 @default.
- W4323644429 cites W3136525061 @default.
- W4323644429 cites W3182137715 @default.
- W4323644429 cites W3185393247 @default.
- W4323644429 cites W3192804777 @default.
- W4323644429 cites W3195852174 @default.
- W4323644429 cites W3201864842 @default.
- W4323644429 cites W3202075110 @default.
- W4323644429 cites W3203640349 @default.
- W4323644429 cites W3205717647 @default.
- W4323644429 cites W4205530192 @default.
- W4323644429 cites W4214538268 @default.
- W4323644429 cites W4226095747 @default.
- W4323644429 cites W4282972825 @default.
- W4323644429 cites W4283796380 @default.
- W4323644429 cites W4292793985 @default.
- W4323644429 cites W4313161900 @default.
- W4323644429 cites W4313419238 @default.
- W4323644429 doi "https://doi.org/10.1109/tifs.2023.3254449" @default.
- W4323644429 hasPublicationYear "2023" @default.
- W4323644429 type Work @default.
- W4323644429 citedByCount "2" @default.
- W4323644429 countsByYear W43236444292023 @default.
- W4323644429 crossrefType "journal-article" @default.
- W4323644429 hasAuthorship W4323644429A5000641290 @default.
- W4323644429 hasAuthorship W4323644429A5005345630 @default.
- W4323644429 hasAuthorship W4323644429A5023794124 @default.
- W4323644429 hasAuthorship W4323644429A5031480448 @default.
- W4323644429 hasAuthorship W4323644429A5036464034 @default.
- W4323644429 hasConcept C121332964 @default.
- W4323644429 hasConcept C132525143 @default.
- W4323644429 hasConcept C153180895 @default.
- W4323644429 hasConcept C154945302 @default.
- W4323644429 hasConcept C165443888 @default.
- W4323644429 hasConcept C2779803651 @default.
- W4323644429 hasConcept C31972630 @default.
- W4323644429 hasConcept C39920418 @default.
- W4323644429 hasConcept C41008148 @default.
- W4323644429 hasConcept C52102323 @default.
- W4323644429 hasConcept C74650414 @default.
- W4323644429 hasConcept C76155785 @default.
- W4323644429 hasConcept C80444323 @default.
- W4323644429 hasConcept C81363708 @default.
- W4323644429 hasConcept C94915269 @default.
- W4323644429 hasConceptScore W4323644429C121332964 @default.
- W4323644429 hasConceptScore W4323644429C132525143 @default.
- W4323644429 hasConceptScore W4323644429C153180895 @default.
- W4323644429 hasConceptScore W4323644429C154945302 @default.
- W4323644429 hasConceptScore W4323644429C165443888 @default.
- W4323644429 hasConceptScore W4323644429C2779803651 @default.
- W4323644429 hasConceptScore W4323644429C31972630 @default.
- W4323644429 hasConceptScore W4323644429C39920418 @default.
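
The abstract's core construction, forming a full-rank view transform as the product of a learned lower and upper triangular matrix, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the triangular factors below are random stand-ins for the factors that LUGAN's CNN branches would predict, and the pose shapes (10 frames, 17 joints) are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def lu_transform(source_pose, dim=3):
    """Apply a full-rank linear transform M = L @ U to 3D pose coordinates.

    source_pose: array of shape (T, J, 3) -- T frames, J joints, 3D coords.
    In LUGAN, L and U would be predicted by two CNN branches from the
    encoded source pose and target view; here they are random stand-ins.
    """
    lower = np.tril(rng.normal(size=(dim, dim)))
    upper = np.triu(rng.normal(size=(dim, dim)))
    # Force nonzero diagonals so det(L) != 0 and det(U) != 0; then
    # det(M) = det(L) * det(U) != 0, i.e. M is guaranteed full rank.
    np.fill_diagonal(lower, np.abs(lower.diagonal()) + 1.0)
    np.fill_diagonal(upper, np.abs(upper.diagonal()) + 1.0)
    M = lower @ upper
    # Transform every joint coordinate; shape is preserved.
    return source_pose @ M.T

pose = rng.normal(size=(10, 17, 3))  # e.g. 10 frames of 17 joints
target = lu_transform(pose)
assert target.shape == pose.shape
```

The diagonal trick is why the lower-upper factorization is convenient: a triangular matrix's determinant is the product of its diagonal entries, so keeping those entries away from zero guarantees the composed transform is invertible, matching the paper's claim that cross-view poses are related by a full-rank linear map.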