Matches in SemOpenAlex for { <https://semopenalex.org/work/W3122048951> ?p ?o ?g. }
- W3122048951 endingPage "1" @default.
- W3122048951 startingPage "1" @default.
- W3122048951 abstract "Dynamic vision sensors (event cameras) have recently been introduced to solve a variety of vision tasks such as object recognition, activity recognition, and tracking. Compared with traditional RGB sensors, event cameras offer many unique advantages, such as ultra-low resource consumption, high temporal resolution, and a much larger dynamic range. However, these cameras produce only noisy, asynchronous events of intensity changes, i.e., event-streams rather than frames, to which conventional computer vision algorithms cannot be directly applied. In our opinion, the key challenge in improving the performance of event cameras on vision tasks is finding appropriate representations of the event-streams so that cutting-edge learning approaches can be applied to fully uncover the spatio-temporal information they contain. In this paper, we focus on the event-based human gait identification task and investigate possible representations of the event-streams when deep neural networks are used as the classifier. We propose new event-based gait recognition approaches based on two different representations of the event-stream, i.e., graph and image-like representations, and use a graph-based convolutional network (GCN) and convolutional neural networks (CNN), respectively, to recognize gait from the event-streams. The two approaches are termed EV-Gait-3DGraph and EV-Gait-IMG. To evaluate the performance of the proposed approaches, we collect two event-based gait datasets, one from real-world experiments and the other by converting the publicly available RGB gait recognition benchmark CASIA-B. Extensive experiments show that EV-Gait-3DGraph achieves significantly higher recognition accuracy than competing methods when sufficient training samples are available. However, EV-Gait-IMG converges more quickly than the graph-based approach during training and shows good accuracy with only a few training samples (fewer than ten). Thus, the image-like representation is preferable when the amount of training data is limited." @default.
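The abstract above mentions an "image-like representation" of an event-stream. As a rough illustration of that general idea (not the paper's exact EV-Gait-IMG encoding, which is not specified here), a minimal sketch is to accumulate asynchronous (x, y, polarity) events into per-polarity count channels of a fixed-size image, which a CNN can then consume; the function name and event layout below are illustrative assumptions:

```python
import numpy as np

def events_to_image(events, height, width):
    """Accumulate (x, y, polarity) events into a 2-channel count image.

    Illustrative sketch only: real event-camera encodings often also use
    timestamps (e.g., time surfaces), which are omitted here for brevity.
    """
    img = np.zeros((2, height, width), dtype=np.float32)
    for x, y, p in events:
        # channel 0 counts OFF (polarity 0) events, channel 1 counts ON events
        img[int(p), y, x] += 1.0
    return img

# Tiny synthetic event stream: two ON events at (1, 2), one OFF event at (3, 0).
events = [(1, 2, 1), (1, 2, 1), (3, 0, 0)]
img = events_to_image(events, height=4, width=4)
# img[1, 2, 1] → 2.0, img[0, 0, 3] → 1.0
```

Such a dense tensor can be fed to a standard CNN, whereas a graph representation would instead keep each event as a node with spatio-temporal edges.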
- W3122048951 created "2021-02-01" @default.
- W3122048951 creator A5013564110 @default.
- W3122048951 creator A5018910040 @default.
- W3122048951 creator A5031229914 @default.
- W3122048951 creator A5050600910 @default.
- W3122048951 creator A5053684989 @default.
- W3122048951 creator A5074999356 @default.
- W3122048951 creator A5081507644 @default.
- W3122048951 date "2021-01-01" @default.
- W3122048951 modified "2023-10-15" @default.
- W3122048951 title "Event-Stream Representation for Human Gaits Identification Using Deep Neural Networks" @default.
- W3122048951 cites W1487191044 @default.
- W3122048951 cites W1536680647 @default.
- W3122048951 cites W1971239273 @default.
- W3122048951 cites W1974775426 @default.
- W3122048951 cites W1980178290 @default.
- W3122048951 cites W1998548144 @default.
- W3122048951 cites W2016327746 @default.
- W3122048951 cites W2016574277 @default.
- W3122048951 cites W2018904782 @default.
- W3122048951 cites W2070304943 @default.
- W3122048951 cites W2091767348 @default.
- W3122048951 cites W2101491865 @default.
- W3122048951 cites W2126680226 @default.
- W3122048951 cites W2135541996 @default.
- W3122048951 cites W2137281791 @default.
- W3122048951 cites W2137474370 @default.
- W3122048951 cites W2139906443 @default.
- W3122048951 cites W2148568366 @default.
- W3122048951 cites W2149516292 @default.
- W3122048951 cites W2154624311 @default.
- W3122048951 cites W2194775991 @default.
- W3122048951 cites W2295370754 @default.
- W3122048951 cites W2322772590 @default.
- W3122048951 cites W2469278928 @default.
- W3122048951 cites W2507540959 @default.
- W3122048951 cites W2510190030 @default.
- W3122048951 cites W2517225990 @default.
- W3122048951 cites W2519559998 @default.
- W3122048951 cites W2530906228 @default.
- W3122048951 cites W2542803194 @default.
- W3122048951 cites W2558460151 @default.
- W3122048951 cites W2561528016 @default.
- W3122048951 cites W2606202972 @default.
- W3122048951 cites W2745933219 @default.
- W3122048951 cites W2766172911 @default.
- W3122048951 cites W2768308213 @default.
- W3122048951 cites W2769039400 @default.
- W3122048951 cites W2776622059 @default.
- W3122048951 cites W2788172931 @default.
- W3122048951 cites W2947930228 @default.
- W3122048951 cites W2948246283 @default.
- W3122048951 cites W2963510238 @default.
- W3122048951 cites W2979750740 @default.
- W3122048951 cites W2979969178 @default.
- W3122048951 cites W2981539886 @default.
- W3122048951 cites W2987391422 @default.
- W3122048951 cites W3034681945 @default.
- W3122048951 cites W3034739212 @default.
- W3122048951 cites W4210257598 @default.
- W3122048951 cites W4253803843 @default.
- W3122048951 doi "https://doi.org/10.1109/tpami.2021.3054886" @default.
- W3122048951 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/33502972" @default.
- W3122048951 hasPublicationYear "2021" @default.
- W3122048951 type Work @default.
- W3122048951 sameAs 3122048951 @default.
- W3122048951 citedByCount "14" @default.
- W3122048951 countsByYear W31220489512021 @default.
- W3122048951 countsByYear W31220489512022 @default.
- W3122048951 countsByYear W31220489512023 @default.
- W3122048951 crossrefType "journal-article" @default.
- W3122048951 hasAuthorship W3122048951A5013564110 @default.
- W3122048951 hasAuthorship W3122048951A5018910040 @default.
- W3122048951 hasAuthorship W3122048951A5031229914 @default.
- W3122048951 hasAuthorship W3122048951A5050600910 @default.
- W3122048951 hasAuthorship W3122048951A5053684989 @default.
- W3122048951 hasAuthorship W3122048951A5074999356 @default.
- W3122048951 hasAuthorship W3122048951A5081507644 @default.
- W3122048951 hasConcept C108583219 @default.
- W3122048951 hasConcept C121332964 @default.
- W3122048951 hasConcept C132525143 @default.
- W3122048951 hasConcept C13280743 @default.
- W3122048951 hasConcept C151800584 @default.
- W3122048951 hasConcept C153180895 @default.
- W3122048951 hasConcept C154945302 @default.
- W3122048951 hasConcept C185798385 @default.
- W3122048951 hasConcept C205649164 @default.
- W3122048951 hasConcept C2779662365 @default.
- W3122048951 hasConcept C31972630 @default.
- W3122048951 hasConcept C41008148 @default.
- W3122048951 hasConcept C42407357 @default.
- W3122048951 hasConcept C62520636 @default.
- W3122048951 hasConcept C80444323 @default.
- W3122048951 hasConcept C81363708 @default.
- W3122048951 hasConcept C82990744 @default.
- W3122048951 hasConcept C86803240 @default.
- W3122048951 hasConcept C95623464 @default.