Matches in SemOpenAlex for { <https://semopenalex.org/work/W2866912866> ?p ?o ?g. }
- W2866912866 endingPage "2592" @default.
- W2866912866 startingPage "2578" @default.
- W2866912866 abstract "With the rapid increase in the amount of multimedia data, video classification has become a demanding and challenging research topic. Compared with image classification, video classification requires mapping a video that contains hundreds of frames to semantic tags, which poses many challenges to the direct use of advanced models originally designed for image-oriented tasks. On the other hand, continuous frames in a video also give us more visual clues that we can leverage to achieve better classification. One of the most important clues is the context in the spatiotemporal domain. In this paper, we introduce the multiscale deep alternative neural network (DANN), a novel architecture combining the strengths of both convolutional and recurrent neural networks to achieve a deep network that can collect rich context hierarchies for video classification. In particular, the DANN is stacked with alternative layers, each of which consists of a volumetric convolutional layer followed by a recurrent layer. The former acts as a local feature learner, whereas the latter is used to collect contexts. Compared with popular deep feed-forward neural networks, the DANN learns local features and their contexts from the very beginning. This setting enables preserving context evolutions, which we show to be essential for improving the accuracy of video classification. To release the full potential of the DANN, we develop a deeper version with stochastic-layer skip-connections and construct a multiscale DANN to incorporate contexts at different scales. We show how to apply the multiscale DANN for video classification with carefully designed configurations in terms of both input-output settings and training-testing methods. The DANN is shown to be robust not only to human-centric videos but also to natural videos. As there are few large-scale natural disaster video datasets, we construct a new large-scale one and make it publicly available. Experiments on four datasets show the effectiveness of our method for both human actions and natural events." @default.
- W2866912866 created "2018-07-19" @default.
- W2866912866 creator A5017052768 @default.
- W2866912866 creator A5018478553 @default.
- W2866912866 creator A5084797829 @default.
- W2866912866 date "2018-10-01" @default.
- W2866912866 modified "2023-10-16" @default.
- W2866912866 title "Multiscale Deep Alternative Neural Network for Large-Scale Video Classification" @default.
- W2866912866 cites W1871385855 @default.
- W2866912866 cites W1983364832 @default.
- W2866912866 cites W2020163092 @default.
- W2866912866 cites W2027688900 @default.
- W2866912866 cites W2051657003 @default.
- W2866912866 cites W2058001207 @default.
- W2866912866 cites W2068611653 @default.
- W2866912866 cites W2071959927 @default.
- W2866912866 cites W2097947185 @default.
- W2866912866 cites W2109255472 @default.
- W2866912866 cites W2117539524 @default.
- W2866912866 cites W2137542411 @default.
- W2866912866 cites W2150355110 @default.
- W2866912866 cites W2175640409 @default.
- W2866912866 cites W2214422405 @default.
- W2866912866 cites W2235034809 @default.
- W2866912866 cites W2323437298 @default.
- W2866912866 cites W2589335264 @default.
- W2866912866 cites W2761434982 @default.
- W2866912866 cites W2784123366 @default.
- W2866912866 cites W3101203783 @default.
- W2866912866 doi "https://doi.org/10.1109/tmm.2018.2855081" @default.
- W2866912866 hasPublicationYear "2018" @default.
- W2866912866 type Work @default.
- W2866912866 sameAs 2866912866 @default.
- W2866912866 citedByCount "23" @default.
- W2866912866 countsByYear W28669128662019 @default.
- W2866912866 countsByYear W28669128662020 @default.
- W2866912866 countsByYear W28669128662021 @default.
- W2866912866 countsByYear W28669128662023 @default.
- W2866912866 crossrefType "journal-article" @default.
- W2866912866 hasAuthorship W2866912866A5017052768 @default.
- W2866912866 hasAuthorship W2866912866A5018478553 @default.
- W2866912866 hasAuthorship W2866912866A5084797829 @default.
- W2866912866 hasConcept C108583219 @default.
- W2866912866 hasConcept C115961682 @default.
- W2866912866 hasConcept C119857082 @default.
- W2866912866 hasConcept C138885662 @default.
- W2866912866 hasConcept C151730666 @default.
- W2866912866 hasConcept C153083717 @default.
- W2866912866 hasConcept C153180895 @default.
- W2866912866 hasConcept C154945302 @default.
- W2866912866 hasConcept C199360897 @default.
- W2866912866 hasConcept C2776401178 @default.
- W2866912866 hasConcept C2779343474 @default.
- W2866912866 hasConcept C2780801425 @default.
- W2866912866 hasConcept C2984842247 @default.
- W2866912866 hasConcept C41008148 @default.
- W2866912866 hasConcept C41895202 @default.
- W2866912866 hasConcept C50644808 @default.
- W2866912866 hasConcept C75294576 @default.
- W2866912866 hasConcept C81363708 @default.
- W2866912866 hasConcept C86803240 @default.
- W2866912866 hasConceptScore W2866912866C108583219 @default.
- W2866912866 hasConceptScore W2866912866C115961682 @default.
- W2866912866 hasConceptScore W2866912866C119857082 @default.
- W2866912866 hasConceptScore W2866912866C138885662 @default.
- W2866912866 hasConceptScore W2866912866C151730666 @default.
- W2866912866 hasConceptScore W2866912866C153083717 @default.
- W2866912866 hasConceptScore W2866912866C153180895 @default.
- W2866912866 hasConceptScore W2866912866C154945302 @default.
- W2866912866 hasConceptScore W2866912866C199360897 @default.
- W2866912866 hasConceptScore W2866912866C2776401178 @default.
- W2866912866 hasConceptScore W2866912866C2779343474 @default.
- W2866912866 hasConceptScore W2866912866C2780801425 @default.
- W2866912866 hasConceptScore W2866912866C2984842247 @default.
- W2866912866 hasConceptScore W2866912866C41008148 @default.
- W2866912866 hasConceptScore W2866912866C41895202 @default.
- W2866912866 hasConceptScore W2866912866C50644808 @default.
- W2866912866 hasConceptScore W2866912866C75294576 @default.
- W2866912866 hasConceptScore W2866912866C81363708 @default.
- W2866912866 hasConceptScore W2866912866C86803240 @default.
- W2866912866 hasFunder F4320321001 @default.
- W2866912866 hasFunder F4320335790 @default.
- W2866912866 hasIssue "10" @default.
- W2866912866 hasLocation W28669128661 @default.
- W2866912866 hasOpenAccess W2866912866 @default.
- W2866912866 hasPrimaryLocation W28669128661 @default.
- W2866912866 hasRelatedWork W2279398222 @default.
- W2866912866 hasRelatedWork W2766604260 @default.
- W2866912866 hasRelatedWork W2915754718 @default.
- W2866912866 hasRelatedWork W2986507176 @default.
- W2866912866 hasRelatedWork W2996856019 @default.
- W2866912866 hasRelatedWork W3011074480 @default.
- W2866912866 hasRelatedWork W3018421652 @default.
- W2866912866 hasRelatedWork W3160711233 @default.
- W2866912866 hasRelatedWork W4220996320 @default.
- W2866912866 hasRelatedWork W4299822940 @default.
- W2866912866 hasVolume "20" @default.
- W2866912866 isParatext "false" @default.
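The abstract above describes the core building block of the DANN: an "alternative layer" that pairs a volumetric (3D) convolution, acting as a local feature learner, with a recurrent layer that collects temporal context. The toy NumPy sketch below illustrates that pairing only; the function names, the single-channel setup, and the simplified `tanh` recurrence are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def volumetric_conv(clip, kernel):
    # Naive single-channel 3D convolution (valid padding) over a (T, H, W) clip.
    kt, kh, kw = kernel.shape
    T, H, W = clip.shape
    out = np.zeros((T - kt + 1, H - kh + 1, W - kw + 1))
    for t in range(out.shape[0]):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[t, i, j] = np.sum(clip[t:t + kt, i:i + kh, j:j + kw] * kernel)
    return out

def recurrent_context(features):
    # Simplified recurrence over the temporal axis: h_t = tanh(x_t + 0.5 * h_{t-1}),
    # accumulating context frame by frame (standing in for the paper's recurrent layer).
    h = np.zeros_like(features[0])
    states = []
    for x in features:
        h = np.tanh(x + 0.5 * h)
        states.append(h)
    return np.stack(states)

def alternative_layer(clip, kernel):
    # One DANN-style "alternative layer": local feature learner, then context collector.
    return recurrent_context(volumetric_conv(clip, kernel))

rng = np.random.default_rng(0)
clip = rng.standard_normal((8, 16, 16))        # toy video: 8 frames of 16x16 pixels
kernel = rng.standard_normal((3, 3, 3)) * 0.1  # 3x3x3 volumetric kernel
out = alternative_layer(clip, kernel)
print(out.shape)  # (6, 14, 14): valid 3D conv shrinks each axis by kernel_size - 1
```

Stacking several such layers, as the paper describes, would interleave local feature extraction and context collection from the very first layer, in contrast to purely feed-forward stacks.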