Matches in SemOpenAlex for { <https://semopenalex.org/work/W2180092181> ?p ?o ?g. }
- W2180092181 abstract "We propose an approach to learn spatio-temporal features in videos from intermediate visual representations we call percepts using Gated-Recurrent-Unit Recurrent Networks (GRUs). Our method relies on percepts that are extracted from all levels of a deep convolutional network trained on the large ImageNet dataset. While high-level percepts contain highly discriminative information, they tend to have a low spatial resolution. Low-level percepts, on the other hand, preserve a higher spatial resolution from which we can model finer motion patterns. Using low-level percepts can lead to high-dimensional video representations. To mitigate this effect and control the number of model parameters, we introduce a variant of the GRU model that leverages convolution operations to enforce sparse connectivity of the model units and share parameters across the input spatial locations. We empirically validate our approach on both Human Action Recognition and Video Captioning tasks. In particular, we achieve results equivalent to the state-of-the-art on the YouTube2Text dataset using a simpler text-decoder model and without extra 3D CNN features." @default.
- W2180092181 created "2016-06-24" @default.
- W2180092181 creator A5020949970 @default.
- W2180092181 creator A5031185465 @default.
- W2180092181 creator A5051020422 @default.
- W2180092181 creator A5057065873 @default.
- W2180092181 date "2015-11-19" @default.
- W2180092181 modified "2023-10-01" @default.
- W2180092181 title "Delving Deeper into Convolutional Networks for Learning Video Representations" @default.
- W2180092181 cites W1522301498 @default.
- W2180092181 cites W1606347560 @default.
- W2180092181 cites W1686810756 @default.
- W2180092181 cites W1924770834 @default.
- W2180092181 cites W1947481528 @default.
- W2180092181 cites W2064675550 @default.
- W2180092181 cites W2097998348 @default.
- W2180092181 cites W2101105183 @default.
- W2180092181 cites W2102113734 @default.
- W2180092181 cites W2112796928 @default.
- W2180092181 cites W2126574503 @default.
- W2180092181 cites W2133459682 @default.
- W2180092181 cites W2164290393 @default.
- W2180092181 cites W2251353663 @default.
- W2180092181 cites W2273041409 @default.
- W2180092181 cites W2308045930 @default.
- W2180092181 cites W24089286 @default.
- W2180092181 cites W2950179405 @default.
- W2180092181 cites W2950307714 @default.
- W2180092181 cites W2950635152 @default.
- W2180092181 cites W2951183276 @default.
- W2180092181 cites W2951893483 @default.
- W2180092181 cites W2952186347 @default.
- W2180092181 cites W2952453038 @default.
- W2180092181 cites W2952574180 @default.
- W2180092181 cites W2952633803 @default.
- W2180092181 cites W2953111739 @default.
- W2180092181 cites W2953118818 @default.
- W2180092181 cites W2964308564 @default.
- W2180092181 cites W6908809 @default.
- W2180092181 cites W787785461 @default.
- W2180092181 hasPublicationYear "2015" @default.
- W2180092181 type Work @default.
- W2180092181 sameAs 2180092181 @default.
- W2180092181 citedByCount "54" @default.
- W2180092181 countsByYear W21800921812016 @default.
- W2180092181 countsByYear W21800921812017 @default.
- W2180092181 countsByYear W21800921812018 @default.
- W2180092181 countsByYear W21800921812019 @default.
- W2180092181 countsByYear W21800921812020 @default.
- W2180092181 countsByYear W21800921812021 @default.
- W2180092181 crossrefType "posted-content" @default.
- W2180092181 hasAuthorship W2180092181A5020949970 @default.
- W2180092181 hasAuthorship W2180092181A5031185465 @default.
- W2180092181 hasAuthorship W2180092181A5051020422 @default.
- W2180092181 hasAuthorship W2180092181A5057065873 @default.
- W2180092181 hasConcept C111030470 @default.
- W2180092181 hasConcept C115961682 @default.
- W2180092181 hasConcept C153180895 @default.
- W2180092181 hasConcept C154945302 @default.
- W2180092181 hasConcept C157657479 @default.
- W2180092181 hasConcept C2777212361 @default.
- W2180092181 hasConcept C2987834672 @default.
- W2180092181 hasConcept C41008148 @default.
- W2180092181 hasConcept C45347329 @default.
- W2180092181 hasConcept C50644808 @default.
- W2180092181 hasConcept C81363708 @default.
- W2180092181 hasConcept C97931131 @default.
- W2180092181 hasConceptScore W2180092181C111030470 @default.
- W2180092181 hasConceptScore W2180092181C115961682 @default.
- W2180092181 hasConceptScore W2180092181C153180895 @default.
- W2180092181 hasConceptScore W2180092181C154945302 @default.
- W2180092181 hasConceptScore W2180092181C157657479 @default.
- W2180092181 hasConceptScore W2180092181C2777212361 @default.
- W2180092181 hasConceptScore W2180092181C2987834672 @default.
- W2180092181 hasConceptScore W2180092181C41008148 @default.
- W2180092181 hasConceptScore W2180092181C45347329 @default.
- W2180092181 hasConceptScore W2180092181C50644808 @default.
- W2180092181 hasConceptScore W2180092181C81363708 @default.
- W2180092181 hasConceptScore W2180092181C97931131 @default.
- W2180092181 hasLocation W21800921811 @default.
- W2180092181 hasOpenAccess W2180092181 @default.
- W2180092181 hasPrimaryLocation W21800921811 @default.
- W2180092181 hasRelatedWork W1522301498 @default.
- W2180092181 hasRelatedWork W1522734439 @default.
- W2180092181 hasRelatedWork W1686810756 @default.
- W2180092181 hasRelatedWork W1901129140 @default.
- W2180092181 hasRelatedWork W1924770834 @default.
- W2180092181 hasRelatedWork W1947481528 @default.
- W2180092181 hasRelatedWork W2064675550 @default.
- W2180092181 hasRelatedWork W2097117768 @default.
- W2180092181 hasRelatedWork W2101105183 @default.
- W2180092181 hasRelatedWork W2102605133 @default.
- W2180092181 hasRelatedWork W2117539524 @default.
- W2180092181 hasRelatedWork W2157331557 @default.
- W2180092181 hasRelatedWork W2163605009 @default.
- W2180092181 hasRelatedWork W2164290393 @default.
- W2180092181 hasRelatedWork W2194775991 @default.
- W2180092181 hasRelatedWork W2308045930 @default.
- W2180092181 hasRelatedWork W24089286 @default.
- W2180092181 hasRelatedWork W2949117887 @default.
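The abstract above describes a GRU variant that replaces the fully connected gate transformations with convolutions, sharing parameters across spatial locations of the input percepts. A minimal NumPy sketch of one such convolutional GRU update is given below; the single-channel input, 3x3 kernels, and function names are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def conv2d_same(x, k):
    # Naive single-channel 2-D "same" convolution via zero padding.
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def conv_gru_step(x, h, params):
    # One ConvGRU update: gates use convolutions instead of dense
    # matrix products, so connectivity is sparse and parameters are
    # shared across all spatial locations of the percept.
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(conv2d_same(x, Wz) + conv2d_same(h, Uz))        # update gate
    r = sigmoid(conv2d_same(x, Wr) + conv2d_same(h, Ur))        # reset gate
    h_tilde = np.tanh(conv2d_same(x, Wh) + conv2d_same(r * h, Uh))
    return (1 - z) * h + z * h_tilde

rng = np.random.default_rng(0)
params = [rng.normal(scale=0.1, size=(3, 3)) for _ in range(6)]
x = rng.normal(size=(8, 8))   # one hypothetical single-channel percept frame
h = np.zeros((8, 8))          # hidden state keeps the spatial layout
h = conv_gru_step(x, h, params)
print(h.shape)
```

Because the gates are convolutions, the hidden state retains the percept's spatial resolution at every step, which is what lets the model capture finer motion patterns from low-level percepts without the parameter blow-up a dense GRU would incur.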