Matches in SemOpenAlex for { <https://semopenalex.org/work/W4221029930> ?p ?o ?g. }
Showing items 1 to 89 of 89, with 100 items per page.
- W4221029930 endingPage "103408" @default.
- W4221029930 startingPage "103408" @default.
- W4221029930 abstract "Fine-grained image classification is challenging because categories can only be distinguished by subtle, local differences. Existing weakly supervised fine-grained image classification methods usually extract discriminative regions directly from the high-level feature maps. We observe that the stacking of local receptive fields in a convolutional neural network causes the discriminative regions to spread across the high-level feature maps, which can lead to inaccurate region localization. In this paper, we propose an end-to-end Two-Level Attention Activation Model (TL-AAM), which solves the problem of discriminative region spreading and obtains more effective fine-grained features. Specifically, the TL-AAM consists of: (1) an object attention activation module (OAAM), which links the correct classification score with object region localization through gradient reflow to localize the object region accurately in a mutually reinforcing way, (2) a multi-scale pyramid attention localization module (MPALM), which locates local feature regions by selecting the regions with the largest response values in the feature channels, allowing the detailed features of a local region to be obtained accurately, and (3) a local cross-channel attention module (LCAM), which filters irrelevant information from the high-level semantic feature maps by giving higher weights to the feature channels with high response values. Extensive experiments verify that TL-AAM yields state-of-the-art performance under the same settings as the most competitive approaches on the CUB-200-2011, FGVC-Aircraft, and Stanford Cars datasets." @default.
- W4221029930 created "2022-04-03" @default.
- W4221029930 creator A5014818716 @default.
- W4221029930 creator A5021852079 @default.
- W4221029930 creator A5029630298 @default.
- W4221029930 date "2022-04-01" @default.
- W4221029930 modified "2023-09-27" @default.
- W4221029930 title "Weakly supervised fine-grained image classification via two-level attention activation model" @default.
- W4221029930 cites W1898560071 @default.
- W4221029930 cites W1980526845 @default.
- W4221029930 cites W2104657103 @default.
- W4221029930 cites W2118696714 @default.
- W4221029930 cites W2135706578 @default.
- W4221029930 cites W2138011018 @default.
- W4221029930 cites W2163922914 @default.
- W4221029930 cites W2190008860 @default.
- W4221029930 cites W2194775991 @default.
- W4221029930 cites W2295107390 @default.
- W4221029930 cites W2462457117 @default.
- W4221029930 cites W2554320282 @default.
- W4221029930 cites W2737725206 @default.
- W4221029930 cites W2740620254 @default.
- W4221029930 cites W2752782242 @default.
- W4221029930 cites W2755775504 @default.
- W4221029930 cites W2765268259 @default.
- W4221029930 cites W2765793020 @default.
- W4221029930 cites W2773003563 @default.
- W4221029930 cites W2780838211 @default.
- W4221029930 cites W2798365843 @default.
- W4221029930 cites W2807931652 @default.
- W4221029930 cites W2883502031 @default.
- W4221029930 cites W2891951760 @default.
- W4221029930 cites W2940925558 @default.
- W4221029930 cites W2951464224 @default.
- W4221029930 cites W2961018736 @default.
- W4221029930 cites W2962858109 @default.
- W4221029930 cites W2963090248 @default.
- W4221029930 cites W2963407932 @default.
- W4221029930 cites W2964036919 @default.
- W4221029930 cites W2981954115 @default.
- W4221029930 cites W2990495699 @default.
- W4221029930 cites W3009323895 @default.
- W4221029930 cites W3035220232 @default.
- W4221029930 cites W3035367622 @default.
- W4221029930 cites W3108870912 @default.
- W4221029930 cites W3123457861 @default.
- W4221029930 cites W3126558081 @default.
- W4221029930 cites W3128999341 @default.
- W4221029930 cites W3166091781 @default.
- W4221029930 cites W3195399086 @default.
- W4221029930 cites W56385144 @default.
- W4221029930 doi "https://doi.org/10.1016/j.cviu.2022.103408" @default.
- W4221029930 hasPublicationYear "2022" @default.
- W4221029930 type Work @default.
- W4221029930 citedByCount "4" @default.
- W4221029930 countsByYear W42210299302023 @default.
- W4221029930 crossrefType "journal-article" @default.
- W4221029930 hasAuthorship W4221029930A5014818716 @default.
- W4221029930 hasAuthorship W4221029930A5021852079 @default.
- W4221029930 hasAuthorship W4221029930A5029630298 @default.
- W4221029930 hasConcept C115961682 @default.
- W4221029930 hasConcept C153180895 @default.
- W4221029930 hasConcept C154945302 @default.
- W4221029930 hasConcept C31972630 @default.
- W4221029930 hasConcept C41008148 @default.
- W4221029930 hasConceptScore W4221029930C115961682 @default.
- W4221029930 hasConceptScore W4221029930C153180895 @default.
- W4221029930 hasConceptScore W4221029930C154945302 @default.
- W4221029930 hasConceptScore W4221029930C31972630 @default.
- W4221029930 hasConceptScore W4221029930C41008148 @default.
- W4221029930 hasLocation W42210299301 @default.
- W4221029930 hasOpenAccess W4221029930 @default.
- W4221029930 hasPrimaryLocation W42210299301 @default.
- W4221029930 hasRelatedWork W1891287906 @default.
- W4221029930 hasRelatedWork W1969923398 @default.
- W4221029930 hasRelatedWork W2036807459 @default.
- W4221029930 hasRelatedWork W2130228941 @default.
- W4221029930 hasRelatedWork W2161229648 @default.
- W4221029930 hasRelatedWork W2166024367 @default.
- W4221029930 hasRelatedWork W2755342338 @default.
- W4221029930 hasRelatedWork W2772917594 @default.
- W4221029930 hasRelatedWork W2775347418 @default.
- W4221029930 hasRelatedWork W2993674027 @default.
- W4221029930 hasVolume "218" @default.
- W4221029930 isParatext "false" @default.
- W4221029930 isRetracted "false" @default.
- W4221029930 workType "article" @default.
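
The match pattern in the header above can be reproduced programmatically. A minimal sketch in Python using SPARQLWrapper, assuming the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql (the ?g variable in the header denotes the named graph, which is omitted here for simplicity):

```python
# Minimal sketch: fetch all predicate/object pairs for work W4221029930
# from SemOpenAlex. The endpoint URL is an assumption, not part of the record.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public SPARQL endpoint

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery("""
    SELECT ?p ?o
    WHERE {
      <https://semopenalex.org/work/W4221029930> ?p ?o .
    }
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for binding in results["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```

Each printed pair corresponds to one line of the triple listing above (predicate IRI, object value).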
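
The abstract in this record describes a local cross-channel attention module (LCAM) that up-weights feature channels with high response values. As a rough illustration of that general idea only (a generic squeeze-and-excitation style channel attention, not the paper's actual LCAM implementation), a minimal PyTorch sketch:

```python
# Illustrative sketch: generic channel attention that re-weights feature
# channels by their global response strength. NOT the paper's exact LCAM.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # global response per channel
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),  # per-channel weights in [0, 1]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        weights = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * weights  # suppress low-response (irrelevant) channels

# Usage example with a hypothetical 2048-channel feature map:
# feats = ChannelAttention(2048)(torch.randn(4, 2048, 14, 14))
```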