Matches in SemOpenAlex for { <https://semopenalex.org/work/W3204826552> ?p ?o ?g. }
Showing items 1 to 93 of 93, with 100 items per page.
- W3204826552 abstract "Learning subtle representation about object parts plays a vital role in fine-grained visual recognition (FGVR) field. The vision transformer (ViT) achieves promising results on computer vision due to its attention mechanism. Nonetheless, with the fixed size of patches in ViT, the class token in deep layer focuses on the global receptive field and cannot generate multi-granularity features for FGVR. To capture region attention without box annotations and compensate for ViT shortcomings in FGVR, we propose a novel method named Adaptive attention multi-scale Fusion Transformer (AFTrans). The Selective Attention Collection Module (SACM) in our approach leverages attention weights in ViT and filters them adaptively to correspond with the relative importance of input patches. The multiple scales (global and local) pipeline is supervised by our weights sharing encoder and can be easily trained end-to-end. Comprehensive experiments demonstrate that AFTrans can achieve SOTA performance on three published fine-grained benchmarks: CUB-200-2011, Stanford Dogs and iNat2017." @default.
- W3204826552 created "2021-10-11" @default.
- W3204826552 creator A5012444471 @default.
- W3204826552 creator A5022586785 @default.
- W3204826552 creator A5030005269 @default.
- W3204826552 creator A5031767581 @default.
- W3204826552 creator A5038765418 @default.
- W3204826552 creator A5046198377 @default.
- W3204826552 creator A5065852561 @default.
- W3204826552 date "2022-05-23" @default.
- W3204826552 modified "2023-10-02" @default.
- W3204826552 title "A free lunch from ViT: adaptive attention multi-scale fusion Transformer for fine-grained visual recognition" @default.
- W3204826552 cites W2194775991 @default.
- W3204826552 cites W2202499615 @default.
- W3204826552 cites W2737725206 @default.
- W3204826552 cites W2740620254 @default.
- W3204826552 cites W2797977484 @default.
- W3204826552 cites W2798365843 @default.
- W3204826552 cites W2889469641 @default.
- W3204826552 cites W2963090248 @default.
- W3204826552 cites W2963393555 @default.
- W3204826552 cites W2963407932 @default.
- W3204826552 cites W2964350391 @default.
- W3204826552 cites W2986821660 @default.
- W3204826552 cites W2990495699 @default.
- W3204826552 cites W2998345525 @default.
- W3204826552 cites W2998619563 @default.
- W3204826552 cites W3034676907 @default.
- W3204826552 cites W3081907075 @default.
- W3204826552 cites W3096609285 @default.
- W3204826552 cites W3108870912 @default.
- W3204826552 cites W3206734547 @default.
- W3204826552 doi "https://doi.org/10.1109/icassp43922.2022.9747591" @default.
- W3204826552 hasPublicationYear "2022" @default.
- W3204826552 type Work @default.
- W3204826552 sameAs 3204826552 @default.
- W3204826552 citedByCount "8" @default.
- W3204826552 countsByYear W32048265522022 @default.
- W3204826552 countsByYear W32048265522023 @default.
- W3204826552 crossrefType "proceedings-article" @default.
- W3204826552 hasAuthorship W3204826552A5012444471 @default.
- W3204826552 hasAuthorship W3204826552A5022586785 @default.
- W3204826552 hasAuthorship W3204826552A5030005269 @default.
- W3204826552 hasAuthorship W3204826552A5031767581 @default.
- W3204826552 hasAuthorship W3204826552A5038765418 @default.
- W3204826552 hasAuthorship W3204826552A5046198377 @default.
- W3204826552 hasAuthorship W3204826552A5065852561 @default.
- W3204826552 hasBestOaLocation W32048265522 @default.
- W3204826552 hasConcept C111919701 @default.
- W3204826552 hasConcept C118505674 @default.
- W3204826552 hasConcept C119599485 @default.
- W3204826552 hasConcept C127413603 @default.
- W3204826552 hasConcept C153180895 @default.
- W3204826552 hasConcept C154945302 @default.
- W3204826552 hasConcept C165801399 @default.
- W3204826552 hasConcept C177774035 @default.
- W3204826552 hasConcept C31972630 @default.
- W3204826552 hasConcept C38652104 @default.
- W3204826552 hasConcept C41008148 @default.
- W3204826552 hasConcept C48145219 @default.
- W3204826552 hasConcept C66322947 @default.
- W3204826552 hasConceptScore W3204826552C111919701 @default.
- W3204826552 hasConceptScore W3204826552C118505674 @default.
- W3204826552 hasConceptScore W3204826552C119599485 @default.
- W3204826552 hasConceptScore W3204826552C127413603 @default.
- W3204826552 hasConceptScore W3204826552C153180895 @default.
- W3204826552 hasConceptScore W3204826552C154945302 @default.
- W3204826552 hasConceptScore W3204826552C165801399 @default.
- W3204826552 hasConceptScore W3204826552C177774035 @default.
- W3204826552 hasConceptScore W3204826552C31972630 @default.
- W3204826552 hasConceptScore W3204826552C38652104 @default.
- W3204826552 hasConceptScore W3204826552C41008148 @default.
- W3204826552 hasConceptScore W3204826552C48145219 @default.
- W3204826552 hasConceptScore W3204826552C66322947 @default.
- W3204826552 hasFunder F4320337504 @default.
- W3204826552 hasLocation W32048265521 @default.
- W3204826552 hasLocation W32048265522 @default.
- W3204826552 hasOpenAccess W3204826552 @default.
- W3204826552 hasPrimaryLocation W32048265521 @default.
- W3204826552 hasRelatedWork W2969574947 @default.
- W3204826552 hasRelatedWork W2972312591 @default.
- W3204826552 hasRelatedWork W3029282628 @default.
- W3204826552 hasRelatedWork W3037162118 @default.
- W3204826552 hasRelatedWork W3116268265 @default.
- W3204826552 hasRelatedWork W3177367299 @default.
- W3204826552 hasRelatedWork W3203212172 @default.
- W3204826552 hasRelatedWork W4285666859 @default.
- W3204826552 hasRelatedWork W4287775347 @default.
- W3204826552 hasRelatedWork W4288086191 @default.
- W3204826552 isParatext "false" @default.
- W3204826552 isRetracted "false" @default.
- W3204826552 magId "3204826552" @default.
- W3204826552 workType "article" @default.
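The listing above is the result of the quad pattern `{ <https://semopenalex.org/work/W3204826552> ?p ?o ?g. }` against SemOpenAlex. A minimal sketch of reproducing such a lookup over HTTP is below, assuming the public endpoint lives at `https://semopenalex.org/sparql` (an assumption; adjust if the service URL differs). For simplicity it queries only `?p ?o` and drops the graph variable `?g`.

```python
# Hedged sketch: fetch all predicate/object pairs for a SemOpenAlex work
# via its SPARQL endpoint. ENDPOINT is an assumed URL, not confirmed by
# the listing above; the work IRI is taken from the listing.
import json
import urllib.parse
import urllib.request

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint
WORK_IRI = "https://semopenalex.org/work/W3204826552"


def build_query(work_iri: str) -> str:
    """Build a SELECT matching { <work> ?p ?o }, as in the listing."""
    return f"SELECT ?p ?o WHERE {{ <{work_iri}> ?p ?o }}"


def fetch_triples(work_iri: str):
    """Run the query over HTTP and return SPARQL-JSON result bindings."""
    params = urllib.parse.urlencode({"query": build_query(work_iri)})
    req = urllib.request.Request(
        f"{ENDPOINT}?{params}",
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)["results"]["bindings"]


if __name__ == "__main__":
    # Prints one predicate/object pair per line, mirroring the dump above.
    for b in fetch_triples(WORK_IRI):
        print(b["p"]["value"], b["o"]["value"])
```

The network call is kept behind the `__main__` guard so the query-building step can be inspected separately; SPARQL-JSON (`application/sparql-results+json`) is the standard result format most endpoints accept via the `Accept` header.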