Matches in SemOpenAlex for { <https://semopenalex.org/work/W4312778881> ?p ?o ?g. }
Showing items 1 to 87 of 87, with 100 items per page.
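The header above is a triple-pattern lookup that can be reproduced against a SPARQL endpoint. A minimal Python sketch follows; the endpoint URL `https://semopenalex.org/sparql` is an assumption to verify before use, and the non-standard four-term pattern in the header is rewritten here with the standard `GRAPH` keyword:

```python
import requests

# Assumed public SPARQL endpoint for SemOpenAlex; confirm the exact
# address before relying on it.
ENDPOINT = "https://semopenalex.org/sparql"

# Same lookup as the header: every (predicate, object, graph) attached
# to the work W4312778881, expressed with the standard GRAPH form.
QUERY = """
SELECT ?p ?o ?g WHERE {
  GRAPH ?g { <https://semopenalex.org/work/W4312778881> ?p ?o . }
}
"""

response = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
response.raise_for_status()

# Print each triple in roughly the same layout as the listing below.
for binding in response.json()["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"], binding["g"]["value"])
```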
- W4312778881 endingPage "288" @default.
- W4312778881 startingPage "274" @default.
- W4312778881 abstract "Recently, vision transformers have achieved impressive success in computer vision tasks. Nevertheless, these models suffer from a heavy computational cost due to the quadratic complexity of the self-attention mechanism, especially when dealing with high-resolution images. Previous literature has illustrated the sparsity of attention, which suggests that uninformative tokens could be discarded to accelerate the model with limited influence on precision. As a natural indicator of token importance, attention scores can be intuitively used to extract the discriminative regions in images. Inspired by these facts, we propose an attention-based token pruning framework to address the inefficiency of vision transformers. We divide the transformer blocks in the model into pruning stages, where the integrated weights of the multi-attention heads are fused to estimate token importance. The computational cost of the model is reduced by dropping redundant patches progressively after each pruning stage. Experiments conducted on ImageNet1k verify the effectiveness of our method, where the models pruned by our module outperform other state-of-the-art models with similar FLOPs. For fine-grained image recognition, our framework also improves both the accuracy and efficiency of ViT on CUB200-2011. More significantly, the proposed attention-based pruning module can simply be plugged into any vision transformer that contains the class token by fine-tuning for only 10 epochs or even a single epoch, making a reasonable trade-off between accuracy and cost." @default. (A brief code sketch of the pruning idea described in this abstract follows after the listing.)
- W4312778881 created "2023-01-05" @default.
- W4312778881 creator A5017613139 @default.
- W4312778881 creator A5017968460 @default.
- W4312778881 creator A5054339696 @default.
- W4312778881 creator A5076835782 @default.
- W4312778881 date "2022-01-01" @default.
- W4312778881 modified "2023-10-18" @default.
- W4312778881 title "An Attention-Based Token Pruning Method for Vision Transformers" @default.
- W4312778881 cites W2108598243 @default.
- W4312778881 cites W2194775991 @default.
- W4312778881 cites W2895643041 @default.
- W4312778881 cites W2915716523 @default.
- W4312778881 cites W2962851801 @default.
- W4312778881 cites W2963363373 @default.
- W4312778881 cites W2963393555 @default.
- W4312778881 cites W2997426000 @default.
- W4312778881 cites W2998345525 @default.
- W4312778881 cites W2998619563 @default.
- W4312778881 cites W3034429256 @default.
- W4312778881 cites W3108870912 @default.
- W4312778881 cites W3138516171 @default.
- W4312778881 cites W3176196997 @default.
- W4312778881 cites W4214713996 @default.
- W4312778881 doi "https://doi.org/10.1007/978-3-031-21244-4_21" @default.
- W4312778881 hasPublicationYear "2022" @default.
- W4312778881 type Work @default.
- W4312778881 citedByCount "0" @default.
- W4312778881 crossrefType "book-chapter" @default.
- W4312778881 hasAuthorship W4312778881A5017613139 @default.
- W4312778881 hasAuthorship W4312778881A5017968460 @default.
- W4312778881 hasAuthorship W4312778881A5054339696 @default.
- W4312778881 hasAuthorship W4312778881A5076835782 @default.
- W4312778881 hasConcept C108010975 @default.
- W4312778881 hasConcept C119857082 @default.
- W4312778881 hasConcept C121332964 @default.
- W4312778881 hasConcept C154945302 @default.
- W4312778881 hasConcept C162324750 @default.
- W4312778881 hasConcept C165801399 @default.
- W4312778881 hasConcept C173608175 @default.
- W4312778881 hasConcept C175444787 @default.
- W4312778881 hasConcept C2778869765 @default.
- W4312778881 hasConcept C3826847 @default.
- W4312778881 hasConcept C38652104 @default.
- W4312778881 hasConcept C41008148 @default.
- W4312778881 hasConcept C48145219 @default.
- W4312778881 hasConcept C62520636 @default.
- W4312778881 hasConcept C6557445 @default.
- W4312778881 hasConcept C66322947 @default.
- W4312778881 hasConcept C86803240 @default.
- W4312778881 hasConcept C97931131 @default.
- W4312778881 hasConceptScore W4312778881C108010975 @default.
- W4312778881 hasConceptScore W4312778881C119857082 @default.
- W4312778881 hasConceptScore W4312778881C121332964 @default.
- W4312778881 hasConceptScore W4312778881C154945302 @default.
- W4312778881 hasConceptScore W4312778881C162324750 @default.
- W4312778881 hasConceptScore W4312778881C165801399 @default.
- W4312778881 hasConceptScore W4312778881C173608175 @default.
- W4312778881 hasConceptScore W4312778881C175444787 @default.
- W4312778881 hasConceptScore W4312778881C2778869765 @default.
- W4312778881 hasConceptScore W4312778881C3826847 @default.
- W4312778881 hasConceptScore W4312778881C38652104 @default.
- W4312778881 hasConceptScore W4312778881C41008148 @default.
- W4312778881 hasConceptScore W4312778881C48145219 @default.
- W4312778881 hasConceptScore W4312778881C62520636 @default.
- W4312778881 hasConceptScore W4312778881C6557445 @default.
- W4312778881 hasConceptScore W4312778881C66322947 @default.
- W4312778881 hasConceptScore W4312778881C86803240 @default.
- W4312778881 hasConceptScore W4312778881C97931131 @default.
- W4312778881 hasLocation W43127788811 @default.
- W4312778881 hasOpenAccess W4312778881 @default.
- W4312778881 hasPrimaryLocation W43127788811 @default.
- W4312778881 hasRelatedWork W110210027 @default.
- W4312778881 hasRelatedWork W2353457699 @default.
- W4312778881 hasRelatedWork W3139434170 @default.
- W4312778881 hasRelatedWork W3199608561 @default.
- W4312778881 hasRelatedWork W3215545016 @default.
- W4312778881 hasRelatedWork W4226191776 @default.
- W4312778881 hasRelatedWork W4281485846 @default.
- W4312778881 hasRelatedWork W4288099773 @default.
- W4312778881 hasRelatedWork W4303201644 @default.
- W4312778881 hasRelatedWork W4312872526 @default.
- W4312778881 isParatext "false" @default.
- W4312778881 isRetracted "false" @default.
- W4312778881 workType "book-chapter" @default.
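The abstract above describes the core operation of the paper: attention from the class token, fused across heads, ranks patch tokens, and the lowest-ranked ones are dropped after each pruning stage. A minimal PyTorch sketch of that idea is below; the function name, the `keep_ratio` parameter, and the simple top-k selection are illustrative assumptions, not the authors' exact implementation.

```python
import torch

def prune_tokens(tokens, attn, keep_ratio=0.7):
    """Drop the least-attended patch tokens after a pruning stage.

    tokens: (B, 1 + N, D) token sequence with the class token at index 0.
    attn:   (B, H, 1 + N, 1 + N) attention weights from the last block
            of the stage.
    keep_ratio: assumed fraction of patch tokens to keep per stage.
    """
    # Fuse the heads: average, over all H heads, the attention that each
    # patch token receives from the class token.
    cls_attn = attn[:, :, 0, 1:].mean(dim=1)            # (B, N)

    # Keep the top-scoring patches; everything else is treated as redundant.
    n_keep = max(1, int(cls_attn.size(1) * keep_ratio))
    keep_idx = cls_attn.topk(n_keep, dim=1).indices     # (B, n_keep)

    # Gather the surviving patch tokens and re-attach the class token.
    batch_idx = torch.arange(tokens.size(0)).unsqueeze(1)
    kept_patches = tokens[:, 1:][batch_idx, keep_idx]   # (B, n_keep, D)
    return torch.cat([tokens[:, :1], kept_patches], dim=1)
```

In use, a routine like this would be called after the final block of each pruning stage, so later blocks operate on progressively shorter sequences, which is where the FLOPs reduction claimed in the abstract comes from.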