Matches in SemOpenAlex for { <https://semopenalex.org/work/W4288365570> ?p ?o ?g. }
Showing items 1 to 70 of 70, with 100 items per page.
- W4288365570 abstract "Convolutional networks have been the paradigm of choice in many computer vision applications. The convolution operation however has a significant weakness in that it only operates on a local neighborhood, thus missing global information. Self-attention, on the other hand, has emerged as a recent advance to capture long-range interactions, but has mostly been applied to sequence modeling and generative modeling tasks. In this paper, we consider the use of self-attention for discriminative visual tasks as an alternative to convolutions. We introduce a novel two-dimensional relative self-attention mechanism that proves competitive in replacing convolutions as a stand-alone computational primitive for image classification. We find in control experiments that the best results are obtained when combining both convolutions and self-attention. We therefore propose to augment convolutional operators with this self-attention mechanism by concatenating convolutional feature maps with a set of feature maps produced via self-attention. Extensive experiments show that Attention Augmentation leads to consistent improvements in image classification on ImageNet and object detection on COCO across many different models and scales, including ResNets and a state-of-the-art mobile constrained network, while keeping the number of parameters similar. In particular, our method achieves a 1.3% top-1 accuracy improvement on ImageNet classification over a ResNet50 baseline and outperforms other attention mechanisms for images such as Squeeze-and-Excitation. It also achieves an improvement of 1.4 mAP in COCO Object Detection on top of a RetinaNet baseline." @default.
- W4288365570 created "2022-07-29" @default.
- W4288365570 creator A5026064427 @default.
- W4288365570 creator A5032356827 @default.
- W4288365570 creator A5050626617 @default.
- W4288365570 creator A5057073088 @default.
- W4288365570 creator A5088551093 @default.
- W4288365570 date "2019-04-22" @default.
- W4288365570 modified "2023-09-27" @default.
- W4288365570 title "Attention Augmented Convolutional Networks" @default.
- W4288365570 doi "https://doi.org/10.48550/arxiv.1904.09925" @default.
- W4288365570 hasPublicationYear "2019" @default.
- W4288365570 type Work @default.
- W4288365570 citedByCount "2" @default.
- W4288365570 countsByYear W42883655702023 @default.
- W4288365570 crossrefType "posted-content" @default.
- W4288365570 hasAuthorship W4288365570A5026064427 @default.
- W4288365570 hasAuthorship W4288365570A5032356827 @default.
- W4288365570 hasAuthorship W4288365570A5050626617 @default.
- W4288365570 hasAuthorship W4288365570A5057073088 @default.
- W4288365570 hasAuthorship W4288365570A5088551093 @default.
- W4288365570 hasBestOaLocation W42883655701 @default.
- W4288365570 hasConcept C115961682 @default.
- W4288365570 hasConcept C119857082 @default.
- W4288365570 hasConcept C138885662 @default.
- W4288365570 hasConcept C153180895 @default.
- W4288365570 hasConcept C154945302 @default.
- W4288365570 hasConcept C177264268 @default.
- W4288365570 hasConcept C199360897 @default.
- W4288365570 hasConcept C2776401178 @default.
- W4288365570 hasConcept C2781238097 @default.
- W4288365570 hasConcept C41008148 @default.
- W4288365570 hasConcept C41895202 @default.
- W4288365570 hasConcept C45347329 @default.
- W4288365570 hasConcept C50644808 @default.
- W4288365570 hasConcept C75294576 @default.
- W4288365570 hasConcept C81363708 @default.
- W4288365570 hasConcept C97931131 @default.
- W4288365570 hasConceptScore W4288365570C115961682 @default.
- W4288365570 hasConceptScore W4288365570C119857082 @default.
- W4288365570 hasConceptScore W4288365570C138885662 @default.
- W4288365570 hasConceptScore W4288365570C153180895 @default.
- W4288365570 hasConceptScore W4288365570C154945302 @default.
- W4288365570 hasConceptScore W4288365570C177264268 @default.
- W4288365570 hasConceptScore W4288365570C199360897 @default.
- W4288365570 hasConceptScore W4288365570C2776401178 @default.
- W4288365570 hasConceptScore W4288365570C2781238097 @default.
- W4288365570 hasConceptScore W4288365570C41008148 @default.
- W4288365570 hasConceptScore W4288365570C41895202 @default.
- W4288365570 hasConceptScore W4288365570C45347329 @default.
- W4288365570 hasConceptScore W4288365570C50644808 @default.
- W4288365570 hasConceptScore W4288365570C75294576 @default.
- W4288365570 hasConceptScore W4288365570C81363708 @default.
- W4288365570 hasConceptScore W4288365570C97931131 @default.
- W4288365570 hasLocation W42883655701 @default.
- W4288365570 hasOpenAccess W4288365570 @default.
- W4288365570 hasPrimaryLocation W42883655701 @default.
- W4288365570 hasRelatedWork W1914651075 @default.
- W4288365570 hasRelatedWork W1971623867 @default.
- W4288365570 hasRelatedWork W1982774199 @default.
- W4288365570 hasRelatedWork W2026121273 @default.
- W4288365570 hasRelatedWork W2295021132 @default.
- W4288365570 hasRelatedWork W2546942002 @default.
- W4288365570 hasRelatedWork W2554403468 @default.
- W4288365570 hasRelatedWork W2760085659 @default.
- W4288365570 hasRelatedWork W2971377935 @default.
- W4288365570 hasRelatedWork W3013138473 @default.
- W4288365570 isParatext "false" @default.
- W4288365570 isRetracted "false" @default.
- W4288365570 workType "article" @default.
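The abstract above describes augmenting a convolution by concatenating its feature maps with feature maps produced by self-attention over all spatial positions. A minimal NumPy sketch of that idea follows; the projection matrices (`Wq`, `Wk`, `Wv`), head sizes (`dk`, `dv`), and function name are illustrative assumptions, not the paper's actual implementation (which also uses relative position embeddings and multiple heads).

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_augmented_features(conv_feats, x, dk=8, dv=4, seed=0):
    """Concatenate conv feature maps with self-attention feature maps.

    conv_feats: (H, W, C_conv) output of a convolution (supplied by caller).
    x:          (H, W, C_in) input feature map the attention operates on.
    Returns:    (H, W, C_conv + dv) augmented feature map.
    Hypothetical single-head sketch; the paper uses multi-head relative
    self-attention with learned projections.
    """
    rng = np.random.default_rng(seed)
    H, W, C_in = x.shape
    flat = x.reshape(H * W, C_in)               # flatten spatial positions
    Wq = rng.standard_normal((C_in, dk)) * 0.1  # stand-in "learned" projections
    Wk = rng.standard_normal((C_in, dk)) * 0.1
    Wv = rng.standard_normal((C_in, dv)) * 0.1
    q, k, v = flat @ Wq, flat @ Wk, flat @ Wv
    attn = softmax(q @ k.T / np.sqrt(dk))       # (HW, HW): every position attends globally
    attn_feats = (attn @ v).reshape(H, W, dv)   # attention feature maps
    return np.concatenate([conv_feats, attn_feats], axis=-1)
```

Because the attention weights span all H×W positions, the concatenated output mixes the convolution's local features with globally aggregated ones, which is the combination the abstract reports as outperforming either primitive alone.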