Matches in SemOpenAlex for { <https://semopenalex.org/work/W4361762631> ?p ?o ?g. }
Showing items 1 to 55 of 55, with 100 items per page.
- W4361762631 endingPage "619" @default.
- W4361762631 startingPage "609" @default.
- W4361762631 abstract "Transformer backbone networks, built around the self-attention mechanism, have achieved great success in natural language processing and computer vision. However, compared with classical visual feature extraction methods, self-attention needs more training data to capture the relationships between tokens, which makes it challenging to train Transformers effectively on small datasets. We design a novel lightweight self-attention mechanism, Low-relation Multi-head Self-Attention (LMSA), which outperforms recent self-attention variants and can fully explore the relationships among scarce tokens. Specifically, the proposed mechanism breaks the dimensional-consistency constraint of traditional self-attention, computing the feature relationships in a small number of dimensions and thereby reducing computational complexity and storage. Experimental results show that the dimensional consistency inside traditional self-attention is unnecessary. In particular, with Swin as the backbone model, accuracy on the CIFAR-10 image classification task improves by 0.43%, while the resource consumption of a single self-attention module is reduced by 64.58% and the number of model parameters and the model size are reduced by more than 15%. By appropriately condensing the self-attention relation variables, a Transformer network can be more efficient and even perform better." @default.
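  The abstract describes computing the token-relation map in fewer dimensions than the traditional, dimensionally consistent self-attention. The record gives no implementation details, so the following is only a minimal PyTorch sketch of that general idea: the class name `LowRelationSelfAttention`, the `relation_dim` parameter, and the exact reduction scheme are assumptions for illustration, not the paper's LMSA.

  ```python
  # Hypothetical sketch: queries/keys are projected to a small per-head
  # "relation" dimension, while values keep the full embedding dimension.
  # This is an assumed reading of the abstract, not the published LMSA code.
  import torch
  import torch.nn as nn


  class LowRelationSelfAttention(nn.Module):
      def __init__(self, dim, num_heads=8, relation_dim=16):
          super().__init__()
          self.num_heads = num_heads
          self.relation_dim = relation_dim      # reduced per-head dim for Q/K (assumed)
          self.head_dim = dim // num_heads      # per-head dim for V and the output
          self.scale = relation_dim ** -0.5

          # Q and K use the reduced dimension; V keeps the full dimension.
          self.to_q = nn.Linear(dim, num_heads * relation_dim, bias=False)
          self.to_k = nn.Linear(dim, num_heads * relation_dim, bias=False)
          self.to_v = nn.Linear(dim, dim, bias=False)
          self.proj = nn.Linear(dim, dim)

      def forward(self, x):                     # x: (batch, tokens, dim)
          b, n, _ = x.shape
          q = self.to_q(x).view(b, n, self.num_heads, self.relation_dim).transpose(1, 2)
          k = self.to_k(x).view(b, n, self.num_heads, self.relation_dim).transpose(1, 2)
          v = self.to_v(x).view(b, n, self.num_heads, self.head_dim).transpose(1, 2)

          # Token relations are computed in the reduced dimension, which shrinks
          # the Q/K projections and the cost of forming the attention map.
          attn = (q @ k.transpose(-2, -1)) * self.scale
          attn = attn.softmax(dim=-1)
          out = (attn @ v).transpose(1, 2).reshape(b, n, -1)
          return self.proj(out)


  if __name__ == "__main__":
      x = torch.randn(2, 49, 96)                # e.g. a 7x7 window of 96-dim tokens
      print(LowRelationSelfAttention(96, num_heads=3, relation_dim=8)(x).shape)
  ```

  In this sketch the attention map is still full size (tokens x tokens); only the dimension in which token relations are measured is condensed, which is one plausible way to match the abstract's claim of lower compute and smaller projection weights without changing the output width.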
- W4361762631 created "2023-04-04" @default.
- W4361762631 creator A5009364502 @default.
- W4361762631 creator A5020889685 @default.
- W4361762631 creator A5033828181 @default.
- W4361762631 creator A5053261798 @default.
- W4361762631 date "2023-01-01" @default.
- W4361762631 modified "2023-09-27" @default.
- W4361762631 title "LMSA: Low-Relation Multi-head Self-attention Mechanism in Visual Transformer" @default.
- W4361762631 cites W4313007769 @default.
- W4361762631 doi "https://doi.org/10.1007/978-981-99-0923-0_61" @default.
- W4361762631 hasPublicationYear "2023" @default.
- W4361762631 type Work @default.
- W4361762631 citedByCount "0" @default.
- W4361762631 crossrefType "book-chapter" @default.
- W4361762631 hasAuthorship W4361762631A5009364502 @default.
- W4361762631 hasAuthorship W4361762631A5020889685 @default.
- W4361762631 hasAuthorship W4361762631A5033828181 @default.
- W4361762631 hasAuthorship W4361762631A5053261798 @default.
- W4361762631 hasConcept C119599485 @default.
- W4361762631 hasConcept C119857082 @default.
- W4361762631 hasConcept C127413603 @default.
- W4361762631 hasConcept C153180895 @default.
- W4361762631 hasConcept C154945302 @default.
- W4361762631 hasConcept C165801399 @default.
- W4361762631 hasConcept C2776436953 @default.
- W4361762631 hasConcept C41008148 @default.
- W4361762631 hasConcept C66322947 @default.
- W4361762631 hasConceptScore W4361762631C119599485 @default.
- W4361762631 hasConceptScore W4361762631C119857082 @default.
- W4361762631 hasConceptScore W4361762631C127413603 @default.
- W4361762631 hasConceptScore W4361762631C153180895 @default.
- W4361762631 hasConceptScore W4361762631C154945302 @default.
- W4361762631 hasConceptScore W4361762631C165801399 @default.
- W4361762631 hasConceptScore W4361762631C2776436953 @default.
- W4361762631 hasConceptScore W4361762631C41008148 @default.
- W4361762631 hasConceptScore W4361762631C66322947 @default.
- W4361762631 hasLocation W43617626311 @default.
- W4361762631 hasOpenAccess W4361762631 @default.
- W4361762631 hasPrimaryLocation W43617626311 @default.
- W4361762631 hasRelatedWork W2350879319 @default.
- W4361762631 hasRelatedWork W2353865532 @default.
- W4361762631 hasRelatedWork W2961085424 @default.
- W4361762631 hasRelatedWork W3046775127 @default.
- W4361762631 hasRelatedWork W3107474891 @default.
- W4361762631 hasRelatedWork W4205958290 @default.
- W4361762631 hasRelatedWork W4286629047 @default.
- W4361762631 hasRelatedWork W4306321456 @default.
- W4361762631 hasRelatedWork W4306674287 @default.
- W4361762631 hasRelatedWork W4224009465 @default.
- W4361762631 isParatext "false" @default.
- W4361762631 isRetracted "false" @default.
- W4361762631 workType "book-chapter" @default.