Matches in SemOpenAlex for { <https://semopenalex.org/work/W4360833246> ?p ?o ?g. }
Showing items 1 to 54 of 54, with 100 items per page.
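The header above shows the triple pattern the browser evaluated. A minimal Python sketch of reproducing this listing yourself is below; it assumes the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql and the standard SPARQL-protocol JSON results format, and drops the `?g` (graph) variable for portability across quad stores.

```python
# Minimal sketch: fetch all predicate/object pairs for this work.
# Assumptions: endpoint URL https://semopenalex.org/sparql and the
# standard SPARQL 1.1 protocol; adjust if the service differs.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL

QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W4360833246> ?p ?o .
}
"""

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()

# Standard SPARQL JSON results: one binding per matched triple.
for row in resp.json()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"])
```

Any SPARQL client would work equally well; `requests` is used here only to keep the sketch dependency-light.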
- W4360833246 abstract "Transformers have achieved widespread success in computer vision. At their heart is a Self-Attention (SA) mechanism, an inductive bias that associates each token in the input with every other token through a weighted basis. The standard SA mechanism has quadratic complexity in the sequence length, which impedes its use for the long sequences that arise in high-resolution vision. Recently, inspired by operator learning for PDEs, Adaptive Fourier Neural Operators (AFNO) were introduced for high-resolution attention, based on global convolution implemented efficiently via the FFT. However, AFNO's global filtering cannot adequately represent the small- and moderate-scale structures that commonly appear in natural images. To capture these coarse-to-fine scale structures, we introduce Multiscale Wavelet Attention (MWA), built on wavelet neural operators, which incurs linear complexity in the sequence length. We replace the attention in ViT with MWA; experiments on CIFAR and Tiny-ImageNet classification demonstrate significant improvements over alternative Fourier-based attention mechanisms such as AFNO and the Global Filter Network (GFN)." @default.
- W4360833246 created "2023-03-25" @default.
- W4360833246 creator A5053148644 @default.
- W4360833246 creator A5060755739 @default.
- W4360833246 creator A5080667264 @default.
- W4360833246 creator A5088601504 @default.
- W4360833246 date "2023-03-22" @default.
- W4360833246 modified "2023-10-16" @default.
- W4360833246 title "Multiscale Attention via Wavelet Neural Operators for Vision Transformers" @default.
- W4360833246 doi "https://doi.org/10.48550/arxiv.2303.12398" @default.
- W4360833246 hasPublicationYear "2023" @default.
- W4360833246 type Work @default.
- W4360833246 citedByCount "0" @default.
- W4360833246 crossrefType "posted-content" @default.
- W4360833246 hasAuthorship W4360833246A5053148644 @default.
- W4360833246 hasAuthorship W4360833246A5060755739 @default.
- W4360833246 hasAuthorship W4360833246A5080667264 @default.
- W4360833246 hasAuthorship W4360833246A5088601504 @default.
- W4360833246 hasBestOaLocation W43608332461 @default.
- W4360833246 hasConcept C11413529 @default.
- W4360833246 hasConcept C153083717 @default.
- W4360833246 hasConcept C153180895 @default.
- W4360833246 hasConcept C154945302 @default.
- W4360833246 hasConcept C31972630 @default.
- W4360833246 hasConcept C41008148 @default.
- W4360833246 hasConcept C47432892 @default.
- W4360833246 hasConcept C50644808 @default.
- W4360833246 hasConcept C75172450 @default.
- W4360833246 hasConceptScore W4360833246C11413529 @default.
- W4360833246 hasConceptScore W4360833246C153083717 @default.
- W4360833246 hasConceptScore W4360833246C153180895 @default.
- W4360833246 hasConceptScore W4360833246C154945302 @default.
- W4360833246 hasConceptScore W4360833246C31972630 @default.
- W4360833246 hasConceptScore W4360833246C41008148 @default.
- W4360833246 hasConceptScore W4360833246C47432892 @default.
- W4360833246 hasConceptScore W4360833246C50644808 @default.
- W4360833246 hasConceptScore W4360833246C75172450 @default.
- W4360833246 hasLocation W43608332461 @default.
- W4360833246 hasLocation W43608332462 @default.
- W4360833246 hasOpenAccess W4360833246 @default.
- W4360833246 hasPrimaryLocation W43608332461 @default.
- W4360833246 hasRelatedWork W1891287906 @default.
- W4360833246 hasRelatedWork W1969923398 @default.
- W4360833246 hasRelatedWork W2036807459 @default.
- W4360833246 hasRelatedWork W2166024367 @default.
- W4360833246 hasRelatedWork W2229312674 @default.
- W4360833246 hasRelatedWork W2541950815 @default.
- W4360833246 hasRelatedWork W2755342338 @default.
- W4360833246 hasRelatedWork W2772917594 @default.
- W4360833246 hasRelatedWork W2775347418 @default.
- W4360833246 hasRelatedWork W3116076068 @default.
- W4360833246 isParatext "false" @default.
- W4360833246 isRetracted "false" @default.
- W4360833246 workType "article" @default.
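The abstract in the first item above describes the paper's core idea: replacing quadratic self-attention with wavelet-domain token mixing that is linear in the sequence length. The sketch below is an illustration of that idea only, not the authors' implementation; the Haar wavelet, the random per-band weights, the decomposition level, and the array shapes are all assumptions chosen for brevity, using the PyWavelets library.

```python
# Illustrative sketch (not the paper's code): mix a 1-D token sequence
# at multiple wavelet scales, the core idea behind MWA's linear-time
# alternative to self-attention.
import numpy as np
import pywt


def wavelet_mix(tokens: np.ndarray, level: int = 3) -> np.ndarray:
    """Reweight each wavelet band of the token sequence and reconstruct.

    tokens: (seq_len, dim) array; each feature channel is filtered along
    the sequence axis. DWT and inverse DWT are each O(n) in sequence
    length, unlike O(n^2) pairwise self-attention.
    """
    rng = np.random.default_rng(0)
    mixed = np.empty_like(tokens)
    for d in range(tokens.shape[1]):
        # Multi-level discrete wavelet transform along the sequence.
        coeffs = pywt.wavedec(tokens[:, d], "haar", level=level)
        # In MWA these per-band weights would be learned; random here.
        coeffs = [c * rng.normal(1.0, 0.1, size=c.shape) for c in coeffs]
        # Inverse transform; trim in case of padding to even lengths.
        mixed[:, d] = pywt.waverec(coeffs, "haar")[: tokens.shape[0]]
    return mixed


tokens = np.random.randn(64, 8)  # 64 tokens, 8 feature channels
out = wavelet_mix(tokens)
print(out.shape)                 # (64, 8)
```

In the paper's setting the band weights would be trained end-to-end inside a ViT block; the linear-complexity claim in the abstract follows from the O(n) cost of the forward and inverse wavelet transforms.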