Matches in SemOpenAlex for { <https://semopenalex.org/work/W4386002400> ?p ?o ?g. }
Showing items 1 to 100 of 100, with 100 items per page.
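The listing below was produced by the property/object query shown in the header. As a minimal sketch of how one might reproduce it programmatically, the snippet sends an equivalent SELECT query to SemOpenAlex's SPARQL endpoint; the endpoint URL, the GET-with-`query`-parameter calling convention, and the dropped graph variable (`?g`) are assumptions, not something stated in this listing.

```python
# Minimal sketch: fetch the property/object pairs for work W4386002400.
# The endpoint URL below is an assumption; adjust it if SemOpenAlex exposes
# its SPARQL service elsewhere.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public SPARQL endpoint
QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W4386002400> ?p ?o .
}
LIMIT 100
"""

response = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
response.raise_for_status()

# Print each predicate/object pair, one per line, mirroring the listing below.
for binding in response.json()["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```

The `LIMIT 100` mirrors the 100-items-per-page setting of the listing above.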
- W4386002400 endingPage "109897" @default.
- W4386002400 startingPage "109897" @default.
- W4386002400 abstract "Learning-based image inpainting methods have made remarkable progress in recent years. Nevertheless, these methods still suffer from issues such as blurring, artifacts, and inconsistent contents. The use of vanilla convolution kernels, which have limited perceptual fields and spatially invariant kernel coefficients, is one of the main causes of these problems. In contrast, the multi-headed attention in the transformer can effectively model non-local relations among input features by generating adaptive attention scores. Therefore, this paper explores the feasibility of employing the transformer model for the image inpainting task. However, the multi-headed attention transformer blocks pose a significant challenge due to their overwhelming computational cost. To address this issue, we propose a novel U-Net style transformer-based network for the inpainting task, called the sparse self-attention transformer (Spa-former). The Spa-former retains the long-range modeling capacity of transformer blocks while reducing the computational burden. It incorporates a new channel attention approximation algorithm that reduces attention calculation to linear complexity. Additionally, it replaces the canonical softmax function with the ReLU function to generate a sparse attention map that effectively excludes irrelevant features. As a result, the Spa-former achieves effective long-range feature modeling with fewer parameters and lower computational resources. Our empirical results on challenging benchmarks demonstrate the superior performance of our proposed Spa-former over state-of-the-art approaches." @default. (A sketch of the ReLU-based channel attention described here follows the listing.)
- W4386002400 created "2023-08-20" @default.
- W4386002400 creator A5008562687 @default.
- W4386002400 creator A5034237194 @default.
- W4386002400 creator A5063678544 @default.
- W4386002400 creator A5078615880 @default.
- W4386002400 creator A5079119503 @default.
- W4386002400 creator A5079336892 @default.
- W4386002400 date "2024-01-01" @default.
- W4386002400 modified "2023-10-17" @default.
- W4386002400 title "Sparse self-attention transformer for image inpainting" @default.
- W4386002400 cites W1655403841 @default.
- W4386002400 cites W1993120651 @default.
- W4386002400 cites W2055132753 @default.
- W4386002400 cites W2963420272 @default.
- W4386002400 cites W2982083293 @default.
- W4386002400 cites W2982763192 @default.
- W4386002400 cites W2998075999 @default.
- W4386002400 cites W3035512475 @default.
- W4386002400 cites W3043547428 @default.
- W4386002400 cites W3118356434 @default.
- W4386002400 cites W3131500599 @default.
- W4386002400 cites W3138516171 @default.
- W4386002400 cites W3177318507 @default.
- W4386002400 cites W3199003182 @default.
- W4386002400 cites W3203538104 @default.
- W4386002400 cites W3206335650 @default.
- W4386002400 cites W3207918547 @default.
- W4386002400 cites W3213339063 @default.
- W4386002400 cites W3214312641 @default.
- W4386002400 cites W4200301490 @default.
- W4386002400 cites W4295419405 @default.
- W4386002400 cites W4296334467 @default.
- W4386002400 cites W4308737504 @default.
- W4386002400 cites W4309631284 @default.
- W4386002400 cites W4311901928 @default.
- W4386002400 cites W4313051036 @default.
- W4386002400 doi "https://doi.org/10.1016/j.patcog.2023.109897" @default.
- W4386002400 hasPublicationYear "2024" @default.
- W4386002400 type Work @default.
- W4386002400 citedByCount "0" @default.
- W4386002400 crossrefType "journal-article" @default.
- W4386002400 hasAuthorship W4386002400A5008562687 @default.
- W4386002400 hasAuthorship W4386002400A5034237194 @default.
- W4386002400 hasAuthorship W4386002400A5063678544 @default.
- W4386002400 hasAuthorship W4386002400A5078615880 @default.
- W4386002400 hasAuthorship W4386002400A5079119503 @default.
- W4386002400 hasAuthorship W4386002400A5079336892 @default.
- W4386002400 hasConcept C108583219 @default.
- W4386002400 hasConcept C11413529 @default.
- W4386002400 hasConcept C115961682 @default.
- W4386002400 hasConcept C11727466 @default.
- W4386002400 hasConcept C119599485 @default.
- W4386002400 hasConcept C119857082 @default.
- W4386002400 hasConcept C127413603 @default.
- W4386002400 hasConcept C153180895 @default.
- W4386002400 hasConcept C154945302 @default.
- W4386002400 hasConcept C165801399 @default.
- W4386002400 hasConcept C179799912 @default.
- W4386002400 hasConcept C188441871 @default.
- W4386002400 hasConcept C34736171 @default.
- W4386002400 hasConcept C41008148 @default.
- W4386002400 hasConcept C66322947 @default.
- W4386002400 hasConceptScore W4386002400C108583219 @default.
- W4386002400 hasConceptScore W4386002400C11413529 @default.
- W4386002400 hasConceptScore W4386002400C115961682 @default.
- W4386002400 hasConceptScore W4386002400C11727466 @default.
- W4386002400 hasConceptScore W4386002400C119599485 @default.
- W4386002400 hasConceptScore W4386002400C119857082 @default.
- W4386002400 hasConceptScore W4386002400C127413603 @default.
- W4386002400 hasConceptScore W4386002400C153180895 @default.
- W4386002400 hasConceptScore W4386002400C154945302 @default.
- W4386002400 hasConceptScore W4386002400C165801399 @default.
- W4386002400 hasConceptScore W4386002400C179799912 @default.
- W4386002400 hasConceptScore W4386002400C188441871 @default.
- W4386002400 hasConceptScore W4386002400C34736171 @default.
- W4386002400 hasConceptScore W4386002400C41008148 @default.
- W4386002400 hasConceptScore W4386002400C66322947 @default.
- W4386002400 hasFunder F4320321001 @default.
- W4386002400 hasFunder F4320321543 @default.
- W4386002400 hasFunder F4320335777 @default.
- W4386002400 hasLocation W43860024001 @default.
- W4386002400 hasOpenAccess W4386002400 @default.
- W4386002400 hasPrimaryLocation W43860024001 @default.
- W4386002400 hasRelatedWork W1504109132 @default.
- W4386002400 hasRelatedWork W2117562399 @default.
- W4386002400 hasRelatedWork W2213520135 @default.
- W4386002400 hasRelatedWork W2380775572 @default.
- W4386002400 hasRelatedWork W2894954915 @default.
- W4386002400 hasRelatedWork W2980176872 @default.
- W4386002400 hasRelatedWork W3107204728 @default.
- W4386002400 hasRelatedWork W3134074939 @default.
- W4386002400 hasRelatedWork W4226420367 @default.
- W4386002400 hasRelatedWork W4287591324 @default.
- W4386002400 hasVolume "145" @default.
- W4386002400 isParatext "false" @default.
- W4386002400 isRetracted "false" @default.
- W4386002400 workType "article" @default.
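The abstract above outlines two ideas behind the Spa-former block: attention computed across channels, so the attention map is C x C and the cost grows linearly with the number of pixels, and ReLU in place of softmax, so negative scores are zeroed and the map becomes sparse. The sketch below is an illustrative reconstruction of that description only, not the authors' implementation; the module name, head count, normalization, and 1x1-convolution projections are all assumptions.

```python
# Illustrative sketch reconstructed from the abstract's description of
# channel-wise attention with ReLU replacing softmax. NOT the authors'
# Spa-former code; every name and shape here is an assumption.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseChannelAttention(nn.Module):
    """Attention over channels: the attention map is (C/heads) x (C/heads),
    so its size is independent of image resolution."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.num_heads = num_heads
        self.qkv = nn.Conv2d(dim, dim * 3, kernel_size=1, bias=False)
        self.proj = nn.Conv2d(dim, dim, kernel_size=1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=1)

        # Reshape to (batch, heads, channels_per_head, pixels).
        def split(t: torch.Tensor) -> torch.Tensor:
            return t.reshape(b, self.num_heads, c // self.num_heads, h * w)

        q, k, v = split(q), split(k), split(v)
        q = F.normalize(q, dim=-1)
        k = F.normalize(k, dim=-1)

        # Channel-to-channel similarity; cost is linear in the pixel count h*w.
        attn = q @ k.transpose(-2, -1)
        # ReLU instead of softmax: negative scores are zeroed (sparse map),
        # then rows are renormalized to sum to one.
        attn = torch.relu(attn)
        attn = attn / (attn.sum(dim=-1, keepdim=True) + 1e-6)

        out = (attn @ v).reshape(b, c, h, w)
        return self.proj(out)


if __name__ == "__main__":
    layer = SparseChannelAttention(dim=32, num_heads=4)
    y = layer(torch.randn(1, 32, 64, 64))
    print(y.shape)  # torch.Size([1, 32, 64, 64])
```

Because the (C/heads) x (C/heads) attention map does not grow with image size, the matrix products above scale linearly with the number of pixels, which is the linear-complexity property the abstract claims for the channel attention approximation.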