Matches in SemOpenAlex for { <https://semopenalex.org/work/W3140469881> ?p ?o ?g. }
- W3140469881 endingPage "975" @default.
- W3140469881 startingPage "962" @default.
- W3140469881 abstract "High-quality magnetic resonance (MR) images afford more detailed information for reliable diagnoses and quantitative image analyses. Given low-resolution (LR) images, the deep convolutional neural network (CNN) has shown its promising ability for image super-resolution (SR). LR MR images usually share some visual characteristics: structural textures of different sizes, edges with high correlation, and a less informative background. Multi-scale structural features are informative for image reconstruction, while the background is smoother. Most previous CNN-based SR methods use a single receptive field and treat all spatial pixels equally (including the background), failing to sense the entire space and extract diversified features from the input, which is critical for high-quality MR image SR. To address these problems, we propose a wide weighted attention multi-scale network (W²AMSN) for accurate MR image SR. On the one hand, features of varying sizes can be extracted by the wide multi-scale branches. On the other hand, we design a non-reduction attention mechanism to recalibrate feature responses adaptively. Such attention preserves continuous cross-channel interaction and focuses on more informative regions. Meanwhile, learnable weighted factors fuse the extracted features selectively. The encapsulated wide weighted attention multi-scale block (W²AMSB) is integrated through a recurrent framework and a global attention mechanism. Extensive experiments and diversified ablation studies show the effectiveness of our proposed W²AMSN, which surpasses state-of-the-art methods on the most popular MR image SR benchmarks quantitatively and qualitatively. Our method also offers superior accuracy and adaptability on real MR images." @default.
- W3140469881 created "2021-04-13" @default.
- W3140469881 creator A5020093817 @default.
- W3140469881 creator A5028229824 @default.
- W3140469881 creator A5059112338 @default.
- W3140469881 creator A5074865219 @default.
- W3140469881 date "2022-03-01" @default.
- W3140469881 modified "2023-10-16" @default.
- W3140469881 title "Wide Weighted Attention Multi-Scale Network for Accurate MR Image Super-Resolution" @default.
- W3140469881 cites W1885185971 @default.
- W3140469881 cites W1965378579 @default.
- W3140469881 cites W1982029046 @default.
- W3140469881 cites W2003863798 @default.
- W3140469881 cites W2058523468 @default.
- W3140469881 cites W2074123321 @default.
- W3140469881 cites W2087380704 @default.
- W3140469881 cites W2088254198 @default.
- W3140469881 cites W2097074225 @default.
- W3140469881 cites W2097117768 @default.
- W3140469881 cites W2128272608 @default.
- W3140469881 cites W2133665775 @default.
- W3140469881 cites W2134584543 @default.
- W3140469881 cites W2214802144 @default.
- W3140469881 cites W2242218935 @default.
- W3140469881 cites W2295107390 @default.
- W3140469881 cites W2476548250 @default.
- W3140469881 cites W2498789492 @default.
- W3140469881 cites W2526558307 @default.
- W3140469881 cites W2549139847 @default.
- W3140469881 cites W2557668825 @default.
- W3140469881 cites W2565639579 @default.
- W3140469881 cites W2709402577 @default.
- W3140469881 cites W2747898905 @default.
- W3140469881 cites W2752782242 @default.
- W3140469881 cites W2768489488 @default.
- W3140469881 cites W2809226111 @default.
- W3140469881 cites W2954930822 @default.
- W3140469881 cites W2963125010 @default.
- W3140469881 cites W2963163009 @default.
- W3140469881 cites W2963372104 @default.
- W3140469881 cites W2963671574 @default.
- W3140469881 cites W2963986095 @default.
- W3140469881 cites W2964101377 @default.
- W3140469881 cites W2964297772 @default.
- W3140469881 cites W2971026149 @default.
- W3140469881 cites W2988452521 @default.
- W3140469881 cites W3018586778 @default.
- W3140469881 cites W3098848838 @default.
- W3140469881 cites W3101162162 @default.
- W3140469881 doi "https://doi.org/10.1109/tcsvt.2021.3070489" @default.
- W3140469881 hasPublicationYear "2022" @default.
- W3140469881 type Work @default.
- W3140469881 sameAs 3140469881 @default.
- W3140469881 citedByCount "9" @default.
- W3140469881 countsByYear W31404698812022 @default.
- W3140469881 countsByYear W31404698812023 @default.
- W3140469881 crossrefType "journal-article" @default.
- W3140469881 hasAuthorship W3140469881A5020093817 @default.
- W3140469881 hasAuthorship W3140469881A5028229824 @default.
- W3140469881 hasAuthorship W3140469881A5059112338 @default.
- W3140469881 hasAuthorship W3140469881A5074865219 @default.
- W3140469881 hasConcept C11413529 @default.
- W3140469881 hasConcept C114614502 @default.
- W3140469881 hasConcept C115961682 @default.
- W3140469881 hasConcept C121332964 @default.
- W3140469881 hasConcept C138885662 @default.
- W3140469881 hasConcept C153180895 @default.
- W3140469881 hasConcept C154945302 @default.
- W3140469881 hasConcept C160633673 @default.
- W3140469881 hasConcept C205372480 @default.
- W3140469881 hasConcept C2776401178 @default.
- W3140469881 hasConcept C2777210771 @default.
- W3140469881 hasConcept C2778755073 @default.
- W3140469881 hasConcept C33923547 @default.
- W3140469881 hasConcept C41008148 @default.
- W3140469881 hasConcept C41895202 @default.
- W3140469881 hasConcept C55020928 @default.
- W3140469881 hasConcept C62520636 @default.
- W3140469881 hasConcept C81363708 @default.
- W3140469881 hasConceptScore W3140469881C11413529 @default.
- W3140469881 hasConceptScore W3140469881C114614502 @default.
- W3140469881 hasConceptScore W3140469881C115961682 @default.
- W3140469881 hasConceptScore W3140469881C121332964 @default.
- W3140469881 hasConceptScore W3140469881C138885662 @default.
- W3140469881 hasConceptScore W3140469881C153180895 @default.
- W3140469881 hasConceptScore W3140469881C154945302 @default.
- W3140469881 hasConceptScore W3140469881C160633673 @default.
- W3140469881 hasConceptScore W3140469881C205372480 @default.
- W3140469881 hasConceptScore W3140469881C2776401178 @default.
- W3140469881 hasConceptScore W3140469881C2777210771 @default.
- W3140469881 hasConceptScore W3140469881C2778755073 @default.
- W3140469881 hasConceptScore W3140469881C33923547 @default.
- W3140469881 hasConceptScore W3140469881C41008148 @default.
- W3140469881 hasConceptScore W3140469881C41895202 @default.
- W3140469881 hasConceptScore W3140469881C55020928 @default.
- W3140469881 hasConceptScore W3140469881C62520636 @default.
- W3140469881 hasConceptScore W3140469881C81363708 @default.
- W3140469881 hasFunder F4320321001 @default.
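The triple pattern at the top of this listing can be issued programmatically. Below is a minimal Python sketch that builds that `?p ?o` query for a given work ID and fetches the matching predicate/object pairs; the endpoint URL (`https://semopenalex.org/sparql`) and the SPARQL JSON results handling are assumptions about the public service and may need adjusting.

```python
# Sketch: retrieve all predicate/object pairs for a SemOpenAlex work
# via a SPARQL endpoint. Endpoint URL and response format are assumed.
import json
import urllib.parse
import urllib.request

SPARQL_ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint

def build_work_query(work_id: str) -> str:
    """Build the triple-pattern query shown above for a given work ID."""
    iri = f"https://semopenalex.org/work/{work_id}"
    return f"SELECT ?p ?o WHERE {{ <{iri}> ?p ?o . }}"

def fetch_work_triples(work_id: str):
    """POST the query and return (predicate, object) value pairs."""
    query = build_work_query(work_id)
    data = urllib.parse.urlencode({"query": query}).encode()
    req = urllib.request.Request(
        SPARQL_ENDPOINT,
        data=data,
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        results = json.load(resp)
    return [
        (b["p"]["value"], b["o"]["value"])
        for b in results["results"]["bindings"]
    ]

if __name__ == "__main__":
    # Print the query for the work in this listing without hitting the network.
    print(build_work_query("W3140469881"))
```

Calling `fetch_work_triples("W3140469881")` would return pairs corresponding to the `cites`, `hasConcept`, `hasAuthorship`, etc. lines above, assuming the endpoint behaves as sketched.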