Matches in SemOpenAlex for { <https://semopenalex.org/work/W4296841186> ?p ?o ?g. }
- W4296841186 endingPage "4950" @default.
- W4296841186 startingPage "4911" @default.
- W4296841186 abstract "ABSTRACT: Pansharpening is an important task in remote sensing image fusion: it fuses low-spatial-resolution multispectral (MS) images with high-spatial-resolution panchromatic (PAN) images to obtain high-resolution multispectral (HRMS) images. Existing deep-learning-based pan-sharpening methods achieve better results than traditional methods, but two problems remain: spectral distortion and loss of spatial detail information. We propose an end-to-end attention-based dual-residual multi-stage remote sensing image fusion network (ADRPN) for MS and PAN image pan-sharpening, based on an in-depth study of the spectral information of MS images and the spatial information of PAN images. The network consists of three main stages, each resembling an encoder-decoder, with cross-stage fusion for channel-domain feature stitching. The first two stages perform feature extraction: features of the MS and PAN images are extracted with residual modules, and the different features learned guide the training of the individual networks through a cross residual feature block (CRFB). In the third stage (image reconstruction), coordinate attention (CA) and squeeze-and-excitation (SE) attention mechanisms let the network locate regions of interest more precisely from the positional information obtained, so that the features extracted in the first two stages fuse better with the original images, reducing spectral distortion and loss of spatial detail information. 
Qualitative and quantitative analyses of real and simulated data from the benchmark datasets QuickBird (QB), GF-2, and WorldView-2 (WV2) show that the method better preserves spectral and spatial detail information and obtains high-quality HRMS images. KEYWORDS: Pansharpening; multispectral (MS) images; panchromatic (PAN) images; residual block; encoder-decoder; attention; CRFB. Disclosure statement: The authors declare that no potential competing interests exist. There is no undisclosed relationship or undisclosed funding source that may pose a competing interest. Additional information. Funding: This research was supported by the National Natural Science Foundation of China (61772319, 62002200, 61972235 and 12001327), the Shandong Natural Science Foundation of China (ZR2021QF134 and ZR2021MF068), and the Yantai science and technology innovation development plan (2022JCYJ031)" @default.
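The SE (squeeze-and-excitation) attention named in the abstract is a standard channel-reweighting mechanism; a minimal NumPy sketch is below. The function name, weight shapes, and channel-first layout are illustrative assumptions for a single feature map, not the paper's ADRPN implementation.

```python
import numpy as np

def squeeze_excite(feat, w1, w2):
    """Apply squeeze-and-excitation channel reweighting to one feature map.

    feat: (C, H, W) feature map
    w1:   (C//r, C) weight of the channel-reducing FC layer (r = reduction ratio)
    w2:   (C, C//r) weight of the channel-expanding FC layer
    """
    z = feat.mean(axis=(1, 2))            # squeeze: global average pool -> (C,)
    s = np.maximum(w1 @ z, 0.0)           # excitation: FC + ReLU -> (C//r,)
    g = 1.0 / (1.0 + np.exp(-(w2 @ s)))   # FC + sigmoid gate in (0, 1) -> (C,)
    return feat * g[:, None, None]        # scale each channel by its gate

# Usage with random weights (reduction ratio r = 4)
rng = np.random.default_rng(0)
feat = rng.normal(size=(8, 4, 4))
out = squeeze_excite(feat, rng.normal(size=(2, 8)), rng.normal(size=(8, 2)))
```

Because the sigmoid gate lies strictly in (0, 1), each channel is attenuated rather than amplified; informative channels receive gates near 1 and keep most of their signal.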
- W4296841186 created "2022-09-24" @default.
- W4296841186 creator A5028836596 @default.
- W4296841186 creator A5046173951 @default.
- W4296841186 creator A5078116487 @default.
- W4296841186 date "2022-07-03" @default.
- W4296841186 modified "2023-10-10" @default.
- W4296841186 title "Attention-based dual residual network-based for multi-spectral pan-sharpening" @default.
- W4296841186 cites W1885185971 @default.
- W4296841186 cites W1901129140 @default.
- W4296841186 cites W1980110630 @default.
- W4296841186 cites W2000323021 @default.
- W4296841186 cites W2111924917 @default.
- W4296841186 cites W2113338111 @default.
- W4296841186 cites W2144436897 @default.
- W4296841186 cites W2149720806 @default.
- W4296841186 cites W2171211028 @default.
- W4296841186 cites W2194775991 @default.
- W4296841186 cites W2249464603 @default.
- W4296841186 cites W2414425402 @default.
- W4296841186 cites W2462592242 @default.
- W4296841186 cites W2474240800 @default.
- W4296841186 cites W2486820882 @default.
- W4296841186 cites W2606366933 @default.
- W4296841186 cites W2618530766 @default.
- W4296841186 cites W2619662254 @default.
- W4296841186 cites W2737219040 @default.
- W4296841186 cites W2752782242 @default.
- W4296841186 cites W2777033955 @default.
- W4296841186 cites W2782522152 @default.
- W4296841186 cites W2792217524 @default.
- W4296841186 cites W2884585870 @default.
- W4296841186 cites W2899594121 @default.
- W4296841186 cites W2921660688 @default.
- W4296841186 cites W2953299370 @default.
- W4296841186 cites W2955058313 @default.
- W4296841186 cites W2963091558 @default.
- W4296841186 cites W2963183385 @default.
- W4296841186 cites W2963800716 @default.
- W4296841186 cites W2964275574 @default.
- W4296841186 cites W2982220924 @default.
- W4296841186 cites W3009431086 @default.
- W4296841186 cites W3013082057 @default.
- W4296841186 cites W3023991509 @default.
- W4296841186 cites W3034502973 @default.
- W4296841186 cites W3034752215 @default.
- W4296841186 cites W3035731588 @default.
- W4296841186 cites W3081217221 @default.
- W4296841186 cites W3090974769 @default.
- W4296841186 cites W3097824737 @default.
- W4296841186 cites W3098542449 @default.
- W4296841186 cites W3099258777 @default.
- W4296841186 cites W3106250896 @default.
- W4296841186 cites W3115223653 @default.
- W4296841186 cites W3133388191 @default.
- W4296841186 cites W3176997885 @default.
- W4296841186 cites W3177052299 @default.
- W4296841186 cites W4226507018 @default.
- W4296841186 cites W4285529816 @default.
- W4296841186 doi "https://doi.org/10.1080/01431161.2022.2122896" @default.
- W4296841186 hasPublicationYear "2022" @default.
- W4296841186 type Work @default.
- W4296841186 citedByCount "0" @default.
- W4296841186 crossrefType "journal-article" @default.
- W4296841186 hasAuthorship W4296841186A5028836596 @default.
- W4296841186 hasAuthorship W4296841186A5046173951 @default.
- W4296841186 hasAuthorship W4296841186A5078116487 @default.
- W4296841186 hasConcept C11413529 @default.
- W4296841186 hasConcept C115961682 @default.
- W4296841186 hasConcept C126780896 @default.
- W4296841186 hasConcept C138885662 @default.
- W4296841186 hasConcept C153180895 @default.
- W4296841186 hasConcept C154945302 @default.
- W4296841186 hasConcept C155512373 @default.
- W4296841186 hasConcept C173163844 @default.
- W4296841186 hasConcept C194257627 @default.
- W4296841186 hasConcept C205372480 @default.
- W4296841186 hasConcept C205649164 @default.
- W4296841186 hasConcept C2776257435 @default.
- W4296841186 hasConcept C2776401178 @default.
- W4296841186 hasConcept C2781137444 @default.
- W4296841186 hasConcept C29081049 @default.
- W4296841186 hasConcept C31258907 @default.
- W4296841186 hasConcept C31972630 @default.
- W4296841186 hasConcept C41008148 @default.
- W4296841186 hasConcept C41895202 @default.
- W4296841186 hasConcept C52622490 @default.
- W4296841186 hasConcept C62649853 @default.
- W4296841186 hasConcept C69744172 @default.
- W4296841186 hasConceptScore W4296841186C11413529 @default.
- W4296841186 hasConceptScore W4296841186C115961682 @default.
- W4296841186 hasConceptScore W4296841186C126780896 @default.
- W4296841186 hasConceptScore W4296841186C138885662 @default.
- W4296841186 hasConceptScore W4296841186C153180895 @default.
- W4296841186 hasConceptScore W4296841186C154945302 @default.
- W4296841186 hasConceptScore W4296841186C155512373 @default.
- W4296841186 hasConceptScore W4296841186C173163844 @default.
- W4296841186 hasConceptScore W4296841186C194257627 @default.