Matches in SemOpenAlex for { <https://semopenalex.org/work/W3208126581> ?p ?o ?g. }
- W3208126581 endingPage "103947" @default.
- W3208126581 startingPage "103947" @default.
- W3208126581 abstract "Infrared and visible image fusion is an essential task in multi-sensor image fusion. Generative adversarial networks (GANs) have achieved remarkable performance in fusing infrared and visible images. Existing GAN-based fusion methods use only the infrared and visible images as input, whereas we found that differential images, obtained by subtracting one source image from the other, can provide contrast information for the fusion. To this end, this paper proposes a novel dual fusion path generative adversarial network (DFPGAN) for infrared and visible image fusion. We divide the generator into two fusion paths: an infrared–visible path and a differential path. The infrared–visible path takes the concatenated source images as input, so that infrared intensity and texture details are fused in a balanced manner; the differential path takes the concatenated differential images as input, so that contrast information is fused. The features extracted by the two paths are concatenated at the end of the generator to produce fused images with clear contrast and a balanced information distribution. Meanwhile, a dual self-attention feature refinement module (DSAM) is applied to both fusion paths to refine their feature maps. We replace batch normalization (BN) layers with switchable normalization (SN) layers in the generator and discriminator to avoid fusion artifacts. Furthermore, a mixed content loss is integrated into the generator loss function to guide the generated image toward a balanced information distribution while preserving contrast. The adversarial training employs a dual-adversarial architecture to balance the distribution of infrared intensity and texture details. To verify the benefit of the fused images for target detection, we introduce the Scaled-YOLOv4 detection framework for evaluation and use the proposed network to fuse RGB and infrared images for target detection. Qualitative and quantitative experiments on public datasets demonstrate the superiority of the proposed network over other state-of-the-art methods and its ability to generate fused images with distinct contrast." @default.
- W3208126581 created "2021-11-08" @default.
- W3208126581 creator A5012604520 @default.
- W3208126581 creator A5068935408 @default.
- W3208126581 creator A5072252134 @default.
- W3208126581 date "2021-12-01" @default.
- W3208126581 modified "2023-10-14" @default.
- W3208126581 title "DFPGAN: Dual fusion path generative adversarial network for infrared and visible image fusion" @default.
- W3208126581 cites W1965739998 @default.
- W3208126581 cites W1990250903 @default.
- W3208126581 cites W2040833130 @default.
- W3208126581 cites W2054273865 @default.
- W3208126581 cites W2059193044 @default.
- W3208126581 cites W2094162745 @default.
- W3208126581 cites W2113494422 @default.
- W3208126581 cites W2133135191 @default.
- W3208126581 cites W2143696753 @default.
- W3208126581 cites W2146353910 @default.
- W3208126581 cites W2266694576 @default.
- W3208126581 cites W2589745805 @default.
- W3208126581 cites W2610070095 @default.
- W3208126581 cites W2735436330 @default.
- W3208126581 cites W2736883612 @default.
- W3208126581 cites W2765838470 @default.
- W3208126581 cites W2798018774 @default.
- W3208126581 cites W2809795042 @default.
- W3208126581 cites W2884585870 @default.
- W3208126581 cites W2912126472 @default.
- W3208126581 cites W2912147220 @default.
- W3208126581 cites W2960405182 @default.
- W3208126581 cites W2963134949 @default.
- W3208126581 cites W2991289865 @default.
- W3208126581 cites W2997019934 @default.
- W3208126581 cites W3016744618 @default.
- W3208126581 cites W3030921250 @default.
- W3208126581 cites W3034261282 @default.
- W3208126581 cites W3035607887 @default.
- W3208126581 cites W3102411220 @default.
- W3208126581 cites W3105639468 @default.
- W3208126581 cites W3143068962 @default.
- W3208126581 cites W3180134609 @default.
- W3208126581 doi "https://doi.org/10.1016/j.infrared.2021.103947" @default.
- W3208126581 hasPublicationYear "2021" @default.
- W3208126581 type Work @default.
- W3208126581 sameAs 3208126581 @default.
- W3208126581 citedByCount "14" @default.
- W3208126581 countsByYear W32081265812022 @default.
- W3208126581 countsByYear W32081265812023 @default.
- W3208126581 crossrefType "journal-article" @default.
- W3208126581 hasAuthorship W3208126581A5012604520 @default.
- W3208126581 hasAuthorship W3208126581A5068935408 @default.
- W3208126581 hasAuthorship W3208126581A5072252134 @default.
- W3208126581 hasConcept C115961682 @default.
- W3208126581 hasConcept C121332964 @default.
- W3208126581 hasConcept C136886441 @default.
- W3208126581 hasConcept C138885662 @default.
- W3208126581 hasConcept C144024400 @default.
- W3208126581 hasConcept C153180895 @default.
- W3208126581 hasConcept C154945302 @default.
- W3208126581 hasConcept C158525013 @default.
- W3208126581 hasConcept C163258240 @default.
- W3208126581 hasConcept C19165224 @default.
- W3208126581 hasConcept C199360897 @default.
- W3208126581 hasConcept C2776401178 @default.
- W3208126581 hasConcept C2777735758 @default.
- W3208126581 hasConcept C2780992000 @default.
- W3208126581 hasConcept C31972630 @default.
- W3208126581 hasConcept C41008148 @default.
- W3208126581 hasConcept C41895202 @default.
- W3208126581 hasConcept C62520636 @default.
- W3208126581 hasConcept C69744172 @default.
- W3208126581 hasConceptScore W3208126581C115961682 @default.
- W3208126581 hasConceptScore W3208126581C121332964 @default.
- W3208126581 hasConceptScore W3208126581C136886441 @default.
- W3208126581 hasConceptScore W3208126581C138885662 @default.
- W3208126581 hasConceptScore W3208126581C144024400 @default.
- W3208126581 hasConceptScore W3208126581C153180895 @default.
- W3208126581 hasConceptScore W3208126581C154945302 @default.
- W3208126581 hasConceptScore W3208126581C158525013 @default.
- W3208126581 hasConceptScore W3208126581C163258240 @default.
- W3208126581 hasConceptScore W3208126581C19165224 @default.
- W3208126581 hasConceptScore W3208126581C199360897 @default.
- W3208126581 hasConceptScore W3208126581C2776401178 @default.
- W3208126581 hasConceptScore W3208126581C2777735758 @default.
- W3208126581 hasConceptScore W3208126581C2780992000 @default.
- W3208126581 hasConceptScore W3208126581C31972630 @default.
- W3208126581 hasConceptScore W3208126581C41008148 @default.
- W3208126581 hasConceptScore W3208126581C41895202 @default.
- W3208126581 hasConceptScore W3208126581C62520636 @default.
- W3208126581 hasConceptScore W3208126581C69744172 @default.
- W3208126581 hasLocation W32081265811 @default.
- W3208126581 hasOpenAccess W3208126581 @default.
- W3208126581 hasPrimaryLocation W32081265811 @default.
- W3208126581 hasRelatedWork W2057200091 @default.
- W3208126581 hasRelatedWork W2102340963 @default.
- W3208126581 hasRelatedWork W2364634124 @default.
- W3208126581 hasRelatedWork W2382607599 @default.
- W3208126581 hasRelatedWork W2419576664 @default.
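
The abstract in this record outlines DFPGAN's dual-path generator: one path consumes the concatenated infrared and visible images, the other consumes their differential images, and the two feature streams are concatenated before the output layer. Below is a minimal, illustrative PyTorch sketch of that input construction; the class name `DualPathGenerator`, layer widths, and activations are assumptions for illustration and not the authors' code, and the paper's DSAM attention module, switchable normalization, and dual discriminators are omitted for brevity.

```python
# Illustrative sketch of DFPGAN's dual-path input construction (assumed
# details; not the authors' implementation).
import torch
import torch.nn as nn


class DualPathGenerator(nn.Module):
    """Two fusion paths: an infrared-visible path fed with the concatenated
    source images, and a differential path fed with their signed differences.
    Features from both paths are concatenated before the output layers."""

    def __init__(self, channels: int = 32):
        super().__init__()
        # Each path takes a 2-channel input (single-channel IR + visible,
        # or the two signed differential images).
        self.iv_path = nn.Sequential(
            nn.Conv2d(2, channels, 3, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(channels, channels, 3, padding=1), nn.LeakyReLU(0.2),
        )
        self.diff_path = nn.Sequential(
            nn.Conv2d(2, channels, 3, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(channels, channels, 3, padding=1), nn.LeakyReLU(0.2),
        )
        # Fuse the concatenated features into a single-channel fused image.
        self.fuse = nn.Sequential(
            nn.Conv2d(2 * channels, channels, 3, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(channels, 1, 1), nn.Tanh(),
        )

    def forward(self, ir: torch.Tensor, vis: torch.Tensor) -> torch.Tensor:
        # Infrared-visible path: concatenate the two source images so that
        # intensity and texture details are fused in a balanced way.
        iv_in = torch.cat([ir, vis], dim=1)
        # Differential path: concatenate the two differential images
        # (ir - vis and vis - ir) to supply explicit contrast information.
        diff_in = torch.cat([ir - vis, vis - ir], dim=1)
        feats = torch.cat([self.iv_path(iv_in), self.diff_path(diff_in)], dim=1)
        return self.fuse(feats)


# Usage: fused = DualPathGenerator()(ir_batch, vis_batch), with both inputs
# of shape (N, 1, H, W) scaled to [-1, 1].
```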