Matches in SemOpenAlex for { <https://semopenalex.org/work/W4224256886> ?p ?o ?g. }
- W4224256886 endingPage "4781" @default.
- W4224256886 startingPage "4753" @default.
- W4224256886 abstract "Abstract For the past few years, image fusion technology has made great progress, especially in infrared and visible light image fusion. However, fusion methods based on traditional or deep learning techniques suffer from drawbacks such as indistinct structures or loss of texture detail. To address this, a novel generative adversarial network named MSAt-GAN is proposed in this paper. It is based on multi-scale feature transfer and a deep attention mechanism for feature fusion, and is applied to infrared and visible image fusion. First, this paper employs three different receptive fields to extract the multi-scale and multi-level deep features of multi-modality images in three channels, rather than artificially setting a single receptive field. In this way, the important features of the source images can be better captured from different receptive fields and perspectives, and the extracted feature representation is more flexible and diverse. Second, a multi-scale deep attention fusion mechanism is designed in this paper. It characterizes the salient representations of the features extracted at multiple receptive-field levels through both spatial and channel attention, and merges them according to their attention levels. This places greater emphasis on the attention feature maps and extracts the significant features of multi-modality images, which also suppresses noise to some extent. Third, the multi-level deep features in the encoder are concatenated with the deep features in the decoder to enhance feature transmission while making better use of earlier features. Finally, this paper adopts a dual-discriminator generative adversarial network structure, which forces the generated image to simultaneously retain the intensity information of the infrared image and the texture detail of the visible image.
Extensive qualitative and quantitative experiments on infrared and visible image pairs from three public datasets show that, compared with state-of-the-art fusion methods, the proposed MSAt-GAN achieves outstanding fusion performance in both subjective perception and objective quantitative measurement." @default.
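The attention-weighted fusion the abstract describes can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function names (`channel_attention`, `spatial_attention`, `attention_fuse`), the softmax/L1 choices, and the per-position merge rule are all illustrative assumptions, showing only the general idea of weighting each modality's feature maps by channel and spatial attention before merging.

```python
import numpy as np

def channel_attention(feat):
    # Global average pooling per channel, softmax-normalized (assumed form).
    w = feat.mean(axis=(1, 2))
    e = np.exp(w - w.max())
    return e / e.sum()

def spatial_attention(feat):
    # L1 norm across channels gives a per-pixel saliency map (assumed form).
    s = np.abs(feat).sum(axis=0)
    return s / (s.sum() + 1e-8)

def attention_fuse(feat_ir, feat_vis):
    # Score each modality by its own channel and spatial attention,
    # then merge by relative attention strength at every position.
    def score(f):
        ca = channel_attention(f)[:, None, None]   # shape (C, 1, 1)
        sa = spatial_attention(f)[None, :, :]      # shape (1, H, W)
        return f * ca * sa
    a, b = score(feat_ir), score(feat_vis)
    w = np.abs(a) / (np.abs(a) + np.abs(b) + 1e-8)  # weight in [0, 1)
    return w * feat_ir + (1 - w) * feat_vis

# Example: fuse two random 4-channel 8x8 feature maps.
rng = np.random.default_rng(0)
fused = attention_fuse(rng.standard_normal((4, 8, 8)),
                       rng.standard_normal((4, 8, 8)))
```

Because the merge weight lies in [0, 1), the fused value at each position is a convex combination of the two modalities' features, which is one simple way to realize the "merge according to attention level" step.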
- W4224256886 created "2022-04-26" @default.
- W4224256886 creator A5034551344 @default.
- W4224256886 creator A5063590181 @default.
- W4224256886 creator A5072820730 @default.
- W4224256886 creator A5084593505 @default.
- W4224256886 date "2022-04-22" @default.
- W4224256886 modified "2023-10-05" @default.
- W4224256886 title "MSAt-GAN: a generative adversarial network based on multi-scale and deep attention mechanism for infrared and visible light image fusion" @default.
- W4224256886 cites W1628236353 @default.
- W4224256886 cites W1677182931 @default.
- W4224256886 cites W1708141795 @default.
- W4224256886 cites W1980382026 @default.
- W4224256886 cites W2054273865 @default.
- W4224256886 cites W2114582993 @default.
- W4224256886 cites W2116702374 @default.
- W4224256886 cites W2183341477 @default.
- W4224256886 cites W2194775991 @default.
- W4224256886 cites W2266694576 @default.
- W4224256886 cites W2474462684 @default.
- W4224256886 cites W2589745805 @default.
- W4224256886 cites W2610070095 @default.
- W4224256886 cites W2624240493 @default.
- W4224256886 cites W2766065559 @default.
- W4224256886 cites W2766070280 @default.
- W4224256886 cites W2767512561 @default.
- W4224256886 cites W2772136803 @default.
- W4224256886 cites W2798018774 @default.
- W4224256886 cites W2809795042 @default.
- W4224256886 cites W2901100349 @default.
- W4224256886 cites W2911345285 @default.
- W4224256886 cites W2912147220 @default.
- W4224256886 cites W2933486688 @default.
- W4224256886 cites W2963446712 @default.
- W4224256886 cites W2963905288 @default.
- W4224256886 cites W2971071255 @default.
- W4224256886 cites W2971204923 @default.
- W4224256886 cites W2980342893 @default.
- W4224256886 cites W2992813042 @default.
- W4224256886 cites W2997240345 @default.
- W4224256886 cites W2998027361 @default.
- W4224256886 cites W3011768656 @default.
- W4224256886 cites W3016744618 @default.
- W4224256886 cites W3023203534 @default.
- W4224256886 cites W3030921250 @default.
- W4224256886 cites W3034681889 @default.
- W4224256886 cites W3034951775 @default.
- W4224256886 cites W3041991648 @default.
- W4224256886 cites W3090665430 @default.
- W4224256886 cites W3102411220 @default.
- W4224256886 cites W3105639468 @default.
- W4224256886 cites W3126855404 @default.
- W4224256886 cites W3138511503 @default.
- W4224256886 cites W3184569536 @default.
- W4224256886 doi "https://doi.org/10.1007/s40747-022-00722-9" @default.
- W4224256886 hasPublicationYear "2022" @default.
- W4224256886 type Work @default.
- W4224256886 citedByCount "13" @default.
- W4224256886 countsByYear W42242568862022 @default.
- W4224256886 countsByYear W42242568862023 @default.
- W4224256886 crossrefType "journal-article" @default.
- W4224256886 hasAuthorship W4224256886A5034551344 @default.
- W4224256886 hasAuthorship W4224256886A5063590181 @default.
- W4224256886 hasAuthorship W4224256886A5072820730 @default.
- W4224256886 hasAuthorship W4224256886A5084593505 @default.
- W4224256886 hasBestOaLocation W42242568861 @default.
- W4224256886 hasConcept C108583219 @default.
- W4224256886 hasConcept C111919701 @default.
- W4224256886 hasConcept C118505674 @default.
- W4224256886 hasConcept C138885662 @default.
- W4224256886 hasConcept C153180895 @default.
- W4224256886 hasConcept C154945302 @default.
- W4224256886 hasConcept C202444582 @default.
- W4224256886 hasConcept C2776401178 @default.
- W4224256886 hasConcept C2779803651 @default.
- W4224256886 hasConcept C31972630 @default.
- W4224256886 hasConcept C33923547 @default.
- W4224256886 hasConcept C41008148 @default.
- W4224256886 hasConcept C41895202 @default.
- W4224256886 hasConcept C52622490 @default.
- W4224256886 hasConcept C76155785 @default.
- W4224256886 hasConcept C94915269 @default.
- W4224256886 hasConcept C9652623 @default.
- W4224256886 hasConceptScore W4224256886C108583219 @default.
- W4224256886 hasConceptScore W4224256886C111919701 @default.
- W4224256886 hasConceptScore W4224256886C118505674 @default.
- W4224256886 hasConceptScore W4224256886C138885662 @default.
- W4224256886 hasConceptScore W4224256886C153180895 @default.
- W4224256886 hasConceptScore W4224256886C154945302 @default.
- W4224256886 hasConceptScore W4224256886C202444582 @default.
- W4224256886 hasConceptScore W4224256886C2776401178 @default.
- W4224256886 hasConceptScore W4224256886C2779803651 @default.
- W4224256886 hasConceptScore W4224256886C31972630 @default.
- W4224256886 hasConceptScore W4224256886C33923547 @default.
- W4224256886 hasConceptScore W4224256886C41008148 @default.
- W4224256886 hasConceptScore W4224256886C41895202 @default.
- W4224256886 hasConceptScore W4224256886C52622490 @default.
- W4224256886 hasConceptScore W4224256886C76155785 @default.