Matches in SemOpenAlex for { <https://semopenalex.org/work/W4387047960> ?p ?o ?g. }
- W4387047960 endingPage "1952" @default.
- W4387047960 startingPage "1952" @default.
- W4387047960 abstract "Aimed at addressing deficiencies in existing image fusion methods, this paper proposed a multi-level and multi-classification generative adversarial network (GAN)-based method (MMGAN) for fusing visible and infrared images of forest fire scenes (the surroundings of firefighters), which solves the problem that GANs tend to ignore visible contrast ratio information and detailed infrared texture information. The study was based on real-time visible and infrared image data acquired by visible and infrared binocular cameras on forest firefighters’ helmets. We improved the GAN by, on the one hand, splitting the input channels of the generator into gradient and contrast ratio paths, increasing the depth of convolutional layers, and improving the extraction capability of shallow networks. On the other hand, we designed a discriminator using a multi-classification constraint structure and trained it against the generator in a continuous and adversarial manner to supervise the generator, generating better-quality fused images. Our results indicated that compared to mainstream infrared and visible image fusion methods, including anisotropic diffusion fusion (ADF), guided filtering fusion (GFF), convolutional neural networks (CNN), FusionGAN, and dual-discriminator conditional GAN (DDcGAN), the MMGAN model was overall optimal and had the best visual effect when applied to image fusions of forest fire surroundings. Five of the six objective metrics were optimal, and one ranked second-to-optimal. The image fusion speed was more than five times faster than that of the other methods. The MMGAN model significantly improved the quality of fused images of forest fire scenes, preserved the contrast ratio information of visible images and the detailed texture information of infrared images of forest fire scenes, and could accurately reflect information on forest fire scene surroundings." @default.
- W4387047960 created "2023-09-27" @default.
- W4387047960 creator A5014995589 @default.
- W4387047960 creator A5045732151 @default.
- W4387047960 creator A5046885612 @default.
- W4387047960 creator A5055035696 @default.
- W4387047960 creator A5065570364 @default.
- W4387047960 creator A5068944442 @default.
- W4387047960 creator A5086997643 @default.
- W4387047960 date "2023-09-26" @default.
- W4387047960 modified "2023-10-18" @default.
- W4387047960 title "Visible and Infrared Image Fusion of Forest Fire Scenes Based on Generative Adversarial Networks with Multi-Classification and Multi-Level Constraints" @default.
- W4387047960 cites W1964641132 @default.
- W4387047960 cites W1997596006 @default.
- W4387047960 cites W2023353513 @default.
- W4387047960 cites W2046404559 @default.
- W4387047960 cites W2048661006 @default.
- W4387047960 cites W2119605622 @default.
- W4387047960 cites W2133665775 @default.
- W4387047960 cites W2146353910 @default.
- W4387047960 cites W2209762097 @default.
- W4387047960 cites W2536878625 @default.
- W4387047960 cites W2559870345 @default.
- W4387047960 cites W2757470902 @default.
- W4387047960 cites W2772136803 @default.
- W4387047960 cites W2779235975 @default.
- W4387047960 cites W2794067776 @default.
- W4387047960 cites W2809795042 @default.
- W4387047960 cites W2889126450 @default.
- W4387047960 cites W2912147220 @default.
- W4387047960 cites W3011768656 @default.
- W4387047960 cites W3087590466 @default.
- W4387047960 cites W3134049832 @default.
- W4387047960 cites W3143068962 @default.
- W4387047960 cites W4200027911 @default.
- W4387047960 cites W4206552868 @default.
- W4387047960 cites W4224629847 @default.
- W4387047960 cites W4366212872 @default.
- W4387047960 doi "https://doi.org/10.3390/f14101952" @default.
- W4387047960 hasPublicationYear "2023" @default.
- W4387047960 type Work @default.
- W4387047960 citedByCount "0" @default.
- W4387047960 crossrefType "journal-article" @default.
- W4387047960 hasAuthorship W4387047960A5014995589 @default.
- W4387047960 hasAuthorship W4387047960A5045732151 @default.
- W4387047960 hasAuthorship W4387047960A5046885612 @default.
- W4387047960 hasAuthorship W4387047960A5055035696 @default.
- W4387047960 hasAuthorship W4387047960A5065570364 @default.
- W4387047960 hasAuthorship W4387047960A5068944442 @default.
- W4387047960 hasAuthorship W4387047960A5086997643 @default.
- W4387047960 hasBestOaLocation W43870479601 @default.
- W4387047960 hasConcept C115961682 @default.
- W4387047960 hasConcept C120665830 @default.
- W4387047960 hasConcept C121332964 @default.
- W4387047960 hasConcept C138885662 @default.
- W4387047960 hasConcept C153180895 @default.
- W4387047960 hasConcept C154945302 @default.
- W4387047960 hasConcept C158355884 @default.
- W4387047960 hasConcept C158525013 @default.
- W4387047960 hasConcept C163258240 @default.
- W4387047960 hasConcept C205649164 @default.
- W4387047960 hasConcept C2779803651 @default.
- W4387047960 hasConcept C2780992000 @default.
- W4387047960 hasConcept C31972630 @default.
- W4387047960 hasConcept C41008148 @default.
- W4387047960 hasConcept C41895202 @default.
- W4387047960 hasConcept C62520636 @default.
- W4387047960 hasConcept C62649853 @default.
- W4387047960 hasConcept C69744172 @default.
- W4387047960 hasConcept C76155785 @default.
- W4387047960 hasConcept C81363708 @default.
- W4387047960 hasConcept C94915269 @default.
- W4387047960 hasConceptScore W4387047960C115961682 @default.
- W4387047960 hasConceptScore W4387047960C120665830 @default.
- W4387047960 hasConceptScore W4387047960C121332964 @default.
- W4387047960 hasConceptScore W4387047960C138885662 @default.
- W4387047960 hasConceptScore W4387047960C153180895 @default.
- W4387047960 hasConceptScore W4387047960C154945302 @default.
- W4387047960 hasConceptScore W4387047960C158355884 @default.
- W4387047960 hasConceptScore W4387047960C158525013 @default.
- W4387047960 hasConceptScore W4387047960C163258240 @default.
- W4387047960 hasConceptScore W4387047960C205649164 @default.
- W4387047960 hasConceptScore W4387047960C2779803651 @default.
- W4387047960 hasConceptScore W4387047960C2780992000 @default.
- W4387047960 hasConceptScore W4387047960C31972630 @default.
- W4387047960 hasConceptScore W4387047960C41008148 @default.
- W4387047960 hasConceptScore W4387047960C41895202 @default.
- W4387047960 hasConceptScore W4387047960C62520636 @default.
- W4387047960 hasConceptScore W4387047960C62649853 @default.
- W4387047960 hasConceptScore W4387047960C69744172 @default.
- W4387047960 hasConceptScore W4387047960C76155785 @default.
- W4387047960 hasConceptScore W4387047960C81363708 @default.
- W4387047960 hasConceptScore W4387047960C94915269 @default.
- W4387047960 hasIssue "10" @default.
- W4387047960 hasLocation W43870479601 @default.
- W4387047960 hasOpenAccess W4387047960 @default.
- W4387047960 hasPrimaryLocation W43870479601 @default.
- W4387047960 hasRelatedWork W2419576664 @default.
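The listing above is the result of the basic graph pattern `{ <https://semopenalex.org/work/W4387047960> ?p ?o ?g. }`. A minimal sketch of reproducing such a query in Python follows, assuming SemOpenAlex exposes a standard SPARQL 1.1 protocol endpoint (the endpoint URL `https://semopenalex.org/sparql` is an assumption; check the service documentation before relying on it):

```python
# Sketch: fetch all predicate/object pairs for a SemOpenAlex work entity
# via a SPARQL endpoint. Uses only the standard library.
import json
import urllib.parse
import urllib.request

WORK_URI = "https://semopenalex.org/work/W4387047960"


def build_query(work_uri: str) -> str:
    """Build the same basic graph pattern shown in the listing:
    every ?p ?o pair for the given work."""
    return f"SELECT ?p ?o WHERE {{ <{work_uri}> ?p ?o . }}"


def fetch_triples(work_uri: str,
                  endpoint: str = "https://semopenalex.org/sparql"):
    """Send the query and return the parsed JSON result bindings.
    NOTE: the endpoint URL is an assumption for illustration; this
    call requires network access."""
    params = urllib.parse.urlencode({"query": build_query(work_uri)})
    req = urllib.request.Request(
        f"{endpoint}?{params}",
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["results"]["bindings"]


if __name__ == "__main__":
    # Print the query; call fetch_triples(WORK_URI) to actually run it.
    print(build_query(WORK_URI))
```

Each returned binding would correspond to one line of the listing above (e.g. predicate `cites` with object `W1964641132`), with the `@default.` suffix denoting the default graph.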