Matches in SemOpenAlex for { <https://semopenalex.org/work/W4292977151> ?p ?o ?g. }
- W4292977151 endingPage "116855" @default.
- W4292977151 startingPage "116855" @default.
- W4292977151 abstract "Underwater robots have broad applications in many fields such as ocean exploration, marine ranching, and environmental monitoring. However, due to the interference of light scattering and absorption, selective color attenuation, suspended particles, and other complex factors in the underwater environment, it is difficult for robot vision sensors to obtain high-quality underwater image signals, which is a bottleneck restricting the visual perception of underwater robots. In this paper, we propose a multi-scale fusion generative adversarial network named Fusion Water-GAN (FW-GAN) to enhance underwater image quality. The proposed model has four convolution branches: these branches refine the features of the three prior inputs and encode the original input, fuse the prior features using the proposed multi-scale fusion connections, and finally use a channel attention decoder to generate the enhanced results. We conduct qualitative and quantitative comparison experiments on real-world and synthetic distorted underwater image datasets under various degradation conditions. The results show that, compared with recent state-of-the-art underwater image enhancement methods, our proposed method achieves higher quantitative metric scores and better generalization capability. In addition, an ablation study demonstrates the contribution of each component. • We propose a multi-scale fusion generator network architecture. Based on an analysis of the underwater environment, an adaptive fusion strategy is proposed to fuse multi-source and multi-scale features, which can effectively correct color casts and haze in the image and improve its contrast; it also avoids blind enhancement of the image and improves the generalization capability of the model. • We propose a decoder combined with channel attention to compute the attention of the prior and decoded feature maps during fusion and adjust them adaptively. The aim is to learn the potential associations between the fused prior features and the enhanced results. • We conducted qualitative and quantitative evaluations and compared FW-GAN with traditional methods and state-of-the-art models. The results show that FW-GAN has good generalization capability and competitive performance. Finally, we conduct an ablation study to demonstrate the contribution of each core component in our network." @default.
- W4292977151 created "2022-08-24" @default.
- W4292977151 creator A5002358964 @default.
- W4292977151 creator A5019779405 @default.
- W4292977151 creator A5026131126 @default.
- W4292977151 creator A5048192527 @default.
- W4292977151 creator A5057893502 @default.
- W4292977151 creator A5085222650 @default.
- W4292977151 date "2022-11-01" @default.
- W4292977151 modified "2023-10-16" @default.
- W4292977151 title "FW-GAN: Underwater image enhancement using generative adversarial network with multi-scale fusion" @default.
- W4292977151 cites W1971693194 @default.
- W4292977151 cites W2009071067 @default.
- W4292977151 cites W2091420866 @default.
- W4292977151 cites W2107858703 @default.
- W4292977151 cites W2133665775 @default.
- W4292977151 cites W2151103935 @default.
- W4292977151 cites W2181646778 @default.
- W4292977151 cites W2194775991 @default.
- W4292977151 cites W2293581118 @default.
- W4292977151 cites W2474516010 @default.
- W4292977151 cites W2523532944 @default.
- W4292977151 cites W2617758397 @default.
- W4292977151 cites W2752782242 @default.
- W4292977151 cites W2763503841 @default.
- W4292977151 cites W2798807298 @default.
- W4292977151 cites W2866634454 @default.
- W4292977151 cites W2886075188 @default.
- W4292977151 cites W2963073614 @default.
- W4292977151 cites W2963182372 @default.
- W4292977151 cites W2966516593 @default.
- W4292977151 cites W2971483169 @default.
- W4292977151 cites W2990176100 @default.
- W4292977151 cites W2999811308 @default.
- W4292977151 cites W3006777311 @default.
- W4292977151 cites W3042930711 @default.
- W4292977151 cites W3094046575 @default.
- W4292977151 cites W3129035913 @default.
- W4292977151 cites W3153781409 @default.
- W4292977151 cites W3153844346 @default.
- W4292977151 cites W3159660641 @default.
- W4292977151 cites W3195744835 @default.
- W4292977151 cites W3201231642 @default.
- W4292977151 cites W4220767255 @default.
- W4292977151 doi "https://doi.org/10.1016/j.image.2022.116855" @default.
- W4292977151 hasPublicationYear "2022" @default.
- W4292977151 type Work @default.
- W4292977151 citedByCount "7" @default.
- W4292977151 countsByYear W42929771512022 @default.
- W4292977151 countsByYear W42929771512023 @default.
- W4292977151 crossrefType "journal-article" @default.
- W4292977151 hasAuthorship W4292977151A5002358964 @default.
- W4292977151 hasAuthorship W4292977151A5019779405 @default.
- W4292977151 hasAuthorship W4292977151A5026131126 @default.
- W4292977151 hasAuthorship W4292977151A5048192527 @default.
- W4292977151 hasAuthorship W4292977151A5057893502 @default.
- W4292977151 hasAuthorship W4292977151A5085222650 @default.
- W4292977151 hasConcept C111368507 @default.
- W4292977151 hasConcept C115961682 @default.
- W4292977151 hasConcept C127313418 @default.
- W4292977151 hasConcept C138885662 @default.
- W4292977151 hasConcept C153180895 @default.
- W4292977151 hasConcept C154945302 @default.
- W4292977151 hasConcept C158525013 @default.
- W4292977151 hasConcept C205649164 @default.
- W4292977151 hasConcept C2778755073 @default.
- W4292977151 hasConcept C2988773926 @default.
- W4292977151 hasConcept C31972630 @default.
- W4292977151 hasConcept C37736160 @default.
- W4292977151 hasConcept C39890363 @default.
- W4292977151 hasConcept C41008148 @default.
- W4292977151 hasConcept C41895202 @default.
- W4292977151 hasConcept C58640448 @default.
- W4292977151 hasConcept C69744172 @default.
- W4292977151 hasConcept C98083399 @default.
- W4292977151 hasConceptScore W4292977151C111368507 @default.
- W4292977151 hasConceptScore W4292977151C115961682 @default.
- W4292977151 hasConceptScore W4292977151C127313418 @default.
- W4292977151 hasConceptScore W4292977151C138885662 @default.
- W4292977151 hasConceptScore W4292977151C153180895 @default.
- W4292977151 hasConceptScore W4292977151C154945302 @default.
- W4292977151 hasConceptScore W4292977151C158525013 @default.
- W4292977151 hasConceptScore W4292977151C205649164 @default.
- W4292977151 hasConceptScore W4292977151C2778755073 @default.
- W4292977151 hasConceptScore W4292977151C2988773926 @default.
- W4292977151 hasConceptScore W4292977151C31972630 @default.
- W4292977151 hasConceptScore W4292977151C37736160 @default.
- W4292977151 hasConceptScore W4292977151C39890363 @default.
- W4292977151 hasConceptScore W4292977151C41008148 @default.
- W4292977151 hasConceptScore W4292977151C41895202 @default.
- W4292977151 hasConceptScore W4292977151C58640448 @default.
- W4292977151 hasConceptScore W4292977151C69744172 @default.
- W4292977151 hasConceptScore W4292977151C98083399 @default.
- W4292977151 hasLocation W42929771511 @default.
- W4292977151 hasOpenAccess W4292977151 @default.
- W4292977151 hasPrimaryLocation W42929771511 @default.
- W4292977151 hasRelatedWork W2901368259 @default.
- W4292977151 hasRelatedWork W3024419957 @default.
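
The listing above is the result of the quad pattern shown in the first line. As a minimal sketch of how one might reproduce it programmatically, the Python snippet below queries SemOpenAlex with SPARQLWrapper; the endpoint URL https://semopenalex.org/sparql is an assumption, and the graph variable ?g from the original pattern is dropped for brevity.

```python
# Hedged sketch: retrieve the same triples for work W4292977151.
# Assumes the public SemOpenAlex SPARQL endpoint at
# https://semopenalex.org/sparql; the graph variable ?g from the
# original quad pattern is omitted for simplicity.
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("https://semopenalex.org/sparql")
endpoint.setQuery("""
    SELECT ?p ?o WHERE {
        <https://semopenalex.org/work/W4292977151> ?p ?o .
    }
""")
endpoint.setReturnFormat(JSON)

results = endpoint.query().convert()
for binding in results["results"]["bindings"]:
    # Each row corresponds to one "- W4292977151 <p> <o>" line above,
    # except that predicates come back as full URIs rather than the
    # shortened names shown in the listing.
    print(binding["p"]["value"], binding["o"]["value"])
```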
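The abstract stored above describes FW-GAN's generator: four convolution branches (three refined prior inputs plus the encoded original), multi-scale fusion connections, and a channel attention decoder. The PyTorch sketch below illustrates only that general shape of architecture; all module names, channel counts, the squeeze-and-excitation-style attention, and the single-scale 1x1 fusion (the paper fuses at several scales) are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of a fusion generator with a channel-attention
# decoder, loosely following the abstract's description of FW-GAN.
# Names, channel counts, and the SE-style attention are assumptions.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation-style channel attention (assumed form)."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                       # squeeze: global average pool
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),                                  # per-channel weights in [0, 1]
        )

    def forward(self, x):
        return x * self.fc(x)                              # reweight channels adaptively

class FusionGenerator(nn.Module):
    """Four branches (original + three priors) fused, then decoded."""
    def __init__(self, base=16):
        super().__init__()
        # One shallow encoder per input; per the abstract, these refine
        # the three prior inputs and encode the original input.
        self.branches = nn.ModuleList([
            nn.Sequential(nn.Conv2d(3, base, 3, padding=1), nn.ReLU(inplace=True))
            for _ in range(4)
        ])
        # The paper's multi-scale fusion connections are collapsed here
        # into a single 1x1 fusion for brevity.
        self.fuse = nn.Conv2d(4 * base, base, 1)
        self.attn = ChannelAttention(base)
        self.decode = nn.Conv2d(base, 3, 3, padding=1)

    def forward(self, original, priors):
        feats = [b(x) for b, x in zip(self.branches, [original, *priors])]
        fused = self.fuse(torch.cat(feats, dim=1))
        return torch.tanh(self.decode(self.attn(fused)))

# Usage: enhance a dummy 256x256 underwater image with three dummy priors.
if __name__ == "__main__":
    g = FusionGenerator()
    img = torch.rand(1, 3, 256, 256)
    priors = [torch.rand(1, 3, 256, 256) for _ in range(3)]
    print(g(img, priors).shape)  # -> torch.Size([1, 3, 256, 256])
```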